Panasonic has published a lifespan statistic for its “full” high definition (1080p) plasma TVs: on average they’ll last at least 42 years before the brightness of the display degrades to less than 50%.
That’s based on an average 6.5 hours viewing every single day – or 100,000 hours in total.
Even its 720p high definition plasma TVs will last for 60,000 hours (around 25 years).
It’s an interesting statistic to highlight in a fast-paced, needed-to-be-replaced-last-week technology culture, and though I don’t know what the exact figure for how often people replace their TVs is, I bet it’s closer to 5-10 years than 42.
A big reason for this is not (usually) that the sets don’t work; it’s that they gradually slip into obsolescence.
Yes, cathode ray tubes burn out, and bulbs fail, but often the choice to buy a new TV is because the technology has moved on enough that the consumer’s viewing experience is, allegedly, improved by buying a new set.
Take high definition TV: if you want it, it immediately renders your analogue TV obsolete.
But who really believes that 1080p HD, considered fairly cutting edge for the average consumer now, will be the gold standard in even 5 years’ time?
Scientists are already working on and demonstrating much higher definition systems. What happens when 1440p, 2160p, and above become standard? Yes, I know we don’t have the broadcast or communication infrastructure to handle those formats yet, but it’s unlikely to take four decades to achieve.
Once telly technology moves on even a little, current standards such as HDMI will be about as useful as SCART is to current high definition. What’s the point of having a TV that still works, but can’t receive any signals?
It would also be interesting to know whether the degradation of a plasma TV’s brightness is linear or not. If it is, then 50% over 42 years works out to roughly 1.2% per year, so in five years’ time the screen could have lost nearly 6% of its brightness output.
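As a quick back-of-the-envelope check of the figures above (assuming strictly linear brightness loss, which a real panel may well not exhibit):

```python
# Sanity-check Panasonic's lifespan figures, assuming strictly
# linear brightness degradation (a real panel may fade faster
# early on, so treat this as an illustration, not a prediction).

HOURS_PER_DAY = 6.5       # Panasonic's assumed daily viewing
RATED_HOURS_1080 = 100_000  # 1080p: hours until brightness hits 50%
RATED_HOURS_720 = 60_000    # 720p: rated lifespan in hours

years_to_half = RATED_HOURS_1080 / (HOURS_PER_DAY * 365)
years_720 = RATED_HOURS_720 / (HOURS_PER_DAY * 365)

loss_per_year = 50 / years_to_half   # percentage points lost per year
loss_after_5 = 5 * loss_per_year

print(f"1080p: {years_to_half:.1f} years to half brightness")  # ~42.1
print(f"720p:  {years_720:.1f} years")                          # ~25.3
print(f"Brightness lost after 5 years: {loss_after_5:.1f}%")    # ~5.9%
```

Which matches the quoted numbers: about 42 years for the 1080p sets, about 25 for the 720p ones, and just under 6% brightness loss in five years if the fade really is linear.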
I suppose in 2049 these Panny plasmas will make an interesting showpiece in a working museum of defunct technology.