The breakout hit of the Consumer Electronics Show in 2010 was a television set. Hard to believe now, maybe, but it’s true; for one shining moment, the Toshiba Cell TV was the most exciting new thing in tech. Its name invoked the overkill processors inside. It was one of the first sets to promise “Net TV Channels” that would let you stream directly from Netflix or Pandora. And it could show pictures in three dimensions.
The Cell TV was hardly the lone 3D TV at CES 2010. Sony, Panasonic, LG, Samsung: everyone brought their spin on the decade’s big new breakthrough. Each of them touted the benefits of putting on a pair of clunky tinted glasses before settling in for a night on the couch, and presented as self-evident that their customers would clamor for the opportunity.
The technology had existed before; Samsung got there first, in 2007. But January 2010 presented a clear inflection point. In addition to the Cell TV there were 3D Blu-ray players, sets that could automatically give depth to flat images, and the promise of DirecTV networks that broadcast exclusively in three dimensions. The industry had lined up behind a vision of the future, marketing executives and product managers insisting that more was also better. How could it not be? It was more.
Five years later, 3D TV was dead. You probably haven’t thought about it since then, if you even did before. But there’s maybe no better totem for the last decade of consumer technology. (The iPhone was more transformative, but is also singular, and besides that was born in the late aughts.) It’s what happens when smart people run out of ideas, the last gasp before aspiration gives way to commoditization. It was the dawn of all-internet everything, and all the privacy violations inherent in that. And it steadfastly ignored how human beings actually use technology, because doing so meant companies could charge more for it.
What I remember most from those press conferences in 2010 was the assuredness that millions of people somehow actively wanted to have to put glasses on their faces in order to watch television. Even then, it made no sense. TV viewing has always been a largely passive experience, something to do while you’re doing other things. And besides that, only certain types of shows—movies, maybe some sports—actually benefited from 3D in the first place. Or would, if the television sets were any good; most of the early ones stuttered and flickered even when you sat dead center in front of them. Stray a few feet to either side, and the viewing angle killed the experience altogether.
It gets worse. Different manufacturers backed different 3D TV formats and technologies, meaning one set of glasses wouldn’t necessarily work on a competitor’s set. The simple act of watching in 3D caused eye strain in a significant chunk of the population. And the list of available things to watch never hit critical mass.
Lots of technology is bad at the start, but look closely at this one. The pointless 3D TV standards war presages the manifold sins of the smart home. Its fundamental lack of justification for existence—other than selling more stuff—has clear kinship with everything from Google Glass to Amazon Dash buttons to Snap Spectacles. (Related: the dogged determination that people will endure face accessories, still hustled by Oculus and HTC Vive and Magic Leap and other self-deceivers.) This is admittedly more of a stretch, but if you squint you can see a thinly drawn line between a viewing experience that’s actively bad for your eyes and hoverboards that won’t stop exploding.