I swear I've been here before. Not here, as in here at CES, where I spent the week checking my product assumptions against the actual offerings arrayed on the show floor. But here, as in at a crucial moment in time when a single industry rushes to push a massively expensive, relatively unnecessary technology on unsuspecting consumers. That's the case with Ultra HD at CES 2013. Formerly known as 4K TV (because of the rough number of horizontal pixels the technology employs) and already truncated to UHD by company reps on the floor and in the hallways, Ultra HD is supposed to be the next thing every consumer will want.

It ain't gonna happen. The reasons evoke a ready comparison to 3DTV. And indeed, I have been here before, back at CES 2010, where I wrote a piece called "3DTV at CES: Poking Holes in the Hype." That year, some industry thinkers had conducted a survey and concluded that as many as 5 million consumers were ready to jump into 3D with both feet while opening their big, fat wallets. So I wrote the obligatory post that said, pointedly, no.

The comparison between 3D and Ultra HD is obvious. Both were too expensive at introduction (Ultra HD even more so than 3D); both suffered from a dearth of available content; both required a complete retooling of the equipment used by video production teams and film studios; and both landed at a time when consumers were pretty happy with the awesomely large, cheap TV screens they already had.

But the most important comparison is this one: Both 3D and Ultra HD were offered under the assumption that people wanted to move to some fanciful "next level" of video technology. And why not? Hadn't consumers just proven that they were susceptible to the siren song of bigger screens and higher resolution when they upgraded from standard-definition screens to HD over the prior decade? This was exactly the rationale the chairman of Panasonic offered when we sat down at CES in 2010 to discuss the company's investment in 3D TVs, 3D cameras, and 3D production equipment.

If that was the mistake of 3D, it was also the mistake of Blu-ray. In both cases, manufacturers pushed a new technology under the mistaken assumption that people were antsy to have the highest-quality video experience possible. But as we learned from Blu-ray's tepid reception and 3D's failure, people were already at a level of technology that was good enough. The principle of good enough is a critical one taught by Clayton Christensen as part of his disruptive innovation research: A technology needs to be just good enough to give people solutions to their problems, answers to their questions, and fulfillment of their fundamental needs. Disruptors know this and act accordingly. Digital disruptors go one step further, acting on this knowledge with unprecedented speed. As I have written in my upcoming book, Digital Disruption, in a digital era, fulfilling product experiences depend much more on the digital experience surrounding the product than on the product's traditional specs or boundaries.

The lesson for TV makers is clear: Digital tools and digital platforms — not to mention the digital devices that people have in their laps while they watch TV — make it possible to deliver novel, engaging video experiences without changing the hardware much, if at all. What will fulfill viewers is not increased quality, but increased everything else: more access to more content in more places in more ways on more devices than they had before. That's why people are happy to watch streaming video from Netflix on a 4-inch phone the same day they catch the season opener of Downton Abbey through their cable box on a 65-inch wall-mounted HD screen. What matters most is not the number of pixels or the quality of the pixels themselves (take note, OLED fans), but the increasing convenience of the content's discovery and delivery. This is why TV makers should be investing in a better experience rather than a bigger one. That means investing in a better UI and better apps for the TV and its companion screens; it may even justify investing in better chips (as Samsung has done) to power those experiences.