Yes, 4K — that’s 3840 x 2160 resolution — was all the rage at this year’s Consumer Electronics Show in Las Vegas.
Every leading flat panel manufacturer showcased 4K “Ultra HD” televisions at the show, including Sony, LG, Samsung, Toshiba, Sharp, Vizio and others. Sizes ranged from 50″ “entry-level” models to huge displays with 80″ and 100″ diagonals. Some even offered genuinely impressive “autostereoscopic” capability, which allows the viewing of 3D content without glasses. As you’d expect, they all looked fantastic. As you’d also expect, rumored prices for these high-resolution monsters were equally amazing — ranging from the modestly exorbitant to well over $20,000. While all the manufacturers claimed their consumer models will be “more affordable than expected,” few were willing to get specific, meaning that the Ultra HD 4K TV market will almost certainly remain limited to privileged super enthusiasts and professionals for the next few years, just as now-old-fashioned HD was through the 1990s. Ironically, though, high price is likely to be the least of the obstacles facing the adoption of 4K.
First, there’s the lack of content. It’s not that there isn’t 4K material out there. After all, the Hollywood studios all produce films in 4K — it’s become a standard resolution for directors who choose to shoot feature “films” digitally. You may have heard of the “RED EPIC,” which is actually the brand name of a 4K digital cinema camera system that has been used to make dozens of Hollywood film and TV projects — everything from giant features such as The Hobbit: An Unexpected Journey and the upcoming Oblivion and Pacific Rim to TV series such as Justified and Southland. In fact, it seems as though almost every major new film is being shot in 4K these days, and the final studio masters are finished in 4K resolution. So what’s the problem? Try rampant concern in Hollywood over digital piracy. “Theoretically, 4K is the resolution you can get from a film print master,” notes Panasonic North America’s Chief Technology Officer, Eisuke Tsuyuzaki. “I don’t think the studios will be willing to give that up so easily. If they did, what’s the monetary value? What’s the business proposition?”
Think of it this way: If someone pirates a Blu-ray, they’ve got a great 1080p — or 1920 x 1080 — high-definition copy. If someone pirates a 4K file, they’ve essentially got a duplicate of the studio master — not an appealing proposition for any senior studio executive.
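The gap between those two formats is easy to quantify. A quick sketch of the arithmetic, using only the frame dimensions quoted above:

```python
# How much more picture data does a 4K master carry than a
# 1080p Blu-ray copy? (Frame dimensions as quoted in the article.)

hd_pixels = 1920 * 1080    # Blu-ray / broadcast HD frame
uhd_pixels = 3840 * 2160   # Ultra HD "4K" frame

print(hd_pixels)               # -> 2073600
print(uhd_pixels)              # -> 8294400
print(uhd_pixels / hd_pixels)  # -> 4.0
```

In other words, a 4K frame holds exactly four times the pixels of a 1080p frame — which is why a pirated 4K file is so much closer to the studio master than a ripped Blu-ray.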
But let’s set this issue aside for a moment. Sony is making no bones about its desire to sell lots of Ultra HD 4K displays and offer 4K content (presumably from its Sony Pictures Home Entertainment division). So how will it deliver that content securely? While limited 4K broadcasting is just getting underway in Japan, Korea and parts of Europe, the Federal Communications Commission (FCC) has yet to approve a 4K broadcast standard for the United States, and one isn’t expected to be ready until 2016 or 2017. Also, according to Sony’s resident Blu-ray guru, Victor Matsuda, “There is currently no activity within the BDA [Blu-ray Disc Association] to bring 4K into the Blu-ray specs.” Nor is there any new disc-based format being developed to replace Blu-ray. That means 4K movies are likely going to be distributed digitally, streamed over broadband or satellite transmission.
Sony’s 4K service is expected to launch this summer. Though the company hasn’t specified how it will work, it seems that some kind of Internet-connected box will be involved, employing an encrypted digital stream to deliver movies from Sony Pictures’ servers.
Meanwhile, RED is also marketing a 4K consumer player called the REDRAY, which will work like a DVR to deliver 4K digital files (in 2D and 3D) to your display. RED says its movie files will have a data rate of around 20 Mbps, which is about the same as a Blu-ray, but at that rate a typical two-hour movie works out to roughly 18GB. Cable companies are already agitating to make consumers pay the full cost of the bandwidth they burn streaming Netflix at a paltry 480p and 1080p, so imagine how they’ll react to bandwidth-hogging 2160p!
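A quick back-of-the-envelope check on that file size, using RED’s quoted ~20 Mbps rate and a two-hour runtime:

```python
# File size for a two-hour movie at REDRAY's quoted ~20 Mbps data rate.

bitrate_mbps = 20         # megabits per second (RED's quoted figure)
runtime_s = 2 * 60 * 60   # two hours, in seconds

total_megabits = bitrate_mbps * runtime_s  # 144000 Mbit
size_gb = total_megabits / 8 / 1000        # megabits -> megabytes -> GB

print(size_gb)  # -> 18.0
```

So every REDRAY movie download would weigh in at around 18GB — the size of a single-layer-plus Blu-ray — moving across the public Internet instead of sitting on a disc.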
The problems for 4K don’t stop there. The market research firm IHS Screen Digest recently reported that digital projection had officially surpassed traditional film projection. Analog film projection represented just 37 percent of all theatrical projection around the world by mid-2012, with that number expected to decline to just 17 percent by 2015. Care to hazard a guess as to what the current resolution standard for digital projection in theaters is? Yep… 4K. Given how much money theater owners and distributors have just invested in converting their screens to digital, it’s hard to imagine they’re especially thrilled at the prospect of having to compete with 4K in the home.
Yet another deal-breaker for 4K is the fact that many consumers will simply be unable to differentiate 4K content from current HD video at typical home display sizes. You’ll recall that lots of people had trouble telling Blu-ray content from DVD even in side-by-side comparisons, and there are many viewers to this day who own HDTVs but haven’t hooked them up properly. The result is that they’re not even watching true HD content on their expensive new TVs, but they don’t know it and can’t tell the difference anyway. Looking ahead, there’s research that suggests that in order to fully appreciate Ultra HD 4K, you either have to be sitting very close to your TV or have a very large display — something well over 60″. Neither of those alternatives is likely to be practical for most home viewers anytime soon.
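That viewing-distance point can be roughed out with a standard visual-acuity model: a viewer with 20/20 vision resolves about one arcminute of angle, so once a single 4K pixel subtends less than that, the extra resolution over 1080p is invisible. The sketch below uses only that illustrative assumption — it is not the specific research cited above:

```python
import math

# Rough model: 20/20 vision resolves ~1 arcminute. Beyond the distance
# where one 4K pixel subtends less than that, 4K looks like 1080p.
# Illustrative assumption only, not the research cited in the article.

ARCMIN_RAD = math.radians(1 / 60)  # one arcminute, in radians

def max_useful_distance_in(diagonal_in, h_pixels=3840, aspect=16 / 9):
    """Farthest viewing distance (inches) at which a 16:9 display's
    pixels are still at the limit of 20/20 acuity."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / h_pixels                 # width of one pixel
    return pixel_in / ARCMIN_RAD                   # small-angle approx.

for size in (50, 60, 84):
    d_ft = max_useful_distance_in(size) / 12
    print(f"{size} inch 4K set: sit within ~{d_ft:.1f} ft for full detail")
```

Under this model, even a 60″ 4K set only shows its full detail from about four feet away — which is exactly why the benefit evaporates at ordinary couch distances unless the screen is very large indeed.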
But perhaps the biggest obstacle of all for Ultra HD 4K is the fact that HDTV is simply good enough for the vast majority of viewers, most of whom have only just upgraded to HD. In addition, most broadcasters have only recently completed the capital upgrades required to deliver HD content, so now they’re hoping to recoup their investments. And we tend to forget just how long we all lived with analog TV sets. Limited TV broadcasting in the U.S. began way back in 1929 but didn’t go nationwide until the FCC made NTSC the official analog standard in 1941, first in black-and-white and then (a decade later) in color. And so life continued for another 68 years… until analog broadcasting finally ended in the U.S. on June 12, 2009. That’s a long time to wait for digital and high-def, even though the first HD demonstrations appeared as early as 1987. Why did the HD transition take so long? Because, for most people, analog was good enough.
So what of Ultra HD 4K? Will there ever be a market for that much resolution in the home? “Time will tell,” says Panasonic’s Tsuyuzaki. “Keep in mind, people are already working on 8K. But I think 3D, IPTV [TV content delivered via Internet protocol rather than broadcast, cable or satellite] and mobile are the three areas where display technology is really going to evolve in the next few years.” Translation: Forget 4K for now. Sit back, relax and enjoy your new HD flat screen. It’s going to be around a long while.