Monday, 10 March 2014

What is 4K UHD? Next-generation resolution explained



From the World Cup to Netflix, in 2014 you're going to start hearing a lot more about 4K resolution or 'Ultra HD.' But what is it? And more importantly, do you want it?

As if 3D TV and LED LCD vs. OLED vs. plasma and 120Hz and the Soap Opera Effect weren't confusing enough, in the last year we have seen the rise of a new HDTV technology called 4K. Or if you use its official name, Ultra High Definition (UHD).

UHD is an "umbrella term" that encompasses higher resolutions (more pixels) than HDTV, as well as more realistic color and higher frame rates. Today and throughout this year, pretty much the only one of those improvements available in new TVs and content is 4K resolution, so that's what we'll talk about here. Judging from the new TVs shown at CES 2014, manufacturers are tripping over themselves to bring you a new array of 4K-compatible products.

But just like 3D and HD before it, 4K has a case of putting the hardware chicken before the software egg. About 15 months after 4K TVs first appeared on the market, there's little consumer 4K content available: no TV channels or Blu-ray discs, just a few specialized video players, YouTube and other clips of varying quality, and promises of streaming video.

Still, the shift from 1080p to 4K TV hardware is inevitable. This year 4K TVs will replace high-end 1080p models as the best-performing LED LCD-based sets on the market -- although the reason they're better will have nothing to do with resolution.

Confused again? Don't worry, we'll walk you through it, starting with the question: So what is 4K anyway, and what makes it different from high definition?

What's in a name? '4K' versus 'UHD'

In August 2012, the Consumer Electronics Association introduced the term Ultra High Definition, partly defined as resolutions of "at least 3,840x2,160 pixels". The idea was to replace the term 4K. The CEA's name lasted less than a day, as Sony then announced it was going to call the technology "4K Ultra High Definition". This is the term now used by most other TV manufacturers too, who seem interested in covering all the buzzword bases at the expense of brevity.

In practice, you will often see UHD used interchangeably with 4K, whether describing TVs, source devices, accessories or content. We at CNET say "4K" instead of "UHD" almost exclusively, and our readers and Google strongly agree.

Digital resolutions: A primer

The latest in a line of broadcast and media resolutions, 4K is due to replace 1080p as the highest-resolution signal available for in-home movies and television.

With the arrival of 4K there are four main resolution standards for use in the home: standard definition (480p/540p), high definition (720p), full high definition (1080i/p) and ultra high definition (2160p).

Sharp's UD 4K television (Sharp)

When used in a home context, 4K/UHD means the TV's screen has a minimum resolution of 3,840 pixels wide and 2,160 pixels high, making it the equivalent of four 1080p screens -- two across and two high. This resolution was originally known as "Quad HD," and it's used by basically every 4K TV.

Another resolution, known as 4Kx2K (4,096x2,160 pixels), is used by some projectors and many professional cameras. It also falls under the umbrella of 4K/UHD. Other shooting resolutions are also employed in the pro realm, depending on the camera.

Four resolutions compared: standard definition; full high definition; and the two kinds of ultra high definition (Quad HD and 4Kx2K). (CNET)
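If you like numbers, a quick back-of-the-envelope tally makes the jump obvious: Ultra HD has four times the pixels of 1080p and roughly 24 times the pixels of DVD-class standard definition. Here's a rough sketch of that arithmetic (the frame sizes are the nominal ones discussed above; actual broadcast formats vary slightly):

# Back-of-the-envelope pixel counts for the resolutions discussed above.
# Frame sizes are nominal; real-world broadcast formats vary slightly.
resolutions = {
    "480p (SD, DVD)":  (720, 480),
    "720p (HD)":       (1280, 720),
    "1080p (Full HD)": (1920, 1080),
    "UHD / Quad HD":   (3840, 2160),
    "DCI 4K (4Kx2K)":  (4096, 2160),
}

full_hd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:16} {w}x{h} = {pixels:>9,} pixels ({pixels / full_hd_pixels:.2f}x Full HD)")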

While 4K is relatively new, high definition (HD) itself has been with us for about a decade, and is the format used in Blu-ray movies and HD broadcasts. There are three main versions of HD: full high definition 1080p (progressive), 1080i (interlaced), and 720p (also called simply "high definition").

Despite the existence of HD and 4K, many television programs, online videos and all DVDs are still presented in standard definition, loosely defined as 480 lines. Standard definition began life as analog NTSC broadcasts before the US switched to digital ATSC broadcasts, a transition completed in 2009.

The beginnings of digital cinema and 4K

While it's currently being touted as a new broadcast and streaming resolution -- particularly with the appearance of the HEVC (H.265) codec -- the roots of 4K are in the theater.

When George Lucas was preparing to make his long-promised prequels to the "Star Wars" movies in the late '90s, he was experimenting with new digital formats as a replacement for film. Film stock is incredibly expensive to produce, transport, and store. If movie houses could simply download a digital movie file and display it on a digital projector, the industry could save a lot of money. In a time when cinemas are under siege from on-demand cable services and streaming video, cost-cutting helps to keep them competitive.

After shooting "The Phantom Menace" partly in HD, George Lucas shot "Attack of the Clones" fully digitally in 1080p. This was great for the future Blu-ray release, but the boffins soon found that 1080p wasn't a high enough resolution for giant theater screens. If you sit in the front rows and watch 1080p content, you may see a softer image or even visible pixel structure, which can be quite distracting.

The industry needed a resolution that would hold up even when the audience sat closer than the optimum "one-and-a-half times the screen height," and found that it had to be higher than 1080p. The Digital Cinema Initiatives (DCI) consortium was formed in 2002 with the goal of setting a digital standard, and from those efforts two new resolutions emerged: a 2K specification and, in 2005, the 4K format.

The first high-profile 4K cinema release was "Blade Runner: The Final Cut" in 2007, a new cut and print of the 1982 masterpiece. Unfortunately, at that time very few theaters were able to show it in its full resolution. It would take one of director Ridley Scott's contemporaries to truly drive 4K into your local cineplex.

Director James Cameron arrives at the premiere of "Avatar" on Dec. 16, 2009, in Los Angeles. (Robyn Beck/AFP/Getty Images)

How 3D drove the uptake of 4K

Do you remember seeing James Cameron's "Avatar 3D" in the theater? Cameron's movie about "giant blue dudes" helped drive high-resolution 4K Sony projectors into theaters around the world. Movie studios keen to maintain that momentum then released a slew of 3D films -- mostly converted from 2D -- and continued the expansion of 4K cinemas. While 3D has declined in popularity, 4K movies are here to stay.

The industry has been quick to distance 4K from 3D, taking care not to repeat the mistakes made in marketing that earlier technology. But there are obvious benefits to 3D on a 4K TV screen. In our extended hands-on with the Sony XBR-84X900, we saw the best 3D TV we'd ever tested. It delivered the comfort and lack-of-crosstalk benefits of passive 3D while providing enough resolution (1080p to each eye) to be free of the interlacing and line-structure artifacts inherent in 1080p passive TVs. Higher resolutions like 4K are also necessary for new implementations of glasses-free 3D TVs.

From theater to the home

While 4K resolution makes perfect sense for huge theatrical screens, its benefits are less visible on TVs at home, watched from normal seating distances.

"4K is at the point of diminishing returns." --Dr. Dave Lamb of 3M Laboratories

"There was a huge, noticeable leap from standard definition to HD, but the difference between 1080p and 4K is not as marked," said researcher Dave Lamb of 3M Laboratories. He added that "4K is at the point of diminishing returns," but said there could be some benefits for screens over 55 inches.

At CNET, we've spilled a lot of ink comparing 4K TVs to 1080p versions, for example in our reviews of the Samsung UN65F9000 and Panasonic TC-L65WT600. Everything we've seen so far reinforces the notion that a 4K-resolution TV, seen from a normal seating distance, doesn't provide a significant visible benefit with the 1080p sources available today, nor a major bump in picture quality with the limited 4K sources we've tried.

The math of visual acuity backs up our observations. Check out Four 4K TV facts you must know for details.
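If you want to check the math yourself, it boils down to visual acuity: a person with 20/20 vision resolves detail down to about one arcminute, so once you sit far enough back that a single pixel subtends less than that, extra pixels stop being visible. Here's a rough sketch of that calculation; the 65-inch screen size is just an illustrative example, not a recommendation:

import math

# A 20/20 viewer resolves roughly 1 arcminute of detail.
ARCMINUTE = math.radians(1 / 60)

def max_useful_distance_inches(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Farthest seating distance (inches) at which individual pixels
    are still just barely resolvable for a 20/20 viewer."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width from diagonal
    pixel_pitch = width_in / horizontal_pixels               # inches per pixel
    return pixel_pitch / math.tan(ARCMINUTE)

for label, pixels in [("1080p", 1920), ("4K UHD", 3840)]:
    feet = max_useful_distance_inches(65, pixels) / 12
    print(f'65" {label}: individual pixels stop being visible beyond roughly {feet:.1f} feet')

For a 65-inch screen that works out to roughly 8.5 feet for 1080p and about 4 feet for 4K -- which is why, from a typical couch, the extra resolution is so hard to see.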

Are 4K televisions future-proof?

Most of the companies that sell TVs in the US market -- both major and minor -- have committed to releasing 4K displays in 2014, most of them positioned as "premium" offerings. At CES 2014, there were dozens of new 4K screens on offer, from entry-level 50-inchers to the ridiculously opulent 105-inch curved TVs from LG and Samsung. But compared with 1080p TVs they still command a premium price.

One feature that many of the announced screens were touting was compatibility with HDMI 2.0. One of the many benefits of the new standard is that it will enable a higher data rate than HDMI 1.4, the current standard. Why is this important? Full 4K content at 60 frames per second.
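The arithmetic behind that is simple enough: uncompressed 3,840x2,160 video at 60 frames per second with 8 bits per color works out to more data than HDMI 1.4's roughly 10.2Gbps link can carry, while HDMI 2.0's 18Gbps ceiling leaves headroom. A rough estimate (this ignores blanking intervals and chroma subsampling, so treat the figures as ballpark):

# Rough bandwidth estimate for uncompressed 4K video over HDMI.
width, height = 3840, 2160
fps = 60
bits_per_pixel = 24            # 8 bits each for red, green and blue

raw_gbps = width * height * fps * bits_per_pixel / 1e9
wire_gbps = raw_gbps * 10 / 8  # HDMI's 8b/10b encoding adds 25 percent overhead

print(f"Raw pixel data:  {raw_gbps:.1f} Gbps")
print(f"On the wire:     {wire_gbps:.1f} Gbps")
print("HDMI 1.4 limit:  10.2 Gbps -- too little for 4K at 60fps")
print("HDMI 2.0 limit:  18.0 Gbps -- enough headroom")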

While there is no content yet that can take advantage of this higher 4K frame rate -- apart from PC games -- we consider HDMI 2.0 a must-have for any 4K TV going forward. In addition to including it in their 2014 4K TVs, many makers also offer an upgrade path to HDMI 2.0 for their 2013 4K models.

4K content in the home starts with streaming


In July 2012, "Timescapes" by filmmaker Tom Lowe became the first full-length 4K movie available to buy and download, but it won't be the last.

At CES 2014, several company pairings -- Samsung with Amazon, Sony with Netflix -- announced plans for streaming 4K content to compatible TVs this year. But this isn't the first time 4K video has been available for streaming: in July 2010 YouTube premiered its 4K channel, and the nature of the content varies -- as the service generally does -- from sweeping vistas of New Zealand to twerking Stormtroopers.

While there are currently no cable boxes that support 4K in the US, the industry is gearing up for a new broadcast standard that promises to deliver 4K resolutions. Called HEVC, or H.265, this new codec is seen by manufacturers and broadcasters as a way to deliver compressed 4K content economically.

While receiving HEVC-compressed TV channels will certainly mean a new cable box, nearly every 4K TV announced at CES 2014 will also include decoding support for the HEVC standard.

At CES 2014, Sony announced that it would shoot the World Cup in 4K. It's unclear at this point how this content will be distributed, though live screenings at cinemas are very likely.

In mid-2013, Sony was also one of the first companies to release a 4K player in the form of the FMP-X1, a proprietary media server that could only be used with Sony 4K televisions.

The Redray player plays 4K movies shot on Red cameras. (Red)

Meanwhile, movie camera maker Red announced its own Redray player last year, which plays movies in the proprietary RED format. As an adjunct, it announced a partnership with the Odemax Web site for consumers to download compatible 4K films.

In January 2014, the Blu-ray Disc Association officially announced it was working on 4K Blu-ray for release by the end of 2014.

Speaking in early 2013, Tim Alessi, director of home electronics development at LG, said he believed that such a development would boost interest in the technology. "I do expect that at some point [4K] will be added [to the Blu-ray specification]. Having that content in the home is what the average consumer will want," Alessi said.

In the absence of substantial 4K content, 4K TVs will upconvert 1080p or even standard-definition content to fill their 4K screens. To this end, makers like Sony and Oppo offer players that will upscale Blu-ray to 4K. All 4K televisions include a 4K upscaler onboard as well.
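TV makers don't publish the details of their upscalers, but the basic idea is easy to picture: 1,920x1,080 divides evenly into 3,840x2,160, so every source pixel maps onto a 2x2 block of the 4K panel. The sketch below shows the crudest possible version, simple pixel doubling; real upscalers use far smarter interpolation and edge reconstruction, and the function name here is just illustrative:

import numpy as np

def naive_upscale_1080p_to_uhd(frame):
    """Pixel-doubling upscale: every 1080p pixel becomes a 2x2 block.
    Expects an array shaped (1080, 1920, 3); returns (2160, 3840, 3).
    Real TV upscalers use interpolation and edge-aware filtering instead."""
    return frame.repeat(2, axis=0).repeat(2, axis=1)

# Example with a synthetic frame of random pixel values.
hd_frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
uhd_frame = naive_upscale_1080p_to_uhd(hd_frame)
print(hd_frame.shape, "->", uhd_frame.shape)  # (1080, 1920, 3) -> (2160, 3840, 3)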

Finally, just when we thought we had it all covered, 4K may not even be the final word in resolution. Japanese broadcaster NHK was the first to demonstrate 8K in 2008, and at CES 2014 there were industry murmurings -- and prototypes -- devoted to higher-than-4K resolution.

Conclusion

Will the extra resolution offered by 4K make movies better? You could argue that it depends on the format of the original film. For example, "The Blair Witch Project" and "28 Days Later" were both shot with standard-definition camcorders, and there would arguably be little extra benefit to buying either movie in a 4K native format over a DVD -- depending on the quality of the scaler in your brand-new 4K screen, of course.

Even with reference-quality native 4K material, however, a 4K-resolution TV or projector won't provide nearly the visible improvement over a standard 1080p model that going from standard-def to high-def did. To appreciate it you'll have to sit quite close to a large screen -- sort of like being in the front few rows of a movie theater.

But whether it's 4K or 8K, you can bet that manufacturers haven't run out of cards when it comes to marketing the next "must-have" feature in the coming crops of televisions.
