An interesting feature has turned up in NVIDIA’s latest drivers: the ability to drive certain displays over HDMI at 4K@60Hz. This is a feat that would typically require HDMI 2.0 – a feature not available in any GPU shipping thus far – so to say it’s unexpected is a bit of an understatement. However, as it turns out, the situation is not as cut and dried as it first appears; there is a notable catch.

As first discovered by users, including AT Forums user saeedkunna, Kepler-based video cards running NVIDIA’s R340 drivers gain the ability to output at 4K@60Hz over HDMI 1.4 when paired with very recent 4K TVs. These setups were previously limited to 4K@30Hz due to HDMI bandwidth constraints, and while those limitations haven’t gone anywhere, TV manufacturers and now NVIDIA have implemented an interesting workaround that teeters between clever and awful.

Lacking the available bandwidth to fully support 4K@60Hz until the arrival of HDMI 2.0, the latest crop of 4K TVs such as the Sony XBR 55X900A and Samsung UE40HU6900 have implemented what amounts to a lower image quality mode that allows for a 4K@60Hz signal to fit within HDMI 1.4’s 8.16Gbps bandwidth limit. To accomplish this, manufacturers are making use of chroma subsampling to reduce the amount of chroma (color) data that needs to be transmitted, thereby freeing up enough bandwidth to increase the image resolution from 1080p to 4K.
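A quick back-of-the-envelope calculation shows why the trick works. This is a sketch that assumes 8 bits per component and counts only active pixels (a real HDMI link also carries blanking overhead, so actual requirements are somewhat higher):

```python
# Active-pixel bandwidth for 4K@60Hz. HDMI 1.4's usable video bandwidth is
# roughly 8.16 Gbps (10.2 Gbps raw, less 8b/10b encoding overhead).

def active_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw bits per second for the active picture area, in Gbps."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_14_GBPS = 8.16
bpp_444 = 8 * 3            # full-resolution Y', Cb, and Cr: 24 bits/pixel
bpp_420 = 8 + 2 * (8 // 4) # full Y' plus two quarter-resolution chroma planes: 12 bits/pixel

print(active_bandwidth_gbps(3840, 2160, 60, bpp_444))  # ~11.94 Gbps: exceeds 8.16
print(active_bandwidth_gbps(3840, 2160, 60, bpp_420))  # ~5.97 Gbps: fits
```

At 4:4:4 the active picture alone already needs about 11.94 Gbps, well past what HDMI 1.4 can carry; dropping to 4:2:0 halves the per-pixel cost and brings it comfortably under the limit.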

An example of a current generation 4K TV: Sony's XBR 55X900A

Specifically, manufacturers are making use of Y'CbCr 4:2:0 subsampling, a lower quality sampling mode that requires ¼ the color information of regular Y'CbCr 4:4:4 sampling or RGB sampling. By using this sampling mode manufacturers are able to transmit an image that utilizes full resolution luma (brightness) but a fraction of the chroma resolution, allowing manufacturers to achieve the necessary bandwidth savings.
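As an illustration of how the sample counts work out (the 2×2 block-averaging filter here is just one simple way to derive 4:2:0, not any particular manufacturer's implementation):

```python
import numpy as np

def subsample_chroma_420(chroma):
    """Average each 2x2 block of a chroma plane, quartering its sample count."""
    h, w = chroma.shape
    return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A hypothetical 8x8 frame: one luma plane plus two full-resolution
# chroma planes, i.e. 4:4:4 sampling.
rng = np.random.default_rng(0)
y  = rng.integers(0, 256, (8, 8)).astype(np.float64)
cb = rng.integers(0, 256, (8, 8)).astype(np.float64)
cr = rng.integers(0, 256, (8, 8)).astype(np.float64)

cb420 = subsample_chroma_420(cb)   # 8x8 -> 4x4: 1/4 the chroma samples
cr420 = subsample_chroma_420(cr)

samples_444 = y.size + cb.size + cr.size        # 192 samples
samples_420 = y.size + cb420.size + cr420.size  # 96 samples
print(samples_420 / samples_444)                # 0.5: half the total data
```

Luma stays at full resolution while each chroma plane shrinks to a quarter of its original size, which is exactly where the 24 → 12 bits/pixel savings comes from.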

Wikipedia: diagram on chroma subsampling

The use of chroma subsampling is as old as color television itself; however, its use in this fashion is uncommon. Most HDMI PC-to-TV setups to date use RGB or 4:4:4 sampling, both of which are full resolution and functionally lossless. 4:2:0 sampling, on the other hand, is not normally used for the last stage of transmission between source and sink devices – in fact HDMI didn’t even officially support it until recently – and is instead used in the storage of source material itself, be it Blu-ray discs, TV broadcasts, or streaming videos.

Perceptually, 4:2:0 is an efficient way to throw out unnecessary data, making it a good way to pack video, but at the end of the day it’s still ¼ the color information of a full resolution image. Since video sources are already 4:2:0, this ends up being a clever way to transmit video to a TV, as at the most basic level a higher quality mode would be redundant (post-processing aside). But while this works well for video, it also only works well for video; for desktop workloads it significantly degrades the image, as the color information needed to drive subpixel-accurate text and GUIs is lost.

In any case, with 4:2:0 4K TVs already on the market, NVIDIA has confirmed that they are enabling 4:2:0 4K output on Kepler cards with their R340 drivers. What this means is that Kepler cards can drive 4:2:0 4K TVs at 60Hz today, but they are doing so in a manner that’s only useful for video. For HTPCs this ends up being a good compromise and as far as we can gather this is a clever move on NVIDIA’s part. But for anyone who is seeing the news of NVIDIA supporting 4K@60Hz over HDMI and hoping to use a TV as a desktop monitor, this will still come up short. Until the next generation of video cards and TVs hit the market with full HDMI 2.0 support (4:4:4 and/or RGB), DisplayPort 1.2 will remain the only way to transmit a full resolution 4K image.

Comments

  • xdrol - Friday, June 20, 2014 - link

    You mean to the cost of a device. To the price, it would add the same in dollars.
  • dragonsqrrl - Friday, June 20, 2014 - link

    lol, yep pretty much.
  • willis936 - Saturday, June 21, 2014 - link

    How is DP phy superior to HDMI?
  • Darkstone - Sunday, June 22, 2014 - link

    In every way except display-side hardware complexity.

    HDMI requires licensing costs ($10k + 4 cents per device IF you market the fact that your device supports HDMI and your device supports all anti-piracy measures), where DP is free.
    HDMI supports only fixed refresh rates, where DP supports any refresh rate, even variable refresh rates.
    Display-side buffering, used in some mobile phones, is not possible with HDMI's fixed refresh rate.
    DP supports daisy-chaining, although it's often not implemented.
    HDMI's standardization process is awful. According to Wikipedia, HDMI 1.0 supports 1200p. But on at least one Ivy Bridge laptop the maximum supported resolution is 1080p. HDMI 1.3 is also supposed to support 1440p, but only two Ivy Bridge laptops support that feature: the Alienware M17x with AMD GPU and the M18x. This situation has improved somewhat with Haswell/Kepler, but not much.
    DP is physically smaller; it is possible to use mini-HDMI, but that is usually limited to 1080p and requires an adapter anyway.
    HDMI provides no support for high bit depth in most implementations.
  • willis936 - Monday, June 23, 2014 - link

    Very few of those points have anything to do with phy. Refresh rates and display buffers are closer to protocol than phy. I just don't understand why someone would claim that dp phy is somehow superior to hdmi like the people writing the spec didn't know what they were doing.
  • leliel - Friday, June 20, 2014 - link

    As long as they give you a choice between low-hertz high-chroma and high-hertz low-chroma, this is a perfectly cromulent solution. Anyone buying a 4K setup at this juncture ought to be well aware of the limitations already, and the workaround is more than anyone had the right to expect.
  • thewhat - Friday, June 20, 2014 - link

    Here's an (extreme) example of how 4:2:0 affects the image quality:

    BluRay and most currently used "consumer" videos are stuck at 4:2:0 anyway. Hopefully this will change in future, with the wider adoption of new video formats.
  • JlHADJOE - Sunday, June 22, 2014 - link

    Well that's rather disappointing.

    Most TV/movie content is only 24-30fps so there's really no disadvantage with the current 30Hz limit, and going to 4:2:0 pretty much compromises usability as a monitor, making 60Hz rather pointless.
  • bernstein - Friday, June 20, 2014 - link

    as per samsungs spec sheet the ue40hu6900 is HDMI 2.0 compliant. yet you are implying it does not! so do you know this for a fact? thanks.
  • mczak - Friday, June 20, 2014 - link

    That's the same crap as it always was with HDMI: you are not required to support the new high-bandwidth modes to claim HDMI 2.0 compliance (in fact this even makes sense if it's not a UHD device). So these devices can claim HDMI 2.0 compliance yet miss the only truly interesting feature of it. Well, ok, not quite: they at least support the new HDMI 2.0 YCbCr 4:2:0 feature, which makes it possible to support the full resolution in YCbCr 4:2:0 format.
