CoreyWat - Wednesday, November 29, 2017 - link
Really wish HDMI would just use the USB Type C Connector already.
xenol - Wednesday, November 29, 2017 - link
I'd rather HDMI go away in favor of DP. But I guess the allure of licensing fees is too much.
Samus - Wednesday, November 29, 2017 - link
Ditto. But at least HDMI has caught up to uncompressed 5.1/7.1 lol
nathanddrews - Wednesday, November 29, 2017 - link
HDMI has supported uncompressed surround for a decade. Were you referring to something else?
I don't mind HDMI's domination, honestly. The industry as a whole has largely ignored DP for some reason (cable and connector cost/complexity?). As of right now on Newegg, more monitors have DVI or HDMI than have DP. Add televisions to the mix and HDMI is the clear leader for all displays. I'm really glad the HDMI folks have embraced so many gaming-centric features as well as so many resolution/refresh options; it is literally a game-changer.
Only question on my mind - will NVIDIA embrace HDMI VRR or will they expect license fees in order to make them "G-Sync compliant"?
Samus - Wednesday, November 29, 2017 - link
As was pointed out below, HDMI can't send uncompressed audio and video simultaneously.
As far as your HDMI domination argument goes, you really are just bending over and taking it with that mentality. You don't mind paying more for an inferior product? Because saying you don't mind HDMI's "domination" is exactly that. The cost of implementation is passed down to the consumer. You. When you could have just had DisplayPort all along, for free, with more advanced capabilities.
Most, if not all, business PCs (laptops, workstations) have DP. Most, if not all, video cards have multiple DP outputs, usually one HDMI. Professional video cards have just DP.
Take a guess why.
DanNeely - Wednesday, November 29, 2017 - link
Probably more about the TV industry people wanting to have more control of the standard so they can have an easier time adding TV-centric features that the computer world can ignore as not needed for our uses.
With HDMI 2.1 dropping the separate clock channel in favor of a 4th data lane and self-clocking signals, HDMI has moved a lot closer to how DisplayPort is implemented; this was by a large margin the biggest functional difference between the two. It'll be interesting to see if a future DisplayPort standard ups its clock speed from 800 to 1200 MHz to equal HDMI 2.1, or if they go higher to reclaim the fastest title.
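For a sense of scale, here is a minimal sketch comparing the two links' raw and post-coding throughput, assuming HDMI 2.1 FRL's 4 × 12 Gb/s lanes with 16b/18b line coding and DisplayPort HBR3's 4 × 8.1 Gb/s lanes with 8b/10b coding:

```python
# Rough raw-bandwidth comparison of HDMI 2.1 FRL vs. DisplayPort HBR3.
# Per-lane rates are from the published specs; the line-coding factors
# (16b/18b for FRL, 8b/10b for HBR3) set the usable payload fraction.

links = {
    "HDMI 2.1 FRL":     {"lanes": 4, "gbps_per_lane": 12.0, "coding": 16 / 18},
    "DisplayPort HBR3": {"lanes": 4, "gbps_per_lane": 8.1,  "coding": 8 / 10},
}

for name, p in links.items():
    raw = p["lanes"] * p["gbps_per_lane"]
    payload = raw * p["coding"]
    print(f"{name}: {raw:.1f} Gb/s raw, ~{payload:.1f} Gb/s after line coding")
```

That works out to roughly 48.0/42.7 Gb/s for HDMI 2.1 versus 32.4/25.9 Gb/s for DP 1.4.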
TEAMSWITCHER - Wednesday, November 29, 2017 - link
I'd rather DisplayPort went away. TVs are never going to adopt DisplayPort. Nor will video game consoles, streaming boxes, and A/V receivers. DisplayPort is an oddball connector.
MTEK - Wednesday, November 29, 2017 - link
Any news on the HDCP front? Are they going to come out with a new version sometime after 8K/HDMI 2.1 equipment is sold?
chaos215bar2 - Wednesday, November 29, 2017 - link
You need to be more specific when discussing HDR.
Dolby Vision has supported dynamic metadata since the start. I realize it's not part of the core HDMI spec, but the fact that it exists means that even HDMI 1.4, with the correct supporting equipment, supports dynamic HDR.
So, when HDMI 2.1 adds support for "dynamic HDR", what does that mean specifically? HDR10? HLG as well? Again, please be specific.
Nate Oh - Wednesday, November 29, 2017 - link
From the FAQ:
Q: Which HDR formats does the specification support?
A: It supports various static and dynamic HDR solutions in the market
From the slide deck:
The HDMI 2.1 specification supports multiple static and dynamic HDR solutions
It is not possible to be more specific as we do not have access to the official specification, which is limited to adopting (and paying) members of the HDMI Licensing Administrator.
chaos215bar2 - Wednesday, November 29, 2017 - link
Fair enough. It's odd how difficult it is to find specific information on HDMI features and functionality.
nandnandnand - Wednesday, November 29, 2017 - link
What is the downside of using Display Stream Compression? Is it more computationally expensive for the host system?
Otherwise they should just call it a 128 Gb/s (42.66666 × 3) cable instead of 48 Gb/s.
DanNeely - Wednesday, November 29, 2017 - link
At the bitrates involved, the de/compression is almost certainly going to be done via ASICs on both ends.
It's called a 48 Gb/s cable because it only carries 48 Gb/s of data. What you're suggesting is the equivalent of a DSL ISP claiming to offer gigabit service based on the several-hundred-to-one compression ratios that modern video codecs can achieve.
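As a rough illustration only (assuming 10-bit RGB, a fixed 3:1 DSC ratio, and ignoring blanking and FEC overhead, so real requirements run somewhat higher), here is which headline modes would need compression to fit within the link's ~42.7 Gb/s effective payload rate:

```python
# Which modes fit in HDMI 2.1's ~42.7 Gb/s effective payload rate, with and
# without a fixed 3:1 DSC ratio. Active pixels only: blanking intervals and
# FEC overhead are ignored, so real-world requirements are somewhat higher.

EFFECTIVE_GBPS = 48 * 16 / 18   # 4 lanes x 12 Gb/s, 16b/18b line coding
BPP = 30                        # 10-bit RGB, no chroma subsampling

modes = [("4K120", 3840, 2160, 120),
         ("8K60",  7680, 4320, 60),
         ("8K120", 7680, 4320, 120)]

for name, w, h, hz in modes:
    uncompressed = w * h * hz * BPP / 1e9   # Gb/s
    with_dsc = uncompressed / 3             # fixed 3:1 compression
    fits = "fits uncompressed" if uncompressed <= EFFECTIVE_GBPS else "needs DSC"
    print(f"{name}: {uncompressed:5.1f} Gb/s uncompressed ({fits}), "
          f"{with_dsc:5.1f} Gb/s with 3:1 DSC")
```

By this estimate 4K120 10-bit fits uncompressed (~29.9 Gb/s), while 8K60 (~59.7 Gb/s) and 8K120 (~119.4 Gb/s) only fit once DSC is applied.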
extide - Wednesday, November 29, 2017 - link
It's probably perceptually lossless but technically lossy, just like DSC in DP. It's probably even the exact same algorithm.
Ryan Smith - Wednesday, November 29, 2017 - link
Correct. They're using a 3:1 fixed ratio lossy compression mode.
Embiggens - Wednesday, November 29, 2017 - link
I'm lost on the eARC stuff -- "In real world terms, this is meant to better allow HDMI to work without a dedicated audio receiver."? In what configuration could being able to send audio back over the same cable the video comes in on help eliminate a receiver? Am I not thinking about it right?
chaos215bar2 - Wednesday, November 29, 2017 - link
That sentence was referring to use with sound bars. At least as important, eARC is required for internal TV apps to be able to send anything better than lossy 5.1 (or uncompressed 2.0) audio to any external audio device whatsoever.
What I'd really like to see in connection with this is a newer digital audio interconnect for this kind of use. eARC is great for a true receiver with video sources connected directly to it, but it's a bit of a waste of an HDMI port if you're just using it with a sound bar.
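For context, a minimal sketch of the payload rates involved (LPCM figures computed from channel count, sample rate, and bit depth; the lossy figure is Dolby Digital's maximum bitstream rate), which is roughly why legacy ARC tops out at lossy 5.1 or stereo LPCM while lossless multichannel needs eARC:

```python
# Rough audio payload rates: LPCM is channels x sample rate x bit depth
# (framing overhead ignored); the Dolby Digital figure is its maximum
# bitstream rate. Legacy ARC is limited to roughly S/PDIF-class bandwidth,
# which is why lossless multichannel audio needs eARC.

def lpcm_mbps(channels, rate_hz, bits):
    return channels * rate_hz * bits / 1e6

formats = {
    "2.0 LPCM, 48 kHz / 16-bit":  lpcm_mbps(2, 48_000, 16),   # ~1.5 Mb/s
    "5.1 Dolby Digital (lossy)":  0.64,                        # max AC-3 rate
    "7.1 LPCM, 192 kHz / 24-bit": lpcm_mbps(8, 192_000, 24),  # ~36.9 Mb/s
}

for name, mbps in formats.items():
    print(f"{name}: {mbps:.2f} Mb/s")
```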
DanNeely - Wednesday, November 29, 2017 - link
I think letting you use the TV as the hub everything is plugged into, instead of a receiver, is probably more likely to pay off by letting you avoid replacing your current HDMI 2.0 receiver just to connect an HDMI 2.1 TV to an HDMI 2.1 source.
chaos215bar2 - Wednesday, November 29, 2017 - link
"ALLM looks at optimizing latency settings for a wide range of applications and recognizing particularly latency sensitive ones. Details were very sparse, but the given description seems to imply some unmentioned drawback or cost, as otherwise low latency settings would be enabled at all times."I assume the cost is that this will effectively enable game mode on the TV). HDMI itself should only introduce about 1 frame of latency, as this is how long it takes to transmit one frame of data. (I gather QFT changes this. Also, of course, the display doesn't actually need to wait until it has an entire frame buffered before beginning to display it, but this is how most TVs operate in practice.) Any additional delay has always been introduced on either the sending or, more commonly, the receiving side.
On an unrelated note, I hope that just maybe, with HDMI 2.1, they take the CEC spec and/or certification seriously. HDMI CEC is a great idea in theory, but in practice due to what I understand to be a somewhat vague spec and lack of strict certification, it's a complete mess. I've had to disable it (frequently due to one device or another trying to turn on and take control of the TV/receiver when an entirely different device was turned on) in more instances than it's actually worked at all, and the direct control and navigation it theoretically allows of source devices using the TV/receiver remote has simply never worked.
scook9 - Tuesday, December 5, 2017 - link
HDMI-CEC improvements would be amazing!
My FiOS boxes and Xbox do not implement it; everything else in my AV stack does. It is always so annoying to have to switch the receiver input 3 times before it sticks due to their lack of CEC support.
The excuse they give for no support: not everyone does it in a compatible manner, so we just don't even try. Never mind the fact that I have never had 2 incompatible pieces of CEC equipment across multiple vendors.
Mr Perfect - Wednesday, November 29, 2017 - link
The addition of variable refresh to the official HDMI standard means that the benefits of the technology can now be brought to a much wider range of products and content.
I wonder if this will lead to GeForce cards being able to use both FreeSync and G-Sync? I certainly hope so.
Currently you buy a card and a matching monitor, and then when it's graphics card upgrade time you buy another card from the same manufacturer to go with the existing screen. Next thing you know the screen is up for an upgrade and you get the same screen tech to go with the existing graphics card... It's just one big loop.
If one graphics card company would break out of that loop, people could buy that company's card regardless of what screen they already had.
euskalzabe - Saturday, December 2, 2017 - link
Wondering the same thing. Here's hoping NVIDIA cards will be forced to support VRR through HDMI 2.1.
Arnulf - Thursday, November 30, 2017 - link
Most of these are utterly irrelevant, but HDMI FreeSync that isn't restricted to a single vendor cannot come soon enough.