So it sounds like maybe our shared nightmare could come to an end soon? (Hello, HDR10. What's that? Dolby Vision says you need dynamic metadata for good HDR? So now we're seeing TVs shipping with HDR10+, meaning there are three competing HDR standards, and that's before anyone remembers that HDMI has long been able to carry >8-bit colour via the old Deep Color option, even if almost no consumer gear ever used it.)
Oh, and HDMI 2.1 will actually have enough bandwidth for all this data. So all these HDMI 2.0 systems that pair a 1080p colour image with a 4K greyscale one (a.k.a. 4:2:0 chroma subsampling) are going to look rather sad next to the newly finalised systems that can do 10-bit HDR signals at true 4K, rather than either subsampling the colour or dropping to 8-bit (which is a bit of a waste if you're buying a TV with a 10-bit panel). [That's somewhat worst case; hopefully your systems handshake to 4:2:2, so you get 4Kx2K greyscale with 2Kx2K colour. Still not capable of sending correct 60Hz output of what your expensive GPU has just calculated to every pixel on your 4K panel, but not literally 1080p colour either.]
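To make the arithmetic concrete, here's a rough sketch of which 4K60 formats fit down an HDMI 2.0 cable. The timing figures are my assumptions, not from the post: the standard CTA-861 2160p60 raster is 4400x2250 pixels including blanking (a 594 MHz pixel clock), and HDMI 2.0's 18 Gbit/s link carries about 14.4 Gbit/s of payload after 8b/10b encoding.

```python
# Rough bandwidth check for 4K60 over HDMI 2.0.
# Assumed figures: CTA-861 2160p60 timing is a 4400 x 2250 total raster
# (blanking included) refreshed 60 times a second, i.e. a 594 MHz pixel
# clock; HDMI 2.0's effective payload rate is 18 Gbit/s * 8/10 = 14.4 Gbit/s.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60          # 594 MHz
HDMI20_EFFECTIVE_GBPS = 18.0 * 8 / 10      # 14.4 Gbit/s after 8b/10b

# Average samples per pixel: 4:4:4 sends 3, 4:2:2 sends 2 (chroma at half
# horizontal resolution), 4:2:0 sends 1.5 (chroma at quarter resolution).
SUBSAMPLING_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def required_gbps(bit_depth, subsampling):
    """Gbit/s needed to carry 2160p60 at a given depth and chroma format."""
    bits_per_pixel = bit_depth * SUBSAMPLING_FACTOR[subsampling]
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9

for depth, sub in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2"), (10, "4:2:0")]:
    need = required_gbps(depth, sub)
    fits = "fits" if need <= HDMI20_EFFECTIVE_GBPS else "does NOT fit"
    print(f"{depth}-bit {sub}: {need:.2f} Gbit/s -> {fits} in HDMI 2.0")
```

On these numbers, 8-bit 4:4:4 just squeezes in at ~14.26 Gbit/s, 10-bit 4:4:4 needs ~17.8 Gbit/s and doesn't, while 10-bit 4:2:2 and 4:2:0 both fit comfortably, which is exactly the trade-off described above.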
This is why I'm of the view that now is still not really the time to buy a $1500 4K TV. Need something to replace your old TV? $300-600 4K TVs are pretty good right now: not the brightest for HDR, but not too bad, and they're just now getting VA panels, so they have far inkier blacks than the IPS panels they're replacing (moving from 1000:1 static contrast up to 4,000-7,000:1). But at some point you'll want HDMI 2.1 inputs, so that every pixel on the panel you're paying for actually receives the data your GPU has carefully worked out, rather than having a load of it thrown away because the cable doesn't currently have the bandwidth for HDR and true 4K at once.