On HDMI 2.1, 8K30 is about the most you can do before you have to compress the stream with DSC, which is marketed as “visually lossless”, i.e. actually lossy. We don’t even get 5K120 or 4K240 without compression, and those are common refresh rates for gaming. So the highest resolution that covers every use case without compromises is arguably 1440p, which is definitely not enough even by today’s standards.

I also think you’re underestimating how long it takes a standard to reach widespread adoption. The average viewer won’t have access to it until the technology is cheap and enough time has passed for at least a few hundred million units to reach the market. If you expect to be using it in 2030 or later for your “average broadcast”, it needs to be designed today.
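Back-of-the-envelope math, if anyone wants to check the numbers: this assumes 8-bit RGB (24 bpp) and ignores blanking intervals and protocol overhead, so real requirements are higher; 42.6 Gbps is roughly HDMI 2.1’s effective data rate after FRL encoding overhead on the 48 Gbps link. Note 5K120 lands essentially at the limit on raw pixel data alone, so with blanking or 10-bit color it doesn’t fit either.

```python
# Rough uncompressed bandwidth check for common display modes,
# assuming 8-bit RGB (24 bits/pixel) and ignoring blanking/overhead.
HDMI_2_1_EFFECTIVE_GBPS = 42.6  # ~48 Gbps link minus FRL encoding overhead

modes = {
    "8K30":     (7680, 4320, 30),
    "8K60":     (7680, 4320, 60),
    "5K120":    (5120, 2880, 120),
    "4K240":    (3840, 2160, 240),
    "1440p240": (2560, 1440, 240),
}

for name, (w, h, hz) in modes.items():
    gbps = w * h * hz * 24 / 1e9  # raw pixel data rate in Gbps
    verdict = "fits" if gbps <= HDMI_2_1_EFFECTIVE_GBPS else "needs DSC or chroma subsampling"
    print(f"{name:>9}: {gbps:5.1f} Gbps raw -> {verdict}")
```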
Of course HDMI is shit for the reasons you mention. DisplayPort is better, but it’s not an open standard either, and it supports DRM too. Designing for higher bandwidth is still necessary, though, and has nothing to do with HDMI’s problems.