It seems like HDMI 2.0 running at its full clock (600 MHz?) can only get down to around 30 ms of input lag. But when the TV downclocks it to 300 MHz (e.g. a Samsung TV in Game Mode), input lag drops to the low 20 ms range. I'm just using the latest Samsung TVs as an example. You can tell Game Mode is downclocking because of the reduced bandwidth, which limits output to 4K 60 Hz @ 4:2:2.
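For a quick sanity check on where the ~600 MHz / ~300 MHz figures come from, here's a back-of-envelope calculation. My assumptions: the standard CTA-861 4K60 timing of 4400 × 2250 total pixels (including blanking) at 8-bit colour, and that it's 4:2:0 subsampling that the HDMI 2.0 spec carries at half the TMDS clock:

```python
# Back-of-envelope TMDS clock check for 4K60 over HDMI.
# Assumption: CTA-861 4K60 timing, 4400 x 2250 total pixels incl. blanking.
H_TOTAL, V_TOTAL, FPS = 4400, 2250, 60

pixel_clock_hz = H_TOTAL * V_TOTAL * FPS   # 594,000,000 Hz

# At 8-bit 4:4:4 the TMDS clock equals the pixel clock.
tmds_444_hz = pixel_clock_hz
# HDMI 2.0 carries 4K60 4:2:0 at half the TMDS clock.
tmds_420_hz = pixel_clock_hz // 2

print(tmds_444_hz / 1e6)  # 594.0 MHz -- the "~600 MHz" full-clock figure
print(tmds_420_hz / 1e6)  # 297.0 MHz -- the "~300 MHz" downclocked figure
```

So the rounded 600/300 numbers line up with the spec's 594/297 MHz clocks, which supports the idea that Game Mode is running the link at a halved clock.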
Then, looking at competing TVs that use HDMI 2.0, all of them can only manage around 30 ms when running at their full advertised clock speed. I've never seen a 4K TV with HDMI 2.0 get under 30 ms of input lag with 4:4:4 support, LG OLEDs included.
So in theory, I think the higher clock of HDMI 2.0 actually causes higher latency, resulting in higher input lag. Am I wrong? Is there any research on this? Are we going to be stuck with HDMI 2.0 for a long time?