That means zero latency in the monitor, but how much latency is there in the drivers that convert the digital image to analogue signals? Isn’t the latency just moved to the PC side?
Fair warning before you dive in: this is a rabbit hole. Some key points (not exact, but to keep things layman-friendly): you don’t see in digital, digital is “code”. You see in analog, even on an LCD (think of sound vs. video, it’s the same idea). Early digital-only paths lacked contrast, brightness, color, basically all adjustments, so the signal went back and forth between digital and analog stages, adding even more latency.
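As a rough illustration of the “digital is code, analog is what you see” point, here is a minimal sketch (hypothetical, not any real driver code) of what a DAC stage does: it maps a numeric pixel code to the analog voltage a VGA-style output actually drives.

```python
# A minimal sketch of a DAC stage: the "digital" pixel is just a code
# number; the "analog" output is a voltage the display hardware drives.

VGA_FULL_SCALE_V = 0.7  # analog VGA video swings roughly 0-0.7 V

def dac_8bit(code: int) -> float:
    """Map an 8-bit digital code (0-255) to an analog voltage."""
    if not 0 <= code <= 255:
        raise ValueError("8-bit code must be 0-255")
    return (code / 255) * VGA_FULL_SCALE_V

# The digital value 200 is just a number until a DAC turns it into
# something you can actually see:
print(f"{dac_8bit(200):.3f} V")  # ~0.549 V
```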
Maybe think of it like a TV’s game mode, where all the picture adjustments are turned off to speed up the digital-to-analog conversion.
Or like compressed video (digital) vs. uncompressed video (analog): compression lets you send more data over the same link, but it adds latency because the video has to be compressed and decompressed at each end.
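To make that analogy concrete, here is a small sketch that times a compress/decompress round trip on a frame-sized buffer. It uses zlib purely as a stand-in for a real video codec (real codecs and real frame data behave very differently, so the numbers are only illustrative): the point is that latency is paid once at each end.

```python
import time
import zlib

# One 1080p RGB frame worth of bytes (hypothetical test payload;
# an all-zero buffer compresses unrealistically well).
frame = bytes(1920 * 1080 * 3)

t0 = time.perf_counter()
packed = zlib.compress(frame)       # latency paid at the sending end
t1 = time.perf_counter()
zlib.decompress(packed)             # latency paid again at the receiving end
t2 = time.perf_counter()

print(f"compress:   {(t1 - t0) * 1000:.1f} ms")
print(f"decompress: {(t2 - t1) * 1000:.1f} ms")
print(f"size: {len(frame)} -> {len(packed)} bytes")
```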