Comment on Why do low framerates *feel* so much worse on modern video games?
tomkatt@lemmy.world 1 day ago
Couple of things. First, frame timing is critical, and modern games aren’t programmed as close to the hardware as older games were.
Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).
Lastly, with the newest displays like OLED, the way the screen updates can make lower frame rates look really jerky. It’s why TVs have all that post-processing and why there are no “dumb” TVs anymore. Removing the post-processing improves input delay, but it also removes everything that makes the image smoother, so higher frame rates are your only option there.
Lojcs@lemm.ee 1 day ago
First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell that’s a selling point that even some non-oled displays emulate with backlight strobing, not something displays try to get rid of.
Also, the inherent LCD latency thing is a myth; modern gaming monitors have little to no added latency even at 60hz, and at high refresh rates they’re faster than 60hz CRTs.
tomkatt@lemmy.world 1 day ago
I don’t understand all the technicals myself, but it has to do with the way every pixel in an OLED is individually self-lit. Pixel transitions are essentially instant, and because there’s no ghosting whatsoever to blend frames together, low-framerate motion can look very stilted.
That’s a misunderstanding. CRTs technically don’t have a fixed refresh rate; the only limit is the speed of the beam. Standards were settled on based on power-line frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
The LCD latency has to do with input polling and timing based on display latency and polling rates. Also, there’s added latency from things like wireless controllers as well.
The actual frame rate of the game isn’t necessarily relevant: if you run a game at 60 fps on a 120 Hz display and enable black frame insertion, you get reduced input latency even at 60 fps, because doubling the display’s refresh rate increases the polling rate, which is tied to frame timing.
This is why, for example, the Steam deck OLED has lower input latency than the original Steam Deck. It can run up to 90Hz instead of 60, and even at lowered Hz has reduced input latency.
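A rough sketch of the arithmetic behind that, assuming (as the comment does, not something I’ve verified for any specific device) that input polling is tied to the display refresh — in that case the average wait for the next poll is half the refresh period:

```python
# Assumption from the comment above: polling is tied to display refresh,
# so average added wait is half the refresh period. Illustrative only.
def avg_poll_wait_ms(refresh_hz):
    period_ms = 1000 / refresh_hz
    return period_ms / 2

print(avg_poll_wait_ms(60))   # 60 Hz display
print(avg_poll_wait_ms(120))  # same 60 fps game, but 120 Hz display with BFI
```

Doubling the refresh rate halves that wait, which matches the Steam Deck OLED example: 90 Hz vs 60 Hz cuts the average refresh-tied wait by a third even before any panel improvements.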
Also, regarding LCD, I was referring more to TVs, since we’re talking about old games (I assumed consoles). Modern TVs do a lot of post-processing compared to monitors, and in a lot of cases there’s gonna be some delay because it’s not always possible to turn it all off. The lowest-latency TVs I know of are LG’s, at as low as 8 or 9 ms, while Sony tends to be awful, between 20 and 40 ms even in “game mode” with processing disabled.
moody@lemmings.world 1 day ago
Essentially, the speed of the beam determined how many lines you could display, and the more lines you tried to display, the slower the screen was able to refresh. So higher resolutions would have lower max refresh rates. Sure, a monitor could do 120 Hz at 800x600, but at 1600x1200, you could probably only do 60 Hz.
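The trade-off works out because the beam can only draw so many lines per second, so lines-per-frame times refresh rate can’t exceed the monitor’s horizontal scan rate. A worked example with hypothetical numbers (the 75 kHz scan rate and blanking allowance are illustrative, not specs of any real monitor):

```python
# Hypothetical CRT with a 75,000 lines/sec horizontal scan rate.
# Real CRTs also spend some lines in vertical blanking; a flat
# 25-line allowance is assumed here for simplicity.
H_SCAN_RATE = 75_000  # lines drawn per second

def max_refresh_hz(visible_lines, blanking_lines=25):
    return H_SCAN_RATE / (visible_lines + blanking_lines)

print(round(max_refresh_hz(600)))   # 800x600  -> 120
print(round(max_refresh_hz(1200)))  # 1600x1200 -> 61
```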
Lojcs@lemm.ee 1 day ago
That’s why I specified 60hz :)
I see that you meant TVs specifically, but I think it’s misleading to call processing delays ‘inherent’, especially since the LG TV you mentioned (which I assume runs at 60hz) is close to the minimum possible latency of 8.3ms (half of a 60hz frame period).
tomkatt@lemmy.world 1 day ago
True, but even that is higher than the latency was on the original systems on CRT. My previous comments were specific to display tech, but there’s more to it.
Bear in mind I can’t pinpoint the specific issue for any given game but there are many.
Modern displays, even the fastest ones, have frame buffers for processing the image before it’s displayed. That’s one link in the latency chain. Even if the output were otherwise just as fast as a CRT, this would add latency in 100% of cases, since CRT was an analogue technology with no buffers.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
I mentioned TVs above re: post processing.
Modern games sometimes add delays to synchronize data between the CPU and GPU.
Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”
Modern hardware is much more complex and despite the hardware being faster, the complexity in communication on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.
Those are some examples I can think of. None of them alone would be that much latency, but in aggregate, it can add up.
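A toy model of that aggregation — every number here is a made-up illustrative estimate, not a measurement of any real system:

```python
# Made-up per-stage latency estimates (ms) to show how small
# contributions along the chain add up. Not measurements.
latency_chain_ms = {
    "wireless controller poll": 4.0,
    "game logic / CPU-GPU sync": 8.0,
    "GPU frame buffering (1 frame @ 60 fps)": 16.7,
    "display processing / buffer": 5.0,
    "panel scanout (average)": 8.3,
}

total = sum(latency_chain_ms.values())
for stage, ms in latency_chain_ms.items():
    print(f"{stage}: {ms} ms")
print(f"total: {total:.1f} ms")  # 42.0 ms, roughly 2.5 frames at 60 fps
```

No single stage stands out, but the total is several frames’ worth of delay — which a CRT-era console, racing the beam with no buffers, simply never accumulated.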