
I was trying out FSR4 on my RX 6800 XT on Fedora 42. It works really well and easily beats FSR3 in visuals, even on Performance. It does carry a significant performance hit vs FSR3, but it still works out a bit faster than native rendering on Quality.

[-] victorz@lemmy.world 0 points 3 weeks ago

Input lag is caused by frame interpolation, right? Or nah?

[-] DarkAri@lemmy.blahaj.zone 0 points 3 weeks ago* (last edited 3 weeks ago)

It's because game logic is calculated on real frames, and these features lower the real frame rate even though they give you more rendered frames. If you were getting 40 real FPS and drop to 30 real FPS, you will feel a significant amount of lag even if you're getting 60 FPS in fake frames. Basically, the game loop is running slower, and stuff like input polling is happening less often even though the displayed frame rate is higher.
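
A minimal sketch of that in C++ (hypothetical names, not any real engine's API): input polling and the simulation step run once per real frame, so generated frames never speed them up.

```cpp
// Hypothetical sketch, not a real engine: input is sampled once per *real*
// frame, so a lower real frame rate means slower input polling, no matter
// how many interpolated frames the GPU displays in between.
#include <chrono>
#include <cstdio>

static int frames_left = 3;                  // stand-in for the real loop condition
static bool game_running() { return frames_left-- > 0; }
static void poll_input()   { std::puts("poll input (real frame)"); }
static void update_simulation(double dt) { std::printf("simulate dt=%.4fs\n", dt); }
static void render_real_frame() { std::puts("render real frame"); }

int main() {
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();
    while (game_running()) {
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - previous).count();
        previous = now;

        poll_input();            // sampled once per *real* frame only
        update_simulation(dt);   // physics, game logic, etc.
        render_real_frame();
        // Frame generation would run after this point, on the GPU: it adds
        // display frames but never re-runs poll_input() or the simulation.
    }
}
```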

[-] victorz@lemmy.world 0 points 3 weeks ago

Frame generation shouldn't be a bottleneck on the CPU though, should it? That stuff is happening on the GPU. I know I saw a video about this stuff but I can't remember the real reason input lag increases with frame generation/interpolation.

[-] WereCat@lemmy.world 0 points 3 weeks ago

It's not. The whole point of FG was to take advantage of high-refresh-rate monitors, as most games can't render 500 FPS even on the fastest CPU… alas, here we are with games requiring FG to get you to 60 FPS on most systems (*looks at Borderlands 4 and Monster Hunter Wilds*).

[-] victorz@lemmy.world 0 points 3 weeks ago

Right, but FG shouldn't be touching the CPU in any way, should it? It should be a local thing on the GPU transparent to the CPU, unless I'm misunderstanding how it works.

[-] WereCat@lemmy.world 1 points 3 weeks ago
[-] victorz@lemmy.world 1 points 3 weeks ago

Then I don't understand how it would affect the game loop negatively. I'll look into it though, will do some research.

[-] WereCat@lemmy.world 0 points 3 weeks ago

It's kinda the same thing. You get input lag based on the real framerate. Since interpolation requires some extra performance, the base framerate will likely be a bit lower than the framerate without interpolation, which will cause an increase in input lag while providing a smoother image.
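
Rough numbers to illustrate, using the 40-to-30 example from earlier in the thread (assumed figures, not measurements):

```cpp
// Illustrative arithmetic only, not measured data: if interpolation costs
// enough GPU time to drop the real frame rate from 40 to 30 FPS, the time
// between input polls (the floor on input lag) rises even though the
// display shows 60 "frames" per second.
#include <cstdio>

int main() {
    const double real_fps_off = 40.0;   // assumed base frame rate, FG off
    const double real_fps_on  = 30.0;   // assumed base frame rate, FG on
    const double shown_fps_on = 60.0;   // 30 real + 30 generated

    std::printf("FG off: %.1f ms between input polls\n", 1000.0 / real_fps_off);
    std::printf("FG on:  %.1f ms between input polls (display shows %.0f FPS)\n",
                1000.0 / real_fps_on, shown_fps_on);
}
```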

[-] victorz@lemmy.world -1 points 3 weeks ago

It seems the extra input lag is more perceived than actually experienced, from what I understand. Like if you go from 30 to 120 FPS, you expect the input lag to decrease, but since it stays the same (or gets slightly worse), you perceive it as much more severe.

[-] DarkAri@lemmy.blahaj.zone 0 points 3 weeks ago

The frame rate isn't going from 30 to 120 FPS. It's actually going from 30 to something like 20. Rendered frames are different from CPU frames, which drive the game loop (physics, input, simulation, etc.).

[-] victorz@lemmy.world 0 points 3 weeks ago

Not sure we have the same definition of frames here.

[-] DarkAri@lemmy.blahaj.zone 0 points 3 weeks ago

Generated frames are created using a neural network; they have nothing to do with the actual game scripts, game loop, input polling, and so on. FSR does generate frames to interpolate between real frames, but things like physics and input are not being generated along with them. It's only visual. I guess you need some basic knowledge of how a computer program and a game engine work to understand this.

Basically, the CPU steps through the simulation in discrete steps. When you use frame gen, if it lowers the actual frame rate, then the CPU is making fewer loops per second over everything: physics updates, input polling (capturing key presses and mouse events), and other stuff like that.
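
A toy illustration of the "only visual" point, using plain linear interpolation in place of FSR's actual motion-vector/neural approach (all names here are made up):

```cpp
// Toy illustration of why generated frames are visual-only: an interpolated
// frame is blended from two already-simulated real frames. Nothing here
// touches input, physics, or game state. (FSR's real approach uses motion
// vectors and a trained network; plain lerp is just to show the idea.)
#include <cstdio>

struct Position { double x, y; };

// Blend between the two most recent *real* frames; t in [0, 1].
static Position interpolate(Position prev, Position next, double t) {
    return { prev.x + (next.x - prev.x) * t,
             prev.y + (next.y - prev.y) * t };
}

int main() {
    Position real_frame_n_minus_1 {0.0, 0.0};   // already simulated
    Position real_frame_n         {10.0, 4.0};  // already simulated
    Position generated = interpolate(real_frame_n_minus_1, real_frame_n, 0.5);
    std::printf("generated frame shows (%.1f, %.1f), but the game state "
                "was never stepped here\n", generated.x, generated.y);
}
```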

[-] victorz@lemmy.world 1 points 3 weeks ago

Oh yeah, now I remember why there's more input lag with frame interpolation turned on. Taking a shot right now, and it just popped into my head.

Anyway, it's because while frame interpolation adds more frames per second, the "I-frames" (the real frames) you're seeing lag behind by one I-frame. That's because it can't start showing you interpolated frames until it has two frames to interpolate between.

So you won't start seeing I-frame N-1 until I-frame N (the latest I-frame) has been generated, thus creating extra input lag.
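
Putting a rough number on that buffering delay (illustrative, and assuming the interpolator holds back exactly one real frame):

```cpp
// Illustrative only: interpolation between real frames N-1 and N can't be
// displayed until frame N exists, so the image you see trails the newest
// simulated state by roughly one real frame time (plus any FG overhead).
#include <cstdio>

int main() {
    const double real_fps = 30.0;                   // assumed real frame rate
    const double frame_time_ms = 1000.0 / real_fps;
    std::printf("At %.0f real FPS, holding back one frame adds about %.1f ms "
                "of extra display latency.\n", real_fps, frame_time_ms);
}
```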

Someone correct me if I'm wrong, I'm supposed to be asleep...

[-] WereCat@lemmy.world 0 points 3 weeks ago

Yes, that's why FPS in this case is not a good measure of performance.

[-] victorz@lemmy.world 1 points 3 weeks ago

Very much so. The whole reason we want more FPS is to get less input lag, at least that's my personal take. It's the only reason I have a beefy computer: so the game can respond quicker (and give me feedback quicker as well).
