What are your thoughts on frame generation? The debate seems to revolve around input latency.
I suppose this is something that should be examined on a game-by-game basis: if you’ve maxed out a game at 4K, say, and you’re getting roughly 50 fps with frame generation off but 70 with it on, then you might want to leave it on.
If, on the other hand, you’re already well above 60 fps, then you might decide to leave frame gen off due to the increased input latency.
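To put rough numbers on that tradeoff, here’s a minimal sketch. The latency model is an assumption on my part (interpolation-style frame generation has to hold back one native frame, plus some fixed overhead; the 3 ms figure is illustrative, not measured):

```python
def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

def est_added_latency_ms(native_fps: float, overhead_ms: float = 3.0) -> float:
    """Assumed model: frame gen holds one native frame, plus fixed overhead.
    Illustrative only -- real implementations vary."""
    return frame_time_ms(native_fps) + overhead_ms

print(frame_time_ms(50))         # 20.0 ms per native frame
print(frame_time_ms(70))         # ~14.3 ms per displayed frame with frame gen on
print(est_added_latency_ms(50))  # ~23 ms of assumed extra input latency
```

So in the 50-to-70 fps scenario above, motion smoothness improves while, under this assumed model, input responds about one native frame later than it otherwise would.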
Also, wouldn’t it depend on the nature of the game itself? A slow RPG might benefit from frame generation, whereas a fast-paced FPS might suffer.
All of this is theoretical, of course, so what are your real-world experiences?
I’m playing Cyberpunk at 4K (max settings with path tracing enabled) on my new 4090 right now, and I have frame gen enabled in order to stay above 60 fps. Alternatively, I could drop both the path tracing and the frame gen and play at nearly 100 fps.
Thoughts on this?