If I have a 300Hz display but my GPU only puts out 100 frames a second, is there any benefit?

BenWah
I'm wondering if it is worth paying a premium for a display with a much higher refresh rate than my GPU can match.

Does it provide 0 benefit? A little? Less motion blur or something?
Yes, this is a naive question; I really don't know.
 
No benefit, just headroom. The display might have very good response times so that it can handle that 300 Hz though.

I'd rather have a 1440p 240 Hz display than a 1080p 360 Hz display.
 
Depends on how good the signal processing in the monitor is. You can still see a benefit in input lag if the games you play don't capture input at a fixed rate. And regardless of the framerate you will still get an increase in motion clarity if the pixel response time is fast enough to keep up with the refresh rate. You would need to use something like scanline sync in order to eliminate the inevitable tearing, though, since you're not going to want to use triple buffered V-Sync.

If you use G-SYNC then all that headroom might go wasted, in which case I would look for a better monitor with a max refresh rate closer to the performance you can expect with your PC. Max refresh rate with G-SYNC is ideal when it is about 1.5 times your average framerate, so if you get 100 FPS then a 144-165 Hz monitor would be ideal.
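As a quick sketch of that 1.5x rule of thumb (plain Python; the ratio and the 100 FPS figure are taken from the paragraph above):

```python
# Minimal sketch of the rule of thumb above: with G-SYNC, aim for a max
# refresh rate around 1.5x your average framerate so your framerate
# fluctuations stay comfortably inside the VRR range.
def target_max_hz(avg_fps: float, ratio: float = 1.5) -> float:
    return avg_fps * ratio

print(target_max_hz(100))  # 150.0 -> a 144-165 Hz monitor is a good fit
```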
 
The main issue with gaming at a high refresh rate is keeping the minimum FPS above the refresh rate, or it will stutter like mad. Having the latest and greatest graphics card doesn't matter too much because you can always turn down the eye candy until it keeps up, but the CPU/RAM has to be up to the job. I don't play competitively, but my son does.
 
The main issue with gaming at a high refresh rate is keeping the minimum FPS above the refresh rate, or it will stutter like mad. Having the latest and greatest graphics card doesn't matter too much because you can always turn down the eye candy until it keeps up, but the CPU/RAM has to be up to the job. I don't play competitively, but my son does.
I'm not sure I follow. That would mean to me that on a 300Hz display you'd want your minimum frame rate to never drop below 300? Sheesh, my 3090 never sees framerates that high, but I also don't play at 1080p.

I'm perfectly content with my 100Hz G-Sync 3440x1440 ultrawide for single-player games and some, as of late, casual Apex Legends (game is super fun). I know a lot of these competitive gamers are still playing at 1080p, which is crazy to me, having moved to 2560x1600 in 2010.
 
I'm not sure I follow. That would mean to me that on a 300Hz display you'd want your minimum frame rate to never drop below 300? Sheesh, my 3090 never sees framerates that high, but I also don't play at 1080p.

I'm perfectly content with my 100Hz G-Sync 3440x1440 ultrawide for single-player games and some, as of late, casual Apex Legends (game is super fun). I know a lot of these competitive gamers are still playing at 1080p, which is crazy to me, having moved to 2560x1600 in 2010.

Yeah, it's pretty much a 1080P thing right now. CS:GO, Overwatch, etc. All fast-twitch speed, quality or realism not important. It's not my style of gaming either, but I respect the difference in preference.
 
That's my main problem with high-refresh monitors. I had a 144Hz G-SYNC Ultimate monitor earlier and the stuttering was terrible when the framerate dropped under 110. Eventually I just settled on a custom 100Hz setting and no more stutters.
 
There is a benefit.
The frame is still drawn on the screen, and that takes time.
A 144Hz monitor draws a frame in about 6.94ms and a 360Hz monitor in about 2.78ms. The difference between these numbers is the input lag improvement at the bottom of the screen. Half that in the middle.
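To make the arithmetic concrete, here is a minimal sketch (plain Python) using the numbers from the post:

```python
# A monitor scans out one refresh top-to-bottom in 1/Hz seconds, so the
# bottom of the screen updates that much later than the top, and the
# middle updates half as late.
def scanout_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (144, 360):
    full = scanout_ms(hz)
    print(f"{hz} Hz: full scanout {full:.2f} ms, mid-screen {full / 2:.2f} ms")

# 144 Hz: full scanout 6.94 ms, mid-screen 3.47 ms
# 360 Hz: full scanout 2.78 ms, mid-screen 1.39 ms
# Bottom-of-screen improvement: 6.94 - 2.78 = ~4.17 ms; half that mid-screen.
```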
Another benefit is video. The higher the refresh rate, the better: 60fps video on a 60Hz display is notoriously difficult to synchronize, especially on YT. It's much better at 120Hz, since any timing error from a frame drawn too fast or too slow is less visible. If videos used VRR this wouldn't be an issue, but somehow that isn't a thing.

Generally, if we're talking about these 1920x1080 360Hz monitors, I'd rather recommend something like 2560x1440 240Hz or 3840x2160 160Hz, simply because of the 1920x1080 resolution.
1920x1080 on a 3840x2160 monitor looks very nice when using integer scaling.
 
No benefit, just headroom. The display might have very good response times so that it can handle that 300 Hz though.

You forgot another benefit: Input lag.

Assuming GtG is the same, and you’re not getting degraded quality (due to a specific higher Hz panel), you have lower input lag for 100fps at 240Hz than 100fps at 144Hz.

Your 100fps frame can fully display in 1/240sec, instead of 1/144sec, reducing input lag.

Higher Hz is always good for reducing input lag of lower frame rates, if all else is equal.

Same GtG heatmap, same monitor motherboard processing performance, etc.

However, a superior-quality 144Hz monitor can be better than a bottom-of-the-barrel 240Hz one.

Also, for VRR fans (FreeSync, G-SYNC), there is beauty in having your framerate range fit completely within your VRR range. A 50fps-200fps framerate fluctuation fits entirely inside a 240Hz G-SYNC monitor without needing a cap.
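A minimal range-check sketch (plain Python; the 48 Hz lower bound is my assumption of a typical VRR floor, not something stated in this thread):

```python
# A framerate range that stays entirely inside the monitor's VRR window
# needs no cap: every frame is displayed the moment it is ready, with no
# tearing and no V-Sync-style queueing at the top of the range.
def fits_vrr(fps_min: float, fps_max: float, vrr_min: float, vrr_max: float) -> bool:
    return vrr_min <= fps_min and fps_max <= vrr_max

# 48 Hz is a hypothetical, typical VRR floor (not from the post).
print(fits_vrr(50, 200, 48, 240))  # True  -> 50-200fps fits a 240Hz G-SYNC panel
print(fits_vrr(50, 200, 48, 144))  # False -> a 144Hz panel would need a cap
```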
 
There is a benefit.
The frame is still drawn on the screen, and that takes time.
A 144Hz monitor draws a frame in about 6.94ms and a 360Hz monitor in about 2.78ms. The difference between these numbers is the input lag improvement at the bottom of the screen. Half that in the middle.
This is true for VRR and VSYNC ON.

With VSYNC OFF, the latency of the screen location does not matter because the frameslices are streaming realtime onto the scanout.

Depending on where the various frameslices starts, the bottom edge can sometimes (intermittently) have less input lag than top edge.
  • You have +0ms lag right underneath a tearline.
  • You have +(frametime) lag right above a tearline.

[Image: scanout filmstrip, 120Hz display at 360fps, VSYNC OFF]

(From the Blur Busters Area 51 Scanout Latency FAQ)

A frameslice is the portion of a frame that manages to display between two consecutive tearlines. If a frameslice begins at the last tearline on screen, it continues refreshing at the top of the NEXT refresh cycle.

For random tearlines at randomized framerates, the entire screen surface has identical latency (TOP = CENTER = BOTTOM) when averaged over hundreds of samples, assuming cable scanout and panel scanout are synchronized.

Obviously, the higher the Hz during VSYNC OFF, the more of the image is visible between tearlines for a given frametime. In other words, you get a bigger frameslice for the same framerate at higher Hz. So a 1/300sec frametime yields a fractional-screen frameslice at 144Hz but a more-than-one-screenful frameslice at 360Hz.
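A minimal sketch (plain Python) of that frameslice arithmetic, using the 1/300sec frametime from above:

```python
# During VSYNC OFF, a frame rendered in 1/fps seconds occupies
# (1/fps) / (1/hz) = hz / fps of a screen height before the next tearline.
def frameslice_screens(hz: float, fps: float) -> float:
    return hz / fps

print(frameslice_screens(144, 300))  # 0.48 -> fractional-screen frameslice
print(frameslice_screens(360, 300))  # 1.2  -> more than one screenful
```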
 
TLDR - run the original scenario with G-SYNC (or similar) and you're effectively down to 100Hz, but with the best compromise of lag and image stability (I think?)
 
TLDR - run the original scenario with G-SYNC (or similar) and you're effectively down to 100Hz, but with the best compromise of lag and image stability (I think?)
Incidentally, G-SYNC is good for low-latency low framerates.

G-SYNC is known as the world's lowest-lag "non-VSYNC-OFF" technology for low-framerate material, which makes G-SYNC very popular for lowering emulator latency. It's preferred over VSYNC ON (which is otherwise mandatory for things like emulators or other 60fps self-capped content like fighting games).

Only VSYNC OFF has less lag, but you can't use it with certain software such as emulators.

The higher the max Hz of your VRR range, the lower the lag of your low framerates becomes. A 60fps frame on a 280 Hz monitor is displayed in only 1/280sec during G-SYNC mode.

(Plus any GtG lag and tens to hundreds of microseconds of tape-delay-style lag from scaler processing -- things like picture/overdrive/color processing in the scaler/TCON. On most VRR panels nowadays, this is done in a line-buffered manner (pixel-row buffers) rather than a full-refresh manner (a full frame buffer in the monitor).)
 
Another thing to consider is frametime variance. Even if your average FPS counter says 100 fps, half the frames could be at 50 fps and the other half at 150 fps (as an example) and it would average to 100 fps.

If you were on a 100 Hz monitor, you would lose the benefit of those faster frames, and also get stutter when the frames took longer. But on a 300 Hz display (with VRR) each frame would appear as it was rendered, reducing stutter and also becoming smoother (since all the frames would be shown).
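A minimal sketch (plain Python) of how that averaging hides variance, using the 50/150 split from the example above:

```python
# Alternate 50 fps frames (20 ms) and 150 fps frames (~6.67 ms).
frametimes_ms = [1000 / 50, 1000 / 150] * 50

# A counter that averages each frame's instantaneous fps reports 100 fps...
avg_instant_fps = sum(1000 / t for t in frametimes_ms) / len(frametimes_ms)

# ...but actual throughput (frames delivered per second) is lower.
throughput_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000)

print(f"fps-counter average: {avg_instant_fps:.0f} fps")  # 100
print(f"actual throughput:   {throughput_fps:.0f} fps")   # 75

# On a 100 Hz fixed-refresh display, every ~6.67 ms frame waits for the next
# 10 ms refresh slot and the 20 ms frames cause visible stutter; on a 300 Hz
# VRR display, each frame can scan out as soon as it is rendered.
```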
 