Why use V-Sync AND a frame limiter?

I see this blurb on blur busters:
[attached screenshot: framerate.JPG]


but I still don't understand why vsync is needed if you set a frame rate limit. If vsync only kicks in when FPS goes above your monitor's refresh rate, wouldn't limiting the frame rate to stay *below* the monitor's refresh rate mean vsync should/would never kick in to begin with?
 
If you limit the frames your GPU can output to the limit of your monitor, the GPU won't keep rendering unnecessary frames in the background. If you don't have vsync on, this is when you'd get tearing, because the GPU sends whatever new frame it has and the monitor receives it mid-refresh. With vsync you avoid tearing, but the GPU will keep rendering despite being "done" with all the frames it needs to output. If it starts processing new frames and then the monitor requests more, there's an opportunity for extra lag, because the GPU was busy rendering a frame that is now outdated/useless and the monitor has to wait until a new frame is done. The latter is what you avoid by telling the GPU to render the max frames the monitor can take and then wait to process new frames until they're needed.

I’m sure someone here will give you a better and more technical explanation, but as far as I understand it that’s the gist of it.
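If it helps to picture it, here's a toy Python sketch of the no-vsync case (my own made-up numbers, purely illustrative, not anything from Blur Busters):

# Toy model: a 60 Hz monitor scanning out top-to-bottom, and a GPU that swaps
# buffers the instant a frame is ready (i.e. vsync off). Numbers are made up.
REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ      # ~16.7 ms to scan out one full refresh

def tear_position(swap_time_ms):
    """How far down the screen a mid-scanout buffer swap lands (0 = top, 1 = bottom)."""
    return (swap_time_ms % SCANOUT_MS) / SCANOUT_MS

# Without vsync, the swap happens whenever the frame is done, wherever the scanout is:
for done_ms in (3.0, 9.5, 14.2):
    print(f"frame ready at {done_ms:4.1f} ms -> tear line {tear_position(done_ms):.0%} down the screen")

# With vsync, the swap is held until the next refresh boundary, so there's never a
# mid-scanout swap (no tear) - at the cost of the GPU sometimes sitting and waiting.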
 
If I remember correctly, V-Sync enabled alongside G-Sync helps keep the frametimes more consistent; it doesn't actually "turn on" its normal behavior until you go past the refresh rate, at which point it adds input lag.
 
With g-sync enabled, the v-sync option controls an optional function of g-sync that compensates for sudden frametime spikes, which can and will happen in almost all games. So if it's not enabled, those spikes can cause tearing even while you are in the g-sync range.
 
The technical details of why v-sync + g-sync works this way have always been confusing to me, and it doesn't help that you have to manually enable it in the NVCP with no real context as to why. It absolutely makes for a smoother experience though, compared to g-sync alone, and doesn't increase latency. That being said, not all games out there will play nice with v-sync being force-enabled. Most are fine, but at least once a year I run into a game that has issues with it.
 
To add to the confusion, the advice to use VSync with GSync is written for monitors with the GSync module, and it's not apparent whether it also applies to GSync Compatible monitors without the Nvidia GSync module.
 
It's the same on monitors without the g-sync module; some of the guys from Blur Busters have confirmed it, and so has Battle(non)sense. I guess what used to be handled by the module is now simply done in software. And I think it works exactly the same way with AMD and FreeSync.

Either way, the tearing that happens with VRR when v-sync is disabled is generally fairly subtle and limited to a tiny part of the screen, so a lot of people will claim they never have tearing. And honestly they're not entirely wrong - if you can't see it, it may as well not be happening :)
 
Thanks guys, but I'm not sure anyone answered my actual question, unless I don't understand the explanations provided -- I get why VSync is useful with G-Sync. My question is: if we cap the frame rate so it never goes *past* the monitor's sync range, VSync shouldn't activate, because the frame rate will never reach the refresh rate threshold that would cause vsync to kick in anyway.

Correct me if I'm wrong, but vsync isn't always active just because it's enabled; it only kicks in if you go above your monitor's refresh rate (which is when tearing would otherwise occur).

So if you limit the frames, vsync should never kick in anyway - right? (Clearly not, otherwise Blur Busters wouldn't say to turn on both.) So why do they recommend both a frame limiter AND vsync if the frame limiter will nullify vsync? (Again, I'm obviously not understanding, so I'm trying to get clarification on why both are still needed.)
 
Perhaps this explains it.
Just to be clear, it doesn't matter if the GPU is rendering faster or slower than the monitor's refresh rate. If you're getting an fps that isn't a multiple or a 1/x fraction of the monitor's refresh rate, there will be tearing. And even if you lock your fps to 60, 120, 30, etc., you'll still get tearing if you don't have something like v-sync to take care of the timing, because the frame limiter doesn't align the frames with the monitor's refresh cycle.

Test it out. Take a game with frame limiting and set it to 60 fps and play it. Then enable vsync with frame limiting and see if you notice a difference.

I can tell you from experience that simply using a frame limiter will not eliminate screen tearing.
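To put some made-up numbers on that: even a perfect 60 fps cap on a 60 Hz panel fixes the interval between frames but not their phase relative to the refresh, so without vsync the tear just sits at the same spot on every refresh. Rough Python sketch, illustrative only:

# Hypothetical: 60 fps cap on a 60 Hz panel, with frames happening to finish 5 ms
# into each scanout. The limiter controls the interval, not the phase.
REFRESH_MS = 1000 / 60
FRAME_MS = 1000 / 60                # capped to exactly the refresh rate
PHASE_MS = 5.0                      # arbitrary offset into the scanout

for n in range(3):
    swap_ms = PHASE_MS + n * FRAME_MS
    tear = (swap_ms % REFRESH_MS) / REFRESH_MS
    print(f"frame {n}: tear line at {tear:.0%} of the way down - same spot every refresh")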
 

Read above: when you are using g-sync, the v-sync option is no longer just "the v-sync option", it controls a core (but optional) functionality of g-sync (in addition to enabling v-sync once you hit your refresh rate).

Originally, turning off v-sync was not even possible with g-sync (for the first 2-3 years after g-sync's release). It was literally greyed out, because you get the best experience by having it enabled. They later opened up the option because AMD did it when they released FreeSync. But having it optional is a dubious feature if you ask me, especially when you don't explain to your users what it means and they have to go to third-party sites to find out why they are getting tearing with VRR enabled (the nvidia forum is still full of users complaining about g-sync not working because of it).
 
Thanks, I wish Nvidia would be clearer on this for GSync across the different module versions and the GSync Compatible monitors. GSync seems to work fine without VSync and with a frame limiter on my Samsung FreeSync monitor; I can open up the monitor's on-screen display and see the monitor's frequency following the frame rate, and frame rate compensation kicks in when the frame rate drops below the VRR range. An in-game frame limiter normally works just fine. If the game does not have a frame rate limiter, then VSync via the control panel might be the second best option. Third-party frame rate limiters may add additional lag overall, so they're maybe the last choice. On AMD cards I think Chill works wonderfully well for this; using Enhanced Sync with FreeSync I've seen screen tearing, which it is supposed to minimize but doesn't.
 
I've noticed on AMD GPUs, I can safely disable V-Sync and there is no tearing with FreeSync and a frame limiter (or Radeon Chill).

After discovering this, I stopped forcing V-Sync on in Nvidia and it still works. Not sure if they updated something, or if it has to do with the fact that I have a FreeSync monitor now, but I don't see any tearing.

Test it yourself. Leaving V-Sync off seems to be the better option, as long as you can ensure you stay within the sync range.
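The usual arithmetic behind capping a few fps below the max refresh (using a 165 Hz panel and a 162 fps cap purely as example numbers - swap in your own) looks like this:

# With the cap slightly below the max refresh, every frame takes a bit longer than
# the panel's fastest refresh, so VRR can always wait for the next frame.
refresh_interval_ms = 1000 / 165    # ~6.06 ms, the fastest the panel can refresh
capped_frametime_ms = 1000 / 162    # ~6.17 ms between frames with the limiter

print(f"panel min refresh interval: {refresh_interval_ms:.2f} ms")
print(f"capped frametime:           {capped_frametime_ms:.2f} ms")
print("stays inside the VRR range:", capped_frametime_ms > refresh_interval_ms)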
 
Hmm, I'm even more confused now. I disabled VSync in NVCP and went to play a game with a built-in forced 60 FPS + VSync mode (The Evil Within). I didn't get tearing (perhaps because vsync is forced), but my display OSD shows the vertical refresh to be 165 Hz... even though the game option claims 60 FPS/VSync, *AND* I have a 162 FPS limit set in NVCP.

How can this be explained? Shouldn't the monitor display 60 Hz? Or if the in-game vsync is broken, shouldn't it display 162 at most? But it displays 165.1-165.3 Hz and no tearing.
 

Sounds like VRR wasn't working in that game and you were in fixed refresh mode with good old v-sync. Or the display OSD is busted and lying to you. Maybe the game is not running in true fullscreen mode?
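If it really was fixed refresh plus v-sync, the OSD reading also makes sense - the panel keeps scanning at its full rate and each game frame is simply repeated. Rough arithmetic with the numbers you mentioned:

# Fixed 165 Hz refresh with the game v-synced to 60 fps: the panel still refreshes
# 165 times per second (which is what the OSD reports); each game frame is just
# shown for several refreshes in a row.
panel_hz = 165
game_fps = 60
print(f"refreshes per game frame: {panel_hz / game_fps:.2f}")   # 2.75
# 2.75 isn't a whole number, so frames get held for 2 or 3 refreshes, which is
# also why 60 fps v-synced on a fixed 165 Hz refresh can feel a bit uneven.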
 
You're right, it wasn't working in this game. I verified by enabling the g-sync watermark from the NVCP, and it wasn't showing in this game until I changed the G-Sync setting from "full screen" to "windowed and full screen". So I guess the game isn't running in true full screen.

But now the Hz changes from 48 to 165 every second and it's all stuttery and low-FPS feeling. I tried playing around with low latency mode (ultra/on/fast) and changing the max frame rate in NVCP from 161 down to 50 (it still toggled between 48 and 165 in-game), and also turning the max frame rate setting off... same results.

The game works fine at a fixed refresh rate, no complaints. But it would be cool to know the technical reason why it behaves so oddly - is the game bugged with G-Sync? Or is it because FreeSync / windowed VRR with an Nvidia GPU is buggy? And why is my frame limit in NVCP not applying regardless... how would I go about fixing the stutter if I did want to try keeping VRR?
Too many questions right now!! This is very confusing to me.


edit: narrowed it down to the FPS limit option in NVCP. Setting any frame limit there causes these weird judders/frame drops in the game. If I turn off the frame limit in NVCP it's more stable - there are still some judders, but far, far fewer than with the frame limit on. I think this game just doesn't work well with VRR, at least not on my nVidia GPU/FreeSync monitor setup.

edit2: and now I can't get VRR to activate in the game again no matter what I try... shrug.
 
Windowed g-sync is broken right now; it's not a game issue. I've talked about it before, and there's a thread with a guy who did a lot of testing, including with different GPUs and monitors (even ones with a g-sync module, to compare against g-sync compatible).

Nvidia and/or Microsoft break it from time to time, unfortunately. Vulkan and DX12 titles should remain unaffected, but for anything else you're at the mercy of DWM quirks. It sucks. I reported it to nvidia the other day, since there is one old game I'd like to play in borderless mode, but it's going to take time.

I don't know if AMD suffers from the same thing, but very probably. It took them a long time to support windowed VRR in the first place, and Microsoft has a tendency to change the way DWM works in major Win 10 updates, which is what creates this mess in the first place.
 
Subnautica works g-sync'd non-fullscreen for me.
I did some googling on The Evil Within, and it's a common issue - stutters with g-sync. It's because of how the engine limits frames to 60 fps. There is a way to uncap it with a console command in the game (which would hopefully have eliminated the judders), but now I can't get g-sync to engage in the darn game at all! It worked (brokenly) before... now I can't get it to activate at all, lol. What a mess.
 