RTX 4090 and DLSS: Turn ON or leave OFF?

Cannibal Corpse

Hello all,
Should I enable DLSS (on any given game) with my newly purchased RTX 4090? My system specs are the following:

Ryzen 7800X3D
GIGABYTE B650 Aorus Elite AX
CORSAIR VENGEANCE RGB DDR5 RAM 32GB (2x16GB) 6000MHz CL36
Crucial T700 2TB Gen5 NVMe M.2 SSD
RTX 4090 | CREATIVE Sound Blaster ZxR Sound Card
SEASONIC 850W Prime Titanium PSU |
Sony BRAVIA 55" OLED Display (XR-55A80K) 4K @ 120Hz

Should I enable DLSS?
 
It's oversimplifying, but it usually comes down to visual quality vs. performance. Turning DLSS on usually costs some visual quality, but it increases your frames per second. Most games offer varying levels of DLSS, so you can select Quality/Balanced/Performance/Ultra Performance depending on your personal preferences. If you're looking to get a full 120fps on your TV, you'll probably need to enable it for most newer games, at least at the "Quality" setting, which should look the best. If you're happy with 60fps, you probably won't need it.
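To put rough numbers on those presets: each one renders internally at a lower resolution and upscales to your output resolution. Here's a quick Python sketch of what that means at 4K; the per-axis scale factors are the commonly cited defaults and can vary by game and DLSS version, so treat the figures as approximate.

[CODE]
# Approximate internal render resolution for common DLSS presets at 4K output.
# Scale factors are the commonly cited per-axis defaults; games can override them.
OUTPUT_W, OUTPUT_H = 3840, 2160

PRESET_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

for preset, scale in PRESET_SCALE.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    pixel_ratio = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{preset:>17}: ~{w}x{h} ({pixel_ratio:.0%} of native pixels shaded)")
[/CODE]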
 
Sometimes DLSS masks artifacts that are naturally there at native resolution; in other words, for some things it can make the image appear "better".

Regardless, as we're now seeing, the "future" (as Nvidia has stated) now assumes upscaling. It doesn't matter what you or I think on the matter.
 
I always prioritize framerate: 4K where applicable, DLSS if I'm going to be averaging less than 90 fps. Ideally I want to be in the VRR range, fluctuating between 110-144 FPS in all my games, and I'm willing to sacrifice visuals to get there. I begin a game at native res with Ultra/maxed-out settings, then move to DLSS Quality or Balanced, then turn down individual graphics settings until I hit that sweet spot.
 
Depending on the game, DLSS @ Quality can lead to a slight increase in visual quality along with a small performance bump. It really depends on which version of DLSS is being used and how it’s implemented in a game, though. In general, if you like the framerate with DLSS off, then leave it off.
 
I prefer the sharpening effect of DLSS, so I usually turn it on when it's available rather than running at native resolution. However, if DLAA is available as an anti-aliasing option at native resolution, I'd take that over DLSS, since it doesn't tend to have DLSS's added graphical artifacts.
 
As others have said, it depends on the game. With newer AAA titles, you will either have to turn the game's visual settings down significantly or use DLSS to get playable frame rates at 4K. Cyberpunk 2077 and Starfield are two very good examples of that. There isn't a gaming machine on the planet that can get 60+ FPS at max settings in Cyberpunk 2077 with ray tracing and not use DLSS. While Starfield doesn't expressly support DLSS at present, it does still use FSR, which is a similar technology, and it will be getting DLSS soon. Not to mention modders have already added DLSS support to Starfield.

You can max the quality settings out in any given game and if it supports DLSS and is still a slide show, you turn DLSS on. It's that simple.
 
I'm playing Lords of the Fallen (2023), and it has a resolution scale option (default set to 50%).

Should I leave it at 50% (I get 112 FPS), or crank it all the way to 100% (dropping my FPS to 60ish)?

Why is this option even there? I don't understand it. Kindly explain.
 
Resolution scaling at 50% means the game isn't rendering at 4K; it's rendering at half that and then upscaling the image to 4K for a performance increase. This, however, does not look as good as simply running the game at 100% scaling, which would be native resolution.
 
Essentially, that 50% scale means the game is rendering at half of your in-game resolution. If you're playing at 4K, the game is rendering at half that and using DLSS to "disguise" it.
If you crank it to 100% it should look better (it's a higher resolution), but at the cost of dropping to 60ish fps. Your decision should be based on whether you care more about the extra frames per second or about the visuals looking better. Personally, I'd take the frames every time, but some people feel the exact opposite. It's a preference thing, and it also depends on how much better you think 100% looks vs. 50%. You might barely notice a difference, or you might think it looks night-and-day worse.
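To make the arithmetic concrete, here's a small Python sketch, assuming the slider scales each axis (the common convention; a few games scale total pixel count instead):

[CODE]
# What a 50% resolution scale means at 4K, assuming a per-axis scale.
native_w, native_h = 3840, 2160
scale = 0.50

render_w, render_h = int(native_w * scale), int(native_h * scale)
pixel_fraction = (render_w * render_h) / (native_w * native_h)

print(f"Internal render resolution: {render_w}x{render_h}")      # 1920x1080
print(f"Fraction of native pixels shaded: {pixel_fraction:.0%}")  # 25%
[/CODE]

Shading only a quarter of the pixels is why dropping from 100% to 50% scale roughly doubles the frame rate here (60ish fps up to 112 fps), though the speedup is never perfectly proportional to pixel count.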
 
To be honest, at 50% with DLSS off, I didn't notice any difference when I cranked it all the way to 100%.
 
DLSS Quality is often better than native + TAA. If the temporal anti-aliasing is particularly bad in a given game, then DLSS is preferable. It depends on the game, though. I always run resolution scale at 100%, regardless.
 
I always use DLSS (Quality) unless the implementation is broken.

Or, when I'm really cold like the last few days, I'll run native + DLAA if I have headroom in that game.
 
OK, I just did DLSS Quality and 100% scaling. My FPS jumped from the low 60s to 110!
Is this a good combination?
 
No, it is terrible... lol.

Of course it's dope AF. You can play around with DLSS Quality / DLAA (when offered) and frame gen (when offered) to find the fps/visual-quality combo you like best. For me it's DLSS Quality plus frame gen, as I like high fps more than the minor quality delta that DLAA brings.
 
Always try to play the game at native resolution without any upscaling first... if you're not getting the frames you want, then turn on DLSS.
 
I always use DLSS (Quality) unless the implementation is broken.

Or, when I'm really cold like the last few days, I'll run native + DLAA if I have headroom in that game.
Good point I missed. DLAA is preferable over DLSS if it's available.
 
Also keep in mind that although turning on DLSS increases your frame rate numbers, it also increases your latency. So it's still a trade-off. I'd say if your frames are already playable, leave it off.
 
No, it reduces latency by improving performance. Unless you mean frame gen, which indeed does not improve latency, but it doesn't really make it noticeably worse either. It's more of a neutral effect, and since it forces devs to add Reflex, it's going to be a net latency improvement for most people (as happened in CP77: there was no Reflex option before they added frame gen, so people played with WAY worse latency and were fine with it).
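As a back-of-the-envelope illustration of why upscaling helps latency while frame generation doesn't, here's a simplified Python sketch. The frame rates are hypothetical examples, and this is a toy model of frame intervals only, not how Reflex or frame gen actually measure latency:

[CODE]
# Toy model: frame intervals only. Upscaling raises the *rendered* frame rate,
# so the render interval (and latency) drops. Frame generation inserts
# interpolated frames, so displayed fps rises, but rendered frames still arrive
# at the same interval and roughly one frame is held back for interpolation.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

native_fps = 60                          # hypothetical native frame rate
upscaled_fps = 90                        # hypothetical frame rate with DLSS upscaling
framegen_display_fps = upscaled_fps * 2  # frame gen doubling displayed frames

print(f"Native render interval:     {frame_time_ms(native_fps):.1f} ms")
print(f"Upscaled render interval:   {frame_time_ms(upscaled_fps):.1f} ms (lower latency)")
print(f"Frame-gen display interval: {frame_time_ms(framegen_display_fps):.1f} ms shown, "
      f"but frames are still rendered every {frame_time_ms(upscaled_fps):.1f} ms, "
      f"plus ~1 held-back frame for interpolation")
[/CODE]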
 
DLSS Quality is the bee's knees. At least in the titles I have, I never see artifacting. At the same time, I prefer its AA to other forms, and it gives a nice performance boost as well.
I can spot some issues with DLSS Balanced and below, so I tend not to use those.
 
If you look hard enough, you can typically see it show up in spots. It certainly isn't often obvious, but I've had a few moments in CP2077, for example, where it was.
 
It depends on the game with frame gen. The latency is pretty bad in Remnant II with frame gen turned on.
 
Yeah, it's highly game-dependent; in some games it introduces more artifacting and is more distracting, and in others it absolutely improves image quality (and of course performance) overall. In my experience it's generally the latter, so I usually enable it if I can't hit my monitor's native resolution and refresh rate without it. But even then I'll sometimes enable it anyway, because it actually handles aliasing better than other AA methods, and otherwise I see no artifacts or IQ loss with it.

So generally I'll toggle it on and off in a few scenes and then assess which way the game looks and plays better for me. It's a nice option to have regardless, despite many insisting it's a crutch and a detriment to games overall.
 
I'd say you can't deny it's a crutch (especially early on) for RT stuff. No one wants to play at 4K sub-30fps when they've spent $1k+ on a GPU.
 
Well yeah, and DLSS 1.x was absolute garbage too in that it was always a considerable IQ downgrade.

It should be obvious I'm only speaking to current DLSS 2+ implementations. Though I don't have any experience with the frame gen stuff yet.
 
It's not a binary yes/no answer. DLSS is another tool in the toolbox to help hit image quality and performance targets.
If you're capable of running at your native resolution while hitting all the IQ targets you want (you could be satisfied with all-low settings! it doesn't just mean cranking everything up), while also hitting your performance targets (whatever average fps you want), then you don't need to use DLSS.

However, if you want to run a specific image quality setting and you're not getting the performance you want, then DLSS can be used as a tool to help hit a performance target before having to compromise by turning down some other setting, most notably resolution. Resolution is still probably the most expensive image quality setting. So instead of dropping below native resolution and staying there, DLSS brings back some of that IQ through image upscaling techniques.

I think most would prefer to turn DLSS on instead of turning other settings down. In that regard it's a very useful tool for hitting your image quality and performance targets through a reasonable compromise. But you certainly don't need to turn it on if you've maxed the game's settings out and you're already hitting a frame rate higher than your monitor can display.
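That framework boils down to a short decision flow. The sketch below is only illustrative; the target fps and the order of fallbacks are example choices, not rules from this thread:

[CODE]
# Illustrative decision flow: use DLSS before compromising on other settings.
# The numbers passed in are hypothetical measurements for a given game.
def choose_settings(native_fps: float, dlss_quality_fps: float, target_fps: float) -> str:
    if native_fps >= target_fps:
        return "Run native resolution; DLSS not needed."
    if dlss_quality_fps >= target_fps:
        return "Enable DLSS Quality before touching other settings."
    # Still short of the target: lower individual settings or step down
    # the DLSS preset (Balanced/Performance) until the target is met.
    return "Lower individual settings or use a more aggressive DLSS preset."

print(choose_settings(native_fps=70, dlss_quality_fps=95, target_fps=90))
[/CODE]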
 
It's not a binary yes/no answer.
Yes -- there is a large factor of personal preference involved too.

The large-ratio frame rate increases (made possible by DLSS 3+) are an alternative motion-blur-reduction substitute for flickery BFI/strobe/ULMB/DyAc, at least for casual gaming.

If you use an OLED display, turning DLSS on is becoming increasingly tempting compared to LCD.

This is because 120fps vs. 240fps has a true 2x blur-reduction benefit on OLED, while LCD GtG pixel response throttles the framerate-based motion blur reduction.

The motion-quality improvement from the 3x+ frame rates that DLSS 3 affords significantly outweighs the DLSS drawbacks if you're playing a 4000-series card on an OLED display, especially in casual games where DLSS-derived latencies may not be a problem.

All that nice GPU detail is lost during fast motion to display motion blur (the extra blur caused by low frame rates, even at "only" 100fps) if you're always moving around fast in a casual game (especially a crosshair-less game, or if you're darting your eyes all over the screen while in motion).

You can use a motion-blur-reduction strobe backlight or BFI, which is another fantastic option, but some people want to reduce motion blur flickerlessly (via more frame rate), without the color loss, without the flicker, and without the brightness loss. Framerate-based motion blur reduction (ergonomic, non-flicker-based) is the latest Blur Busting feat nowadays, for people who hate display motion blur.

240fps has 1/3rd the motion blur of 80fps, assuming GtG = 0 and eyes tracking a panning/scrolling/turning/moving object. That kind of blur differential is quite noticeable on fast-GtG displays like any of the new 240Hz OLEDs. In this case, for you specifically, turning DLSS on can actually be a (net) detail increaser rather than a detail reducer.
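For the sake of the arithmetic behind that 1/3rd figure: on a sample-and-hold display with near-zero GtG and your eyes tracking the motion, perceived blur is roughly the distance the image travels during one frame. A tiny Python sketch, with the 1920 px/s pan speed picked purely as an example:

[CODE]
# Persistence-blur arithmetic for a sample-and-hold display, assuming GtG ~= 0
# and eye-tracked motion: blur is roughly the distance moved per frame.
pan_speed_px_per_s = 1920  # example panning speed

for fps in (80, 120, 240):
    blur_px = pan_speed_px_per_s / fps  # pixels of smear per displayed frame
    print(f"{fps:>3} fps: ~{blur_px:.0f} px of motion blur")

# 80 fps -> 24 px vs. 240 fps -> 8 px: one third of the blur, as stated above.
[/CODE]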
 