More DLSS...

but why? I mean yes 120fps is better than 60fps, but half the reason is the responsiveness offered by 120fps.
You may only get a bit better latency, but you'll have more frames to see where someone is and to reduce motion blur. That's a huge thing and part of why higher framerate is desired normally.
 
You may only get a bit better latency, but you'll have more frames to see where someone is and to reduce motion blur. That's a huge thing and part of why higher framerate is desired normally.
Sorta.... The latency will be higher than DLSS without Frame Generation, and because it artifacts, those moments of clarity are marred by errors. Errors you might flick to and aim at, versus reacting to actual threats.

But I think those are the wrong type of games for Frame Generation. In those games you want a really clean and high FPS. You might even balk at DLSS because of temporal artifacts, never mind frame generation artifacts. I think it could be amazing for VR, and I think it's awesome sauce for SP games. Or really any game where the extra smoothness just enhances the immersion and the artifacts aren't distracting enough to matter or cause the player to make mistakes. Like Cyberpunk 2077, Atomic Heart, HITMAN 3, Dying Light 2, F1 22, Dakar Desert Rally, etc.
 
God I wish I could tell the difference with 17 ms of extra latency. You could tack on 100 ms and I doubt I'd notice. I'd take extra free fps any day. My eyes can still see stuttering…

Ever do the response time test on your phone? It’s almost depressing.
 
DLSS 3 is probably helpful in scenarios where it can boost 60 fps to 120 fps
The main problem is that it creates "fake" frames. The lower the native FPS, the shittier the experience will be with DLSS 3. 120 FPS with 30 FPS latency doesn't sound great.
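A rough sketch of why "120 FPS with 30 FPS latency" is plausible (illustrative numbers only; Reflex, render queues and actual pipelines will shift this):

Code:
def frame_time_ms(fps):
    return 1000.0 / fps

base_fps = 60                                # what the GPU actually renders
displayed_fps = base_fps * 2                 # one generated frame per real frame -> 120
render_ms = frame_time_ms(base_fps)          # ~16.7 ms per real frame
buffer_ms = render_ms                        # interpolation holds the next real frame
approx_input_lag_ms = render_ms + buffer_ms  # ~33 ms from input to display

print(f"{displayed_fps} fps shown, ~{approx_input_lag_ms:.0f} ms lag "
      f"(comparable to {1000 / approx_input_lag_ms:.0f} fps native)")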
 
One rendered frame, one generated (fake) frame. I'm interested in high-frequency textures like sand, rocks and so on. Will one frame be clean and rendered, the next fake frame a blurry texture mess, and then another properly rendered frame? A strobe effect in motion?

The other concern is transparency textures, where hidden objects become exposed from one frame to the next.

Increased FPS is welcome, but cheating with poor quality output would be unacceptable. We'll just have to see how good or bad this is.

What is the real performance increase between a 3090 and a 4090 at native resolution?
 
What is the real performance increase between a 3090 and a 4090 at native resolution?
I think we have very little idea yet; the only native rendering comparison I saw was:
https://wccftech.com/nvidia-geforce...k-2077-dlss-3-cuts-gpu-wattage-by-25-percent/

And it was versus a 3090 Ti, so against a 3090 it would be a little more, but it showed a 59.9 fps average instead of 35.8, or 67% faster.

There is also a rumored leaked CUDA benchmark, which was a good predictor of raw rendering power on Ampere (we will have to see if that holds with the architecture change):
https://videocardz.com/newz/geforce-rtx-4090-is-60-faster-than-rtx-3090-ti-in-geekbench-cuda-test
It shows a 60% boost in CUDA core performance over a 3090 Ti and 75% over a 3090; the better result in Cyberpunk is possibly explained by it being run with ray tracing on, and the ray tracing performance increase seems to be massive.
I.e., if they say as much as twice the performance, I would imagine 40-60% for the average case, 66% for the RT-heavy ones, and near 92% for a couple.
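For reference, the arithmetic behind those percentages (using only the numbers quoted above):

Code:
# Native Cyberpunk 2077 averages from the wccftech piece
fps_4090, fps_3090ti = 59.9, 35.8
print(f"4090 over 3090 Ti: {fps_4090 / fps_3090ti - 1:.0%}")  # ~67%

# Geekbench CUDA: +60% over a 3090 Ti and +75% over a 3090 together
# imply the 3090 Ti is only ~9% ahead of the 3090 in that test.
print(f"Implied 3090 Ti over 3090: {1.75 / 1.60 - 1:.0%}")    # ~9%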
 
I think we have very little idea yet; the only native rendering comparison I saw was: …
Thanks for the links. Those are substantial improvements, albeit with more power and way more transistors. Plus some extra features such as DLSS 3.0 in titles that support it.

Makes me see why Nvidia leaned so heavily on the DLSS 3-only numbers, which really skews users toward improvements that may not be present in most games. AMD RDNA 3 may have an actually larger improvement; we'll just have to wait and see.
 
Those are substantial improvements, albeit with more power
Very similar power, actually (for the 67% faster native gain): it was against an MSI RTX 3090 Ti SUPRIM X running at 454.2 W vs 461.3 W for the 4090, which at that point is virtually the same.

But the 3090 Ti was already the higher-power scenario.
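Quick perf-per-watt arithmetic from those same numbers (my own back-of-the-envelope, not a measured efficiency figure):

Code:
# fps and board power from the wccftech native comparison quoted above
fps_4090, watts_4090 = 59.9, 461.3
fps_3090ti, watts_3090ti = 35.8, 454.2

efficiency_gain = (fps_4090 / watts_4090) / (fps_3090ti / watts_3090ti) - 1
print(f"~{efficiency_gain:.0%} more fps per watt")  # roughly +65%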

Makes me see why Nvidia leaned so heavily on the DLSS 3-only numbers, which really skews users toward improvements that may not be present in most games
I think it was needed more for the 4080 class (these are not native numbers but some unclear DLSS performance mode, plus frame generation when available):

[Nvidia's relative performance slide]

The 4080 12 GB getting beat by a 3090 Ti would not be an issue at all if:
1) it were a much cheaper $750 card
2) the 3090/3090 Ti were still extremely expensive cards and offers like this didn't exist: https://sellout.woot.com/offers/zot...-3090-trinity-oc-open-box-1?ref=mwj_cd_deal_4

The 4090 seems like your perfectly fine and normal gen-on-gen increase: in the inflation context, a reasonable $100 over the non-Ti 3090's $1,499 MSRP, while being compared to the $2,000 MSRP of the not-so-long-ago 3090 Ti.

To sell the 4080s at those price points, looking at the right part of the chart (Warhammer to the right), you really need that DLSS 3 to deliver, and you need that RT Overdrive path tracing to provide a game that makes it worth it.
 
Without a doubt, the 4080 12gb presents poor value from a price and specifications angle and Nvidia could definitely have done better. On the other hand, having performance that seems to trade blows with a 3090 Ti with specs like a 192bit memory bus and a smaller AD104 chip with 295W TBP at least shows there is still generational progress being made. I guess, if Nvidia and AIBs didn't have to get rid of excess 30-series inventory, the 4080 12GB would have been the 4070 it was meant to be and priced accordingly for an x70 class GPU (or maybe a bit higher as it has been trending over the years).
 
Optimum Tech on YouTube said in his review that the added latency of DLSS 3.0 was something he could feel using a mouse and keyboard while testing Cyberpunk. He is a big competitive FPS guy, so maybe more sensitive than others.
 
Optimum Tech on YouTube said in his review that the added latency of DLSS 3.0 was something he could feel using a mouse and keyboard while testing Cyberpunk. He is a big competitive FPS guy, so maybe more sensitive than others.

Sounds like a good way to get clicks on his video. Doesn't mean it's true. I'm sure that he could "feel" it, like the people who "hear" the difference when using expensive speaker cables.
 
Sounds like a good way to get clicks on his video. Doesn't mean it's true. I'm sure that he could "feel" it, like the people who "hear" the difference when using expensive speaker cables.
It was a small section where he talked about it, and I think he has maybe one graph there. There's still a lot of speculation on the tech, and it's new, so seeing how it plays out will be interesting.
 
The last DF video on DLSS 3 is great. The tech is cool and some people will like it as it is I am sure, but it's far from perfect and not suitable for everything and everyone. Sensitivity to the issues it can cause will vary from person to person and game to game.

It's a huge milestone and frame amplification is the future and has to start somewhere, but I can't see myself using it in its current iteration, plus not many games I play will even support it.

Thankfully this new gen has fantastic raster performance and efficiency too - but I think I'll wait for next gen anyway (or at least for more reasonable prices and TDPs).
 
About to watch the new DF video on DLSS 3. Let's see how big of a mistake I just made with my $1600...
 
Fake Frames or Big Gains? - Nvidia DLSS 3 Analyzed


Ha! I read TechSpot daily and I came to this thread just to post the above. It's a fantastic read, a deep dive into DLSS 3.0. For those of us who like to read instead of watching YouTube, here is the article, and the summary:

https://www.techspot.com/article/2546-dlss-3/

In short, DLSS 3 will be of benefit to you provided you meet a few criteria:

  1. Before enabling frame generation, you need to already be rendering the game at a decent frame rate, in the 100-120 FPS range.
  2. You'll need to have a high refresh rate display, ideally 240 Hz or higher.
  3. You'll need to be playing a game that is slower paced and not latency sensitive, like a single-player title.
If all of these factors are combined and you have the right set of conditions, DLSS 3 can improve the visual smoothness of gameplay, and while the quality of generated frames is typically less than that of traditionally rendered frames, the way they are interlaced into the output makes artifacts hard to notice.

DLSS 3 was a real joy to use in Microsoft Flight Simulator most of the time and we suspect you'll find the same in other games that are not latency sensitive or are CPU limited. As DLSS 3 simply generates its own frames without game engine input, it can avoid the CPU bottleneck and deliver a smoother experience.

However, there are limitations to DLSS 3. The biggest is that it hurts latency when enabled. This means that while frame rates go up, the game doesn't feel more responsive, and in a worst case scenario, will feel sluggish because of DLSS 3. A game run with DLSS 3 at 60 FPS feels like a 30 FPS game in its responsiveness, which is not great and makes the technology unsuitable for transforming slower GPUs into 60 FPS performers.

Because DLSS 3 provides no latency benefit, there's no reason to run it above your monitor's maximum refresh rate. This is why we strongly recommend a 240Hz display or greater with adaptive sync support. There's also no way to cap DLSS 3 inside the maximum refresh rate.

All of this brings us to the major question: is DLSS 3 a key selling point for RTX 40 series GPUs?

Because its use cases are more constrained, we don't think it's as much of a selling point as DLSS 2.0. On a powerful GPU like the RTX 4090 there are many cases where you don't need DLSS 3 at all because frame rates are already so high, and we have question marks over how useful it will be on entry-level products, like an "RTX 4050" if that ever comes to market.

The sweet spot where buyers will benefit the most will be around RTX 4070 levels of performance, where good frame rates are achievable and DLSS 3 can provide a boost into the 200 FPS range. But we don't think it's a must-have feature, nor would we pay a premium to access DLSS 3.

The main benefit of upgrading to Ada Lovelace is its higher performance, especially for ray tracing. DLSS 3 is a nice bonus and adds to the strong feature set of Nvidia GPUs that include DLSS 2, NVENC and Reflex – together those things are hard to beat.

To make DLSS 3 a killer feature Nvidia needs to work hard on several areas. Reducing visual artifacts in generated frames will help the experience at more modest frame rates, especially if the algorithm can be improved to handle UI elements better. Adding support for Vsync and frame limits is also essential. However, we're uncertain if the main issue of latency can ever be solved given DLSS 3 fundamentally needs access to a future frame to slot in its generated frame. If Nvidia can somehow fix latency that would be amazing and elevate DLSS 3 to killer feature status, but we wouldn't hold our breath for that to happen.
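Condensing the article's three criteria into a quick sanity check (the thresholds are TechSpot's; the little helper function is just my own paraphrase, not anything from Nvidia or TechSpot):

Code:
def dlss3_worth_enabling(base_fps, refresh_hz, latency_sensitive):
    """True when the article's three conditions for a good DLSS 3 experience hold."""
    return (
        base_fps >= 100            # already ~100-120 fps before frame generation
        and refresh_hz >= 240      # high refresh display to actually show the extra frames
        and not latency_sensitive  # slower-paced / single-player titles only
    )

print(dlss3_worth_enabling(110, 240, False))  # True  -> e.g. a Flight Simulator setup
print(dlss3_worth_enabling(45, 144, False))   # False -> base frame rate too low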
 
So, DLSS 3 is practically useless, as many have speculated. Either you have a monster of a GPU where you're already OK at native, or you're better off with DLSS 2. More frequent image artifacts and having a 60 FPS game feel like 30 FPS due to the latency increase are a hard pill to swallow.
 
So, DLSS 3 is practically useless, as many have speculated. Either you have a monster of a GPU where you're already OK at native, or you're better off with DLSS 2. More frequent image artifacts and having a 60 FPS game feel like 30 FPS due to the latency increase are a hard pill to swallow.

but not every GPU is going to be a 4090...DLSS 3 might be more viable in the 4060 or other lower end cards
 
but not every GPU is going to be a 4090...DLSS 3 might be more viable in the 4060 or other lower end cards
False -- according to the article above

"Because [DLSS 3.0] use cases are more constrained, we don't think it's as much of a selling point as DLSS 2.0. On a powerful GPU like the RTX 4090 there are many cases where you don't need DLSS 3 at all because frame rates are already so high, and we have question marks over how useful it will be on entry-level products, like an "RTX 4050" if that ever comes to market.

The sweet spot where buyers will benefit the most will be around RTX 4070 levels of performance, where good frame rates are achievable and DLSS 3 can provide a boost into the 200 FPS range. But we don't think it's a must-have feature, nor would we pay a premium to access DLSS 3."
 
but not every GPU is going to be a 4090...DLSS 3 might be more viable in the 4060 or other lower end cards
No, that's the thing. Only higher end cards that already get >120 FPS have the performance necessary to offset the increase in latency, and then only if the game isn't one of the more artifacting ones, and if you have a >200 Hz monitor.
 
So far, DLSS 3.0 is like tax cuts for the rich. They don't need them, and they aren't going to help you stop being poor in the first place.
 
the Hardware Unboxed guys were also notoriously anti ray tracing for a long time...they didn't really cover it or do a lot of videos on it
They were not anti-ray tracing; they were against using it as a meaningful consideration during the first RTX gen, which had unusable RT performance. When the performance became usable, they covered it appropriately.

When DLSS1 was shit, they thought it was shit. When DLSS2 was good, they thought it was good.

How is any of that a bad look?
 
I would think OLED monitors, where the pixels can change much faster frame to frame than on LCD monitors, could be rather harsh with one cleaner rendered frame and one crap frame. So far DLSS 3 does not look like something I would use: all the artifacts, particularly edges, crawling shimmering textures, and UI corruption, would be somewhat distracting, while latency and sluggishness would depend on the game. One use case I do see would be Microsoft Flight Simulator in VR. I also wonder how many people would be adversely affected, as in fatigue, headaches, or nausea, by the harsher changes from one frame to the next, where subconsciously the brain is being hammered while also interpreting the motion.
 
the Hardware Unboxed guys were also notoriously anti ray tracing for a long time...they didn't really cover it or do a lot of videos on it
I would say that for a long time RT was virtually useless in games and degraded the gaming experience, so it was the right call.
 
So, DLSS 3 is practically useless, as many have speculated. Either you have a monster of a GPU where you're already OK at native, or you're better off with DLSS 2. More frequent image artifacts and having a 60 FPS game feel like 30 FPS due to the latency increase are a hard pill to swallow.
No, that's the thing. Only higher end cards that already get >120 FPS have the performance necessary to offset the increase in latency, and then only if the game isn't one of the more artifacting ones, and if you have a >200 Hz monitor.
I feel this is a bit of a limited view; one will be able to change the game settings to reach a high FPS (120 fps with DLSS on) with a 4060-4070 video card to max out their 1440p 240 Hz monitor. Don't forget that high-refresh-rate monitor people tend to be 1080p-1440p people who don't mind not playing at the very highest settings on the latest AAA games, and who play games that go fast.

The GPU alone does not determine your FPS; the game you play, at what settings, together with the GPU does.

When DLSS1 was shit, they thought it was shit. When DLSS2 was good, they thought it was good.
Sure, some could build up bias over time against Nvidia's tactics and build a pro-AMD mindshare audience, but even Moore's Law Is Dead recommended the 12100 through 12600K before the Zen 4 launch, got a 3090 for his work and will try to buy a 4090, and Hardware Unboxed does show the merits of DLSS now that it works well.

No doubt they will test ray tracing if it ever becomes something they consider a must-look for buyers, and their coverage of DLSS 3 did seem perfectly fine, showing when it works well and when it doesn't work so well.


DLSS 3 seems to me a work-in-progress affair that could get good quickly, given just how ridiculously fast AI progress is now and how amazingly powerful Nvidia's in-house machine learning compute must be; it could work well by next spring.

From what we saw, there is no way they can justify the pricing of the 4080 12 GB on the back of that feature, and we'd still need to see how well it would work on a 3090 Ti.
 
I would say that for a long time RT was virtually useless in games and degraded the gaming experience, so it was the right call.
This.

To me, RT is pretty similar to AA/AF when they were new. Remember those 800x600 and 1024x768 benchmarks with 2xAA, 4xAA, and 8xAA? Like RT, it would bring graphics cards to their knees. And while everyone thought it looked great, it wasn't worth the performance cost. Now AA/AF are basically free.

Let me know in a few generations when native RT is free with no shenanigans
 
I feel this is a bit of a limited view; one will be able to change the game settings to reach a high FPS (120 fps with DLSS on) with a 4060-4070 video card to max out their 1440p 240 Hz monitor. Don't forget that high-refresh-rate monitor people tend to be 1080p-1440p people who don't mind not playing at the very highest settings on the latest AAA games, and who play games that go fast.
They also tend to be people who would benefit more from DLSS2's increased framerate with lower latency, though, i.e. FPS players.
 
They also tend to be people who would benefit more from DLSS2's increased framerate with lower latency, though, i.e. FPS players.
Yes, the latency going in the other direction makes the target market way less obvious than it would be otherwise.

That said, unless it was some paid promo lying, there were people who had no issue with 30 fps-type latency playing at 8K using DLSS 3 to hit 60 fps, so many games like Flight Sim could work well. Latency tolerance is quite different from person to person; many people were playing with Bluetooth controllers on televisions, with horrible latency, on 25-30 fps consoles, i.e. people were playing with over 100 ms, even reaching 150 ms, and it was almost, if not outright, the most popular way to game. But do those people care about higher frame rates?

Like I said, it seems good enough to be good one day but not at the moment. It would require, imo, a DLSS 2-type breakthrough that could generate a frame from the previous three without needing the next one, for virtually zero latency impact (and removing the quirks, i.e. detecting when the change is too large to extrapolate, UI stuff, etc.). Or it will just make sense once 240 fps monitors are more the norm and 500-1000 fps is the target for the high-frame-rate people: you would target 130-140 fps without frame generation and get a smooth 240 fps display with a low, 120 fps-like latency.
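A quick sketch of the latency difference between today's interpolation and the extrapolation idea above (timings purely illustrative):

Code:
RENDER_MS = 1000 / 120  # real frames every ~8.3 ms at a 120 fps base

# Interpolation (DLSS 3 today): the generated frame sits between real frames
# N and N+1, so N can only be shown once N+1 exists -> ~one real frame of extra lag.
interpolation_extra_lag_ms = RENDER_MS

# Extrapolation (the hypothetical above): predict from frames N-2, N-1 and N only,
# so nothing waits on a future frame -> ~no added lag, at the cost of wrong guesses
# when motion changes abruptly (hence needing to detect "too large to extrapolate").
extrapolation_extra_lag_ms = 0.0

print(f"interpolation adds ~{interpolation_extra_lag_ms:.1f} ms, "
      f"extrapolation adds ~{extrapolation_extra_lag_ms:.1f} ms")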

But the main point was that the idea that gaming at 120 fps at 1080p or 1440p would exclude a 4060-4070 seemed extremely limited to me. I think what happened to people is the following:
1) If you buy a 4090, it only makes sense to play the hardest games to run at 4K with RT on, or 400 fps competitive titles.
2) In those two scenarios, DLSS 3 on a 4060 or 4070 makes no sense.

Yes, but that excludes what people buying a 4060-4070 could do and tend to do, which is not playing the latest AAA games at 4K with everything at ultra. The mental shift of going down a class but wanting 120-144 fps while not minding 60 fps latency, or 240 fps while not minding 120, did not happen.
 
Let me guess, 3 years from now DLSS 3.1 or 3.2 will be hailed as amazing tech?

"DLSS is Dead" - HUB August 2019


If Nvidia improves DLSS 3 then, yes, maybe it will be amazing tech in the future.

DLSS 1 was garbage; Nvidia worked on improving it and making it better. The improvements resulted in DLSS 2.0 coming out and being great tech that's worth using. I'm not sure exactly what point you're trying to make here. Is reevaluating tech after massive improvements have been made suddenly a bad thing? Should a reviewer only have one opinion for all time and never update it when things change?
 
Yes, the latency going in the other direction makes the target market way less obvious than it would be otherwise.

That said, unless it was some paid promo lying, there were people who had no issue with 30 fps-type latency playing at 8K using DLSS 3 to hit 60 fps, so many games like Flight Sim could work well. Latency tolerance is quite different from person to person; many people were playing with Bluetooth controllers on televisions, with horrible latency, on 25-30 fps consoles, i.e. people were playing with over 100 ms, even reaching 150 ms, and it was almost, if not outright, the most popular way to game. But do those people care about higher frame rates?
Hitting 60 FPS with DLSS 3 would mean 15-20 FPS-level latency, though. Not sure if even the "don't care about latency" people would find that enjoyable.
 
They were not anti-ray tracing; they were against using it as a meaningful consideration during the first RTX gen, which had unusable RT performance. When the performance became usable, they covered it appropriately.

When DLSS1 was shit, they thought it was shit. When DLSS2 was good, they thought it was good.

How is any of that a bad look?

I never said it was a bad look...Hardware Unboxed is one of my favorite tech channels...that being said they have done videos talking about their lack of RT coverage in the early going saying that they don't think it offers anything amazing from a visual standpoint...but being that they're a tech channel they should have done more videos on it regardless of their personal preference for the tech...you don't just make videos on tech you like
 
I never said it was a bad look...Hardware Unboxed is one of my favorite tech channels...that being said they have done videos talking about their lack of RT coverage in the early going saying that they don't think it offers anything amazing from a visual standpoint...but being that they're a tech channel they should have done more videos on it regardless of their personal preference for the tech...you don't just make videos on tech you like
They would only be Nvidia's influencers then. They saw the technology, saw the performance, saw the available games, and concluded that, at the time, it wasn't important enough to cover further. And it wasn't.
 
They would only be Nvidia's influencers then. They saw the technology, saw the performance, saw the available games, and concluded that, at the time, it wasn't important enough to cover further. And it wasn't.

there were a lot of people playing the early RT games...it was worth it to cover it...lots of other sites did
 
there were a lot of people playing the early RT games...it was worth it to cover it...lots of other sites did
I'd agree. When I got my 2080Ti, I turned RT on in every game I had that supported it. The reason I PC game is for the graphics. Even now with my 3090, I turn RT on in everything, including MP games.
 