Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay


1_rick

https://www.tomshardware.com/news/n...olutio-gaming-thing-of-past-dlss-here-to-stay

During their discussion with Digital Foundry's Alex Battaglia and PCMR's Pedro Valadas, Bryan Catanzaro — Nvidia's VP of Applied Deep Learning Research — stated that native resolution gaming is no longer the best solution for maximum graphical fidelity. Catanzaro went on to say that the gaming graphics industry is headed toward more significant reliance on AI image reconstruction and AI-based graphics rendering in the future.
Catanzaro's statement was in response to a question from Valadas regarding DLSS and whether Nvidia planned to prioritize native resolution performance in its GPUs. Catanzaro pointed out that improving graphics fidelity through sheer brute force is no longer an ideal solution, due to the fact that "Moore's Law is dead."

In other words, they've given up on improving actual performance.
 
DLSS will always be a compromise.

Not going to lie, Nvidia's DLAA filter is pretty awesome, and I would like to see more games offer it at native resolution, as it is by far the best AA I have used in terms of balancing performance, effectiveness at removing aliasing, and sharpness.

Scaling the resolution up - however - is always a compromise. I don't for a second buy their claim that DLSS looks better than native, because - well - I have eyes. I have seen what it looks like in game.

This can go one of two ways though:

A) Developers can go the way of Bethesda and make games like Starfield that look mediocre but still demand outrageous system resources, and just slap scaling on as a band-aid,

--OR--

B) They could actually make games with higher polygon counts and more RT that really look quite awesome, but where the hardware to run them natively doesn't exist yet at launch. It would be sort of as if, when Crysis launched, you could run that outrageously heavy game with scaling on lesser hardware. It wouldn't be as good as native, but you would understand why, because the game really looks great, and you would be willing to put up with it.


Unfortunately, doing a shit job and using scaling as a band-aid seems more up most developers' alley, so that is probably what we'll get.
 
https://www.tomshardware.com/news/n...olutio-gaming-thing-of-past-dlss-here-to-stay



In other words, they've given up on improving actual performance.
Can we really blame them when Devs keep releasing games with PS4 graphics and worse performance over and over?

https://youtu.be/Qv9SLtojkTU?t=1933

This video is bookmarked at the chapter titled 'Will Nvidia still focus on performance without DLSS at native resolutions?'

Bunch of stuff said that will probably make a few people bang their heads against their monitor over and over and over in rage so just be sure you want to watch it before you click :)

I have no doubt that they will continue to lead the pack in performance regardless of DLSS on or off. But I can't blame them for pushing DLSS so heavily when devs keep releasing games with dog shit performance and last-gen graphics. We have people touting BS claims like the 7900 XTX keeping pace with and even outperforming the 4090, because a lot of these games seem to be hampering performance on the competitor's HW. It's fucking wild man.
 
I have no doubt that they will continue to lead the pack in performance regardless of DLSS on or off. But I can't blame them for pushing DLSS so heavily when devs keep releasing games with dog shit performance and last-gen graphics. We have people touting BS claims like the 7900 XTX keeping pace with and even outperforming the 4090, because a lot of these games seem to be hampering performance on the competitor's HW. It's fucking wild man.

If Nvidia is Ngreedia wouldn't they still want the crown because that's the greedy thing to do right? 🤔
 
If Nvidia is Ngreedia wouldn't they still want the crown because that's the greedy thing to do right? 🤔
Yes, and don't forget that they hold the industry back by providing the best upscaling solution, which also makes them Ngreedier. :coffee:
Why put effort into further enhancing hardware performance when we can just work on software upscalers now?
-Nvidia
Why put effort into enhancing hardware performance when the competition can't keep up? We should just work on making our upscaler even better than it already is.
-Nvidia

There, fixed that for you.
 
And then when a game doesn't feature the software that marketing said would change everything, just make up stories to whip everyone up in a fit.
Now, if only the parties named would come clean and deny it instead of beating around the bush. That alone should tell you everything you need to know. Hell, this forum has gone nuclear on Nvidia for way less.
 
So you're saying you believe a soulless corporation that doesn't deny it? That's a big yikes.
So you’re saying you need zero proof to create a lynch mob and stir up hate against a corporation you hate (which also happens not to have done what you said they did), so you can prop up a different soulless corporation that you happen to love more?
 
Yes, and don't forget that they hold the industry back by providing the best upscaling solution, which also makes them Ngreedier. :coffee:

Well Linus Torvalds doesn't like them so that's all I need to hear thank you very much BAAAAAAAAAAAAH!!!!!!!!!!!!!!!! now feed me grains

 
I don’t like it but I get it.
5nm to 3nm yields about a 10-15% performance improvement at a 20-25% price increase.

Something has to give.
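
To put rough numbers on that trade-off (using the figures above, which are the poster's estimates rather than official foundry pricing), here's a quick sketch of what it does to performance per dollar:

```python
# Rough perf-per-dollar math for the node-shrink estimate above
# (10-15% more performance at a 20-25% higher price; illustrative only).
def perf_per_dollar_change(perf_gain: float, price_increase: float) -> float:
    """Return the relative change in performance per dollar."""
    return (1 + perf_gain) / (1 + price_increase) - 1

for gain, price in [(0.10, 0.20), (0.15, 0.25)]:
    delta = perf_per_dollar_change(gain, price)
    print(f"+{gain:.0%} perf at +{price:.0%} price -> {delta:+.1%} perf/$")
# Both cases come out around -8%, i.e. perf-per-dollar actually regresses,
# which is the "something has to give" problem.
```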

The new tools Unreal has been putting out allow devs to do more, a lot faster, but that automation comes at a cost.
Ray tracing and the like can be done fast, consistently, and comparatively cheaply (man-power wise), but computationally… ouch.

Based on the leaked specs of the 5090 and the current wafer/component costs that beast could be around $2500. That might get you 60fps native at 4K in the upcoming cyberpunk expansion.

I mean, if a game is using 4K textures to render at 1080p and upscale to 1440p, it won't look bad.

This levels the playing field. It's actually a pretty tough job figuring out what to turn off to allow for potato modes without breaking things. This gives the average player better options than potato mode while taking work off developers' plates, which should result in better products.

An imperfect solution to a shitty problem, but developers' visions are exceeding what the silicon can supply at a price we are willing to pay.
 
Ray Tracing and such there can be done fast, consistently, and comparatively cheap (man power wise), but computationally… ouch.

This was my understanding as well. Ray Tracing seems to be the way of the future because it's much less intensive to implement for the developer, as I understand it. We can sort of also view AI image processing as the next effective evolution of the GPU space in general, rather than continuing to try to brute force it.

And that's kind of a problem.

AMD's highest-end card (GPU) at the moment has, at best, the ray tracing performance of a 3090. Their upscaling solution is also worse. Their other general software features are worse. They have worse AI performance and very little driver/software support for it. Nvidia plays a ridiculously long game with some of their products. It shows, because my 2080, a 5-year-old card, is sitting over in another room cranking out images under Stable Diffusion on Linux, and thanks to xformers it's going reasonably fast and VRAM is kept to the minimum needed.

It's a problem because I don't see how AMD can really catch up. If this trend keeps up, Nvidia is going to increase their chokehold on the GPU space more and more as time goes on (not to mention the professional AI space). That might even extend to the low-end or budget space, if DLSS just keeps advancing in strides. I don't know what's going to happen, but I think either way it's going to be terrible for the consumer. I really want AMD to pull a turnaround like they did in the CPU space, but I don't know how it can happen. My last AMD card was an HD 5850. At least it played Crysis. I guess AMD is at least inside a decent number of consoles? Last I checked?
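
For anyone curious what that older-card setup looks like in practice, here's a minimal sketch using the Hugging Face diffusers library with xformers memory-efficient attention enabled. The model name and prompt are just illustrative, not a description of the poster's exact setup:

```python
# Minimal Stable Diffusion sketch for an older 8 GB card (e.g. an RTX 2080).
# Assumes: pip install torch diffusers transformers xformers
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # illustrative model choice
    torch_dtype=torch.float16,          # fp16 roughly halves VRAM for the weights
)
pipe = pipe.to("cuda")

# xformers memory-efficient attention cuts peak VRAM and speeds up attention
pipe.enable_xformers_memory_efficient_attention()

image = pipe("a photo of a graphics card on a workbench").images[0]
image.save("out.png")
```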
 
This was my understanding as well. Ray Tracing seems to be the way of the future because it's much less intensive to implement for the developer, as I understand it. We can sort of also view AI image processing as the next effective evolution of the GPU space in general, rather than continuing to try to brute force it.

And that's kind of a problem.

AMD's highest end card (GPU) at the moment basically has the ray tracing performance of a 3090 at the absolute highest. Their upscaling solution is also worse. Their other general software features are worse. They have worse AI performance and very little driver/software support for it. Nvidia plays a ridiculously long game with some of their products. It shows because my 2080, a 5 year old card, is sitting over in another room cranking out images under Stable Diffusion under Linux, and thanks to Xformers it's going pretty reasonably fast and the VRAM is kept to the minimum needed.

It's a problem because I don't see how AMD can really catch up. If this trend keeps up, Nvidia is going to increase their chokehold on the GPU space more and more as time goes on (not to mention the professional AI space). That might even extend to the low end or budget space, if DLSS just keeps advancing in strides. I don't know what's going to happen, but I think either way it's going to be terrible for the consumer. I really want AMD to pull a turnaround like they did in the CPU space, but I don't know how it can happen. My last AMD card was a 5850HD. At least it played Crysis. I guess AMD is at least inside a decent amount of consoles? Last I checked?

AMD also has to stave off Intel Arc and XeSS at the low end.
 
This was my understanding as well. Ray Tracing seems to be the way of the future because it's much less intensive to implement for the developer, as I understand it. We can sort of also view AI image processing as the next effective evolution of the GPU space in general, rather than continuing to try to brute force it.

And that's kind of a problem.

AMD's highest end card (GPU) at the moment basically has the ray tracing performance of a 3090 at the absolute highest. Their upscaling solution is also worse. Their other general software features are worse. They have worse AI performance and very little driver/software support for it. Nvidia plays a ridiculously long game with some of their products. It shows because my 2080, a 5 year old card, is sitting over in another room cranking out images under Stable Diffusion under Linux, and thanks to Xformers it's going pretty reasonably fast and the VRAM is kept to the minimum needed.

It's a problem because I don't see how AMD can really catch up. If this trend keeps up, Nvidia is going to increase their chokehold on the GPU space more and more as time goes on (not to mention the professional AI space). That might even extend to the low end or budget space, if DLSS just keeps advancing in strides. I don't know what's going to happen, but I think either way it's going to be terrible for the consumer. I really want AMD to pull a turnaround like they did in the CPU space, but I don't know how it can happen. My last AMD card was a 5850HD. At least it played Crysis.
Ray tracing performance I’m not terribly worried about currently. Even for the AMD stack it’s at a “good enough” stage. They might be a generation behind, but it was really only the 3000 series that finally made it viable, and most users are still on the 2000 and 3000 series anyway because the pricing on the 4000 series is a barf bowl of nasty.
By the time it becomes non-optional, things will have mellowed and it should be closer to parity, and the current 7000 or 4000 series parts will be two- if not three-generation-old parts.
The 7000 parts did a big jump over the 6000 for RT performance; I don’t expect as big of one next, but they will keep at it.
The game is a lot closer if you consider target refresh rates as the limiting factor.
Most mid-range 1080p or 1440p displays out there are still in the 75Hz range.
I’d have to dig out the numbers, but while the average PC gamer goes 2 cycles between CPU/GPU upgrades, they might go 4 before replacing the display.
 
Ray tracing performance I’m not terribly worried about currently. Even for the AMD stack it’s at a “good enough” stage. They might be a generation behind but it was really only the 3000 series that finally made it viable and most users are still on the 2000 and 3000 series anyways because the pricing on the 4000 series is a barf bowl of nasty.
By the time it becomes non optional things will have mellowed and it should be closer to parity and the current 7000 or 4000 series parts will be a two if not three generation old part.
The 7000 parts did a big jump over the 6000 for RT performance, I don’t expect as big of one next but they will keep at it.
The game is a lot closer if you consider target refresh rates as the limiting factor.
Most mid range displays out there for 1080p or 1440p are still in the 75hz range.
I have to dig it out but while the average PC gamer goes 2 cycles between CPU/GPU updates they might go 4 before replacing the display.


It's not just ray tracing though

They have to fight like 4 battles as pointed out (ray tracing, upscaling, AI, software)

Against both Nvidia at the high end and now Intel at the low end

Nvidia can bleed market share and revenue and pour globs of money into resources and development if need be. Intel doesn't even exist here; this market is what they're fighting for, so they have nothing to lose and can go for broke. AMD needs to start getting their shit together, all jokes aside, seriously.
 
If AMD wants to win the AI market on the consumer side, all they need to do is release cards with more VRAM. That's it. 32 GB, 48 GB, even 64 GB or more, and immediately, raw performance would take second place to the ability to actually load in huge data models.
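
As a rough illustration of why VRAM matters so much there, here's back-of-the-envelope math for how much memory just the weights of a large model need at different precisions (weights only; activations, KV cache, and framework overhead add more on top):

```python
# Approximate VRAM needed just to hold model weights, by precision.
# Real usage is higher (activations, KV cache, framework overhead).
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_gb(params_billions: float, precision: str) -> float:
    return params_billions * 1e9 * BYTES_PER_PARAM[precision] / 1e9  # GB

for model, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    line = ", ".join(f"{p}: {weights_gb(params, p):.0f} GB" for p in BYTES_PER_PARAM)
    print(f"{model} -> {line}")
# A 70B model is ~140 GB at fp16 but ~35 GB at 4-bit, which is why a
# hypothetical 48-64 GB consumer card would change what you can run locally.
```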
 
It's not just ray tracing though

They have to fight like 4 battles as pointed out (ray tracing, upscaling, AI, software)

Against both Nvidia at the high end and now Intel at the low end

Nvidia can bleed market share and revenue and pour globs of money into resources and development if need be, Intel doesn't even exist here it's what they're fighting for they have nothing to lose so go for broke, AMD needs to start getting their shit together all jokes aside seriously
More like a baseline, but I don’t see them getting their shit pushed in yet. FSR3 will be the make-or-break there. But as long as AMD is in the consoles, that’s the baseline; everything else is extra. Microsoft is working hard on unifying the memory registers for Windows to make it more console-like and simplify porting. But being only one generation behind when most users wait two or three generations to upgrade isn’t really a big deal.
For users on the 3000 series looking at the 4000 or the 7000, most things are a side grade and not worth it. But if you are on an RX 5000 or an RTX 2000, or god forbid older, then either is a large upgrade. Hell, I have friends who, until Diablo 4 and BG3, were happily gaming away on a 5th-gen Intel with a GTX 970, or maybe a 6th-gen with a 1060.
Anything they buy at this stage is a massive upgrade, and for them and their budgets the options are all the same. If $600 CAD is the GPU budget and the 27” 1080p display tops out at 120Hz, then they are all about equal.

It is very easy to stand at the top, look down at the AMD offerings, and call them shit. But when you are standing at the bottom looking up, both teams are offering things worth looking at, more so AMD than Nvidia though. I expect some bigger price cuts come mid-October for the 4060 and 4070 parts.
 
As someone who sees DLSS as nothing but a success, I find it difficult to fault their logic. If you're targeting a frame rate, DLSS will get you many more bells and whistles graphically than native. There's simply no argument about it. Some might quibble about whether DLSS at otherwise identical settings looks as good as native, but these people are sorely missing the point.
 
It is very easy to stand at the top and look down at the AMD offerings and call them shit. But when you are standing at the bottom looking up then both teams are offering things worth looking at, more so AMD than Nvidia though, I expect some bigger price cuts come middle Oct for the 4060 and 4070 parts.

I think right now it's fine because Nvidia's pricing is so bad on the budget parts. But give them a couple of years of development and pouring even more of their budget into DLSS... I'm worried that they're going to be able to take the budget market without much of a fight.

Then again, you've got a point. The lowest common denominator of gaming (and I don't mean this in a derogatory way necessarily; everyone has had times when they were getting by with a budget GPU for 5 years as a college student or what have you) exists. Even if Nvidia becomes objectively the better choice in every way, they won't necessarily care to upgrade. The problem starts happening if consoles want Nvidia more because Nvidia would allow them to push higher fidelity for cheaper via DLSS and their superior software solutions. Luckily, AMD already has their foot in the door in the console space, so at least that's a good thing.
 
As someone who sees DLSS as nothing but a success, I find it's difficult to fault their logic. If you're targeting a frame rate, DLSS will get you many more bells and whistle graphically than native. There's simply no argument about it. Some might quibble about if DLSS at otherwise identical settings looks as good as native, but these people are sorely missing the point.
If you want 120 fps, don’t want the game to look like something from a PS2, and don’t have the budget for a $1000+ GPU, then it’s a more satisfying option than settling for 60-ish FPS without it.
 
The problem starts happening if consoles want Nvidia more because Nvidia would allow them to push higher fidelity for cheaper via DLSS and their superior software solutions.

We can look at reactions to the Switch 2 (if rumors about it running the Matrix demo with DLSS 3.5? are true) to get a sense of whether this might occur and how soon.
 
As someone who sees DLSS as nothing but a success, I find it's difficult to fault their logic. If you're targeting a frame rate, DLSS will get you many more bells and whistle graphically than native. There's simply no argument about it. Some might quibble about if DLSS at otherwise identical settings looks as good as native, but these people are sorely missing the point.


Yeah, I'd say for sure that Nvidia's thinking is future-facing, and what they're really getting at here is 8K+ resolutions.
There's just no feasible way to run a 2024+ modern game at 8K or up without DLSS-type technology.

I also wouldn't be surprised if upscaling from 4K to 8K or 16K is their true goal. I highly doubt that they're obsessed with short-term results. They are going long, and they want the next decade, the RTX 5000 series and RTX 6000 series, to be 16K compatible.
 
I think right now it's fine because Nvidia's pricing is so bad on the budget parts. But give them a couple of years of development and pouring even more of their budget into DLSS... I'm worried that they're going to be able to take the budget market without much of a fight.

Then again, you've got a point. The lowest common denominator of gaming (and I don't mean this in a derogatory way necessarily; everyone had times when they were getting by with a budget GPU for 5 years as a college student or what have you). Even if Nvidia becomes objectively the better choice in every way, they won't necessarily care to upgrade. The problem starts happening if consoles want Nvidia more because Nvidia would allow them to push higher fidelity for cheaper via DLSS and their superior software solutions. Luckily, AMD already has their foot in the door in the console space, so at least that's a good thing.
Fortunately, as with all things, exponential generational improvements aren’t sustainable. DLSS will plateau; there is only so much software AI magic you can throw at something before the gains are academic or bar-graph material only. And that’s when a new must-have feature will arrive. So Nvidia can spend the big money for that extra 10%, but at some point that 10% is taking us from 500 fps to 550 fps; sure, it’s great mathematically, but not exactly a practical benefit.

Things just need to remain close enough that there isn’t a runaway leader at every price bracket, because that’s how we get monopolies. And I don’t care what anybody’s personal stance on the AMD, Intel, Nvidia grab bag of dicks is; we can all agree we’re better off with 3 than with 1.
 
Yeah, I'd say for sure that Nvidia's thinking is future-facing, and what they're really getting at here is 8K+ resolutions.
There's just no feasible way to make it easy to run a year 2024+ modern game at 8K or up without DLSS type technology.

I also wouldn't be surprised if upscaling from 4K to 8K or 16K is their true goal. I highly doubt that they're obsessed with short-term results. They are going long, and they want the next decade, the RTX 5000 series and RTX 6000 series, to be 16K compatible.

Also, people will hate this idea but whatever

Give me a secondary pure AI powered card and let it do AI NPC control and dialogue and in game weather systems etc etc etc

PhysX but for AI but for everything

Told you you'd hate it 😎👉👉
 
We can look at reactions to the Switch 2 (if rumors about it running the Matrix demo with DLSS 3.5? are true) to get a taste/sense of if this might occur/how soon
Well, based on every credible leak we’ve seen for the past year and change, it’s a slightly modified Jetson Orin Nano, which is a beast of a 7-15W SoC; there are quite a few reviews of it kicking around if you’re interested. But yeah, it uses an Ampere-based GPU, so DLSS and RTX are on the table.

Nintendo seem to be wizards at squeezing every ounce of compute out of a piece of kit. So I have no doubt a handheld doing 720p with a custom-trained DLSS upscale to 1080p would look and perform great, and 1080p-to-4K upscaling in docked mode would be better than what most consoles currently deliver.
 
Also, people will hate this idea but whatever

Give me a secondary pure AI powered card and let it do AI NPC control and dialogue and in game weather systems etc etc etc

PhysX but for AI but for everything

Told you you'd hate it 😎👉👉
Well if they are going to keep giving me PCIe slots that will remain forever empty then why the hell not. Still a better alternative to multi GPU render.
 
Well based on every credible leak we’ve seen for the past year and change it’s a slightly modified Jetson Orin Nano, which is beast of a 7-15w SoC, quite a few reviews of it kicking around if your interested. But yeah it uses an Ampere based GPU so DLSS and RTX are on the table

No, I mean when they're pumping out games for the Switch 2, let's see what devs think of what they put in vs what they get out - and the audience/customer reaction to it in terms of quality.

The Digital Foundry video showed there's possibly going to be some particle/volumetrics DLSS AI-enhanced render/reconstruction - that frees up a little more raster for other things, possibly (if, like ray reconstruction, it's available on all RTX GPUs and not exclusive to 4K or future 5K RTX GPUs). HW is only half the solution, as shown with all this - it's the actual SW you're running on it too. And (good, at least) devs like to do tricks and whatnot and squeeze blood out of stones with systems. If this stuff can't make it on a handheld, and docked for easy comparison, and people from both sides go "lol this sucks", we know any chance of the major consoles maybe ever wanting Nvidia probably just got shot the fuck down.
 
So Nvidia can spend the big money for that extra 10% but at some point that 10% is taking us from 500 fps to 550 fps sure it’s great mathematically but not exactly a practical benefit.
The more the merrier. We will be seeing frame generation making 1000Hz displays possible to drive, along with DLSS, at some point.

There is a practical difference, per researchers who have tested similar setups, in blur reduction and persistence reduction, plus of course input lag. Blur Busters has tested this with GPU and monitor prototypes.
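
To make the persistence point concrete, here's a quick sketch of frame time at various refresh rates, using the simplified sample-and-hold model (each frame held on screen for the full refresh period), which is the assumption Blur Busters-style analyses typically start from:

```python
# Frame time / sample-and-hold persistence at various refresh rates,
# assuming one new frame per refresh (simplified sample-and-hold model).
for hz in (60, 120, 240, 480, 1000):
    frame_ms = 1000.0 / hz
    print(f"{hz:>4} Hz -> {frame_ms:.2f} ms per frame of persistence")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 240 Hz -> 4.17 ms,
# 480 Hz -> 2.08 ms, 1000 Hz -> 1.00 ms
# Perceived motion blur scales roughly with persistence, so each doubling
# still halves it, but the absolute gains shrink as you go higher.
```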
 
The more the merrier. We will be seeing frame generation making 1000hz displays possible to drive along with DLSS at some point. There is a practical difference per researchers who have tested similar in blur reduction and persistence reduction, plus of course input lag. BlurBusters has tested this with gpu and monitor prototypes.

Also VR could benefit from a gajillion frames
 
No I mean when they're pumping out games for the Switch 2 let's see what devs think of what they put in vs what they get out - and the audience/customer reaction to it in terms of quality

The Digital Foundry video showed there's possibly gonna be some particle/volumetrics DLSS AI enhanced render/reconstruction - that frees up a little more raster for other things possibly - HW is only half the solution as shown with all this - it's the actual SW you're running now on it too - and (good at least) devs like to do tricks and whatnot and squeeze blood out of stones with systems - if this stuff can't make it on a handheld and docked for easy comparison and people from both sides go "lol this sucks" we know any chances of the major consoles maybe ever wanting Nvidia probably just got shot the fuck down
Nvidia has a big uphill battle for Microsoft and Sony consoles, and no, not from a 20-year-old perceived slight.
Backwards compatibility is a big impending issue that digital copies and right to repair are forcing forward. Changing to Nvidia would essentially shatter any hope of that, and I’m not sure Microsoft or Sony are willing to fight that fight.
The EU is already working to tackle what happens to digital copies of a game when the store is locked out on a newer console. So a PS6 might have to be able to install and run your digital PS5 games; it won’t have to upgrade textures or add features, but it would have to work.
So I’m not sure Microsoft or Sony are willing to risk running afoul of that just yet. I’m sure Nvidia could develop a full translation layer or emulator set, but that’s one hell of a cost-plus item.
 
The more the merrier. We will be seeing frame generation making 1000hz displays possible to drive along with DLSS at some point.

There is a practical difference per researchers who have tested similar in blur reduction and persistence reduction, plus of course input lag. BlurBusters has tested this with gpu and monitor prototypes.
But is 1000Hz practically better than 800Hz?
Sure, 60 to 120, or 120 to 240, yeah, those are statistically significant, but 240 to 480, or 480 to 960? At what point is the gain, while statistically significant, essentially irrelevant?
 
Nvidia could be doing a lot more with their developer relationships - and could really make people not want to go without their GPUs.

As it stands, DLSS is their most important tech in terms of affecting gamers. And it's not in most games. And why the hell isn't it? They've had DLSS 2.0 out for 4 years and there are still a whole lot of games shipping without it. It can stretch the life of your 1440p monitor, or make your 4K TV pretty usable for your gaming laptop or otherwise nowhere-near-4K-raster-capable desktop, etc. With DLSS as an option, there isn't a reason to have anything better than a ~4070 Ti, because DLSS Quality at 4K gets you solid 4K-like image quality with 1440p framerates.
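
For reference on that "DLSS Quality at 4K is 1440p-class cost" point, here's a small sketch using the commonly cited per-axis scale factors for each DLSS mode (these are the widely reported defaults; individual games can override them):

```python
# Internal render resolution for DLSS modes at a 3840x2160 output,
# using the commonly cited per-axis scale factors (games can override these).
OUTPUT = (3840, 2160)
MODES = {
    "Quality": 2 / 3,        # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

for mode, scale in MODES.items():
    w, h = (round(d * scale) for d in OUTPUT)
    print(f"{mode:<17} -> {w}x{h}")
# Quality at 4K comes out to 2560x1440, i.e. 1440p-class shading cost
# for a 4K-class output image.
```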

Nvidia isn't even packing a game with their GPUs.

In that recent Digital Foundry video, they said that DLSS and their other features sometimes/often don't get implemented as well as they could, because development schedules choke out the amount of time devs can spend focusing on them and working with Nvidia.

Well, Nvidia, you should be improving your relationships with developers so that you can be involved a lot earlier, and make sure that the final game A) has your features at all and B) has them implemented well.

Elden Ring sold over 20 million copies in its first year. Doesn't have DLSS. It eventually got an update which added ray tracing. Still doesn't have DLSS.

Ray tracing in most games sucks. It barely makes a visual difference except in a couple of games, and it's a drastic performance hit. Because of that, and the fact that DLSS isn't in the majority of games, AMD can sell me a card which is even worse at ray tracing performance but packs in a game and otherwise beats Nvidia in raster for the price. All of their cards do this.
 
Nvidia has a big uphill battle for Microsoft and Sony consoles, and no not from a 20 year old perceived slight.
Backwards compatibility is a big impending issue that digital copies and right to repair are forcing forward. Changing to Nvidia there will essentially shatter any hopes of that, and I’m not sure Microsoft or Sony are willing to fight that fight.
The EU is already working to tackle what happens to digital copies of a game when the store is locked out of a newer console. So a PS6 might have to be able to install and run your digital PS5 games, it won’t have to upgrade textures or add features but it would have to work.
So I’m not sure Microsoft or Sony are willing to risk running afoul of that just yet. I’m sure Nvidia could develop a full translation layer or emulator set but that’s one hell of a cost plus item.

Not so sure of that - Microsoft's Hyper-V-based backwards-compatibility emulation was pretty amazing last console gen and was the first sign I noticed of how important SW was becoming (yes, even MS has their moments, even with Windows 11 standing right there in front of all of us). Edit: and look at the HW that had to run on.

Microsoft also has plenty of experience now with Windows (which Xbox OS is a derivative of) on ARM.

Sony's no slouch either; they could pull it off if they wanted, plus $ buys everything, even talent. But regardless of who wins whatever console gen, or whether a game is poorly optimized or not, Microsoft usually has the better console SW IMO (UI is subjective) - and they kinda should, I would hope, right? SW is their day job after all.
 
Nvidia could be doing a lot more with their developer relationships----and could really make people not want to go without their GPUs.

As it stands, DLSS is their most important tech, in terms of affecting gamers. And its not in most games. And why the hell isn't it? They've had DLSS 2.0 out for 4 years and there are still a whole lot of games shipping without it. It can stretch the life of your 1440p monitor. Or make your 4K TV pretty usable, for your gaming laptop or otherwise no where near 4K raster capable desktop, etc. With DLSS as an option, there isn't a reason to have anything better than ~4070 ti. Because DLSS Quality at 4K gets you solid 4K-like image quality, with 1440p framerates.

Nvidia isn't even packing a game with their GPUs.

In that recent Digital Foundry video, they said that DLSS and others of their features sometimes/often don't get implemented as well as they could, because development schedules choke out the amount of time devs can focus on it and work with Nvidia.

Well Nvidia, you should be improving your relationships with developers, so that you can be involved a lot earlier. And make sure that the final game A. has your features at all B. has them implemented well.

Elden Ring sold over 20 million in its first year. Doesn't have DLSS. Eventually released an update which added Ray Tracing. Doesn't have DLSS.

Ray Tracing in most games sucks. It barely makes a visual difference, except in a couple of games. And is a drastic performance hit. Because of that and the fact that DLSS isn't in the majority of games: AMD can sell me a card which is even worse at Ray Tracing performance, but packs a game and otherwise beats Nvidia in raster, for the price. All of their cards do this.
The thing with ray tracing is, if the art department has done their job well - gone around and done their shadow work, their light maps, their texture material properties, and worked with development to figure out their surface reflections and so on - then it will look great, and ray tracing is not going to add much there.
But one approach is done by a team of dozens of talented artists, getting paid relatively well, spending months if not years painstakingly combing over the game to get it all just right.
The other one is done by Jerry and $300,000 in workstation hardware over a weekend.

There is a significant cost benefit to Jerry, because the results are nearly indistinguishable, as you have said, and that hardware can keep doing that for 5 years: a single hardware purchase and one guy's salary to replace an entire team of salaried employees.
All at the cost of us needing to spend an extra $50-$100 on our next GPU purchase.

It's the "we could spend a year optimizing to use less RAM, or we could tell them to buy more RAM" argument all over again.
 