Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay

I don't even mean doing specific data for every game. That undoubtedly takes time/money, although I think that some developers with bigger budgets should seriously consider doing it. On console, I have no idea why Microsoft isn't using their Azure cloud to train AI upscaling specifically for every single first party game. It could give them a big quality advantage over Sony.

I just mean: get some data from at least a few actual games and apply that to their otherwise general algorithm used for all games (a rough sketch of that per-game idea is at the end of this post). For example, they said that DLSS 3.5 has been tested on a substantially larger dataset than ever before, and the first title with the results from that has obvious visual problems. It would seem that only getting data from a couple of idealized test environments is hindering the quality of new features.

And a direct example of that: they went to the trouble of creating a tech demo of a bar which looks a whole lot like Cyberpunk, but isn't Cyberpunk. And I am sure that demo's visuals are tuned to be ideal for DLSS and Ray Reconstruction, as opposed to simply partnering with CDPR, featuring a bar scene from Cyberpunk, and getting DLSS + RR properly tuned for that. A real game.

Did they say how many games they use/what's in the data set? I really can't recall.
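For what it's worth, here is a minimal, purely illustrative sketch of the "general model, fine-tuned per game" idea from above. This is nothing like Nvidia's or Microsoft's actual pipelines; the TinyUpscaler network, the finetune_on_game helper, and the random tensors standing in for captured frame pairs are all hypothetical:

```python
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Hypothetical stand-in for a pretrained, general-purpose 2x upscaler."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # predict a 2x2 block of sub-pixels per pixel
            nn.PixelShuffle(2),                  # rearrange into a 2x-upscaled RGB image
        )

    def forward(self, x):
        return self.net(x)

def finetune_on_game(model, game_pairs, epochs=3, lr=1e-4):
    """Fine-tune an already-trained general model on (low_res, high_res) frame
    pairs captured from one specific title."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()
    for _ in range(epochs):
        for lo, hi in game_pairs:
            opt.zero_grad()
            loss_fn(model(lo), hi).backward()
            opt.step()
    return model

# Fake "per-game" dataset: random tensors standing in for 64x64 -> 128x128 crops.
pairs = [(torch.rand(1, 3, 64, 64), torch.rand(1, 3, 128, 128)) for _ in range(8)]
model = finetune_on_game(TinyUpscaler(), pairs)
```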
 
"...we're using a mix of data from an unreal engine sort of setup that we've built for DLSS......as well as a lot of data from Nvidia Omniverse..."

Sounds like they'd have to build more and more varied setups/environments/maps in Unreal at least, and throw in whatever other engines with duplicate maps/data they can get their hands on (that they're legally allowed to use and have agreements with people for).
 
I think it's because nVidia can't raster and so they quit. Gotta use voodoo and fairy dust.
 
They're just training our games on everyone's factories rendered in Unreal Engine. If we were playing Factory Simulator 2024, the realism would punch you in the face 👌
I don't know anything much about that.
But the platform is getting a lot of tie-ins and integration; it's very quickly becoming a common tool.
Common tools get used; the more they get used, the more places they end up, and the more places they are, the more projects use them.

The more projects that use it, the better those projects work with Nvidia's stuff out of the gate, with less work needed later.

It becomes a sort of self-fulfilling prophecy.
 
They show it off a lot being used with 3d layouts of factories for robotics (it looked like) from all the times I remember seeing it

Edit: It's when all the AI stuff actually clicked for me. One of the times they were showing it off, GTC 2016 or something or other, Jensen was on stage going "How do you train your models? You throw them in a 3D rendered world and let them run around, basically. We've been rendering 3D worlds for players to run around in for years; this is what we do and are uniquely set up to do!"

I was like:

[reaction image]
 
Nvidia did a masterful job of turning hardware into software lmao. Now they will just have hardware good enough for gaming and just lock out older generations from new tech and sell you the value of software upscaling. Kinda been doing that, but it's only going to get deeper.
 
They show it off a lot being used with 3d layouts of factories for robotics (it looked like) from all the times I remember seeing it
Ah, I suppose so; okay, that tracks. But I think that has more to do with them showing off how accurately the programmed assets simulate real-life environments and how quickly remote teams can collaborate on complex tasks.
 
Also for like Amazon factory 'go fetch a package' droid type things
 
Nvidia did a masterful job of turning hardware into software lmao. Now they will just have hardware good enough for gaming and just lock out older generations from new tech and sell you the value of software upscaling. Kinda been doing that, but it's only going to get deeper.
They have been software based since Vista, that’s been their thing for a long time.
Nvidia makes awesome software sets and tools then builds hardware designed to make it work as well as it can.
That’s their deal.
 
I agree, but I wasn't arguing that, though. I am just talking about their gaming push over the last few gens.
 
I've been PC gaming since the mid-90's, and while the past details are fuzzy (or maybe I did not care back then), was there always this much uproar over new graphical features being added to games? Why is everyone clamoring for rasterization when Nvidia is actually trying to make it so that does not even matter AND you will get better graphics? I find it all confusing; I'm all about new tech that can make games look much better.

Better graphics is debatable. DLSS or DLAA to me look like crap. Better than FXAA, mind you, but I don't like them. I think motion clarity is also a problem. When it's the base image with no upscaling, no AA of any sort, and no post-processing, the image (to me) plays and feels much better.
 
I agree, but I wasn't arguing that, though. I am just talking about their gaming push over the last few gens.
I'm thinking they shifted their marketing to an "if you build it, they will come" sort of deal.

Build the tools that make developers' lives easier; they use the tools; the tools are optimized for Nvidia's hardware.

So because things get done faster and cheaper in Nvidia suites, and run better on Nvidia hardware with less work, more people buy Nvidia hardware, because it reviews better for the products people want.
 
When it's the base image with no upscaling, no AA of any sort, and no post-processing, the image (to me) plays and feels much better.

You don't play with AA on even today? Or do you uprez/play at 110%+? I turn motion blur off 100% of the time no matter the game. Don't care if we have more life like motion blur.

Kinda been doing that, but it's only going to get deeper.

Like this, but 'deeper'

[the "Enhance!" GIF from Super Troopers]
 
Better graphics is debatable. DLSS or DLAA to me look like crap. Better than FXAA, mind you, but I don't like them. I think motion clarity is also a problem. When it's the base image with no upscaling, no AA of any sort, and no post-processing, the image (to me) plays and feels much better.
I look at it this way: say I want 60+ fps.
What settings do I need to maintain that?
Say my card can't: what settings do I need to turn down to achieve that, and does the game look better with upscaling than with those lower settings?

Additionally, is it easier from the developer's perspective to implement DLSS/FSR/XeSS than it is to make the game engine stable with no shadows or a lower polygon count?

Does implementing upscaling take less time and effort than finding a way to make potato mode not crash the game?

Upscaling is not for anybody who has the new stuff or can still manage 60+ fps. But everybody else... that's who wants it. We here are sitting closer to the top looking down on it, but many others see it as the thing that lets them get close to what people with much larger budgets already have. It's an equalizer of sorts.

The real question: do high settings with upscaling in performance mode look and feel better than that same game on all lows?

That's the comparison: if you look at reviews starting with things on all low and then work up from there, the tech looks a lot better than if you start with max and start turning things down.
 
Okay, has anything been made with Lumen? How many games support it and has it been trialed side by side in any of them? Legitimately curious as this might be the first time I'm hearing of it.

I would say if Hogwarts Legacy is anything to go by, RT(/Lumen/whatever) is going to keep getting more and more traction as time goes on. It looked pretty good in that game.

It’s part of Unreal Engine 5, so it’s a safe bet we’ll be seeing a lot of Lumen use in the future. From what I’ve seen, it looks more or less as good as RT without nearly as much of a performance hit, so I expect we’ll be seeing lots of it.
 
IMHO it depends on the type of game. Upscaling is here to stay for AAA single player RPGs. CyberPunk 2077, Starfield, etc. These sorts of games are a bling war to make the best looking game ever.
You don't play with AA on even today? Or do you uprez/play at 110%+? I turn motion blur off 100% of the time no matter the game. Don't care if we have more life like motion blur.
You don't need AA if your pixel size is small enough. I've needed it for quite a while, but once upon a time I had a Sony GDM-FW900 24" CRT. I saw no need for AA running at 2304x1440. At that res pixels would just smear together on their own. No need to waste GPU power on AA. I'd bet it'd be about the same on a smaller 4k screen, especially if it's non-glossy with an AG coating. That'll add some smear. I run a 43" 4k screen and have it set up at programming distance to maximize usable real estate, and I need some sort of AA. DLSS quality often looks better than native with no AA.
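For a rough sense of the pixel-density argument, here is a back-of-the-envelope sketch. The FW900's nominal 24" diagonal and the viewing distances are assumptions, purely for illustration:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def arcmin_per_pixel(ppi_val, viewing_distance_in):
    """Angular size of one pixel, in arcminutes, at a given viewing distance."""
    pixel_pitch = 1.0 / ppi_val  # inches
    return math.degrees(2 * math.atan(pixel_pitch / (2 * viewing_distance_in))) * 60

# (name, PPI, assumed viewing distance in inches)
setups = [
    ("FW900-style CRT, 2304x1440 @ 24\"", ppi(2304, 1440, 24), 28),
    ("43\" 4K panel, 3840x2160 @ 43\"", ppi(3840, 2160, 43), 24),
]

for name, p, dist in setups:
    print(f"{name}: {p:.0f} PPI, ~{arcmin_per_pixel(p, dist):.2f} arcmin/pixel at {dist}\"")
```

With these assumed numbers the 43" 4K panel actually comes out slightly lower in PPI than the CRT, and its pixels are angularly larger at desk distance, which lines up with still wanting some AA on it.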
 
You don't need AA if your pixel size is small enough. I've needed it for quite a while, but once upon a time I had a Sony GDM-FW900 24" CRT. I saw no need for AA running at 2304x1440. At that res pixels would just smear together on their own.
I had a 24" 4k ips panel once... Still needed anti aliasing or I'd see shimmer and moire patterning.
 
You don't need AA if your pixel size is small enough. I've needed it for quite a while, but once upon a time I had a Sony GDM-FW900 24" CRT. I saw no need for AA running at 2304x1440. At that res pixels would just smear together on their own. No need to waste GPU power on AA. I'd bet it'd be about the same on a smaller 4k screen, especially if it's non-glossy with an AG coating. That'll add some smear. I run a 43" 4k screen and have it set up at programming distance to maximize usable real estate, and I need some sort of AA. DLSS quality often looks better than native with no AA.

I'm at 27" @ 1440p; on things like power lines or most angled edges I notice the jaggies if playing native, no upscale, without AA on. Wish I could unsee it, so I could turn AA off and use that performance elsewhere.
 
I had a 24" 4k ips panel once... Still needed anti aliasing or I'd see shimmer and moire patterning.
Could be made up in my mind, but sometimes I feel like CRTs had a bit more natural AA going on because of the way the tech works.
 
I miss CRT's, but only because of all the work needed to make my Superscope work on a flatscreen...
 
Okay, has anything been made with Lumen? How many games support it and has it been trialed side by side in any of them? Legitimately curious as this might be the first time I'm hearing of it.
Fortnite and Immortals of Aveum are two titles I can think of in the wild with Lumen support.

One big advantage of Lumen is not needing to do the usual manual lighting work, so chances are that many games with it will have it always on, like Immortals of Aveum.


https://www.youtube.com/watch?v=2USR6QTMaA0&t=1s

https://www.youtube.com/watch?v=O6GC8TZbJmI
 
You don't play with AA on even today? Or do you uprez/play at 110%+? I turn motion blur off 100% of the time no matter the game. Don't care if we have more life like motion blur.

I almost never use any upscaling or AA of any variety, be it 1080p(laptop) or 4k(desktop). No motion blur, no film grain, no chromatic aberration nonsense, etc.

The only time I do is to test out Cyberpunk with RT (DLSS) but I actually play without it.

I look at it this way: say I want 60+ fps.
What settings do I need to maintain that?
Say my card can't: what settings do I need to turn down to achieve that, and does the game look better with upscaling than with those lower settings?

Additionally, is it easier from the developer's perspective to implement DLSS/FSR/XeSS than it is to make the game engine stable with no shadows or a lower polygon count?

Does implementing upscaling take less time and effort than finding a way to make potato mode not crash the game?

Upscaling is not for anybody who has the new stuff or can still manage 60+ fps. But everybody else... that's who wants it. We here are sitting closer to the top looking down on it, but many others see it as the thing that lets them get close to what people with much larger budgets already have. It's an equalizer of sorts.

The real question: do high settings with upscaling in performance mode look and feel better than that same game on all lows?

That's the comparison: if you look at reviews starting with things on all low and then work up from there, the tech looks a lot better than if you start with max and start turning things down.

Lots of people use DLSS to fix pixel shimmering and clean up aliasing. But for whatever reason it looks blurry to me. I've tried the whole sharpening thing at multiple levels and it's just a no for me. Movement with it on just puts me off; it's annoying as all hell because I'm spoiled from playing PC games in the 1990s and early 2000s with high refresh rate CRTs. My wife and kids don't care at all about 30fps or those things, but for me it's off-putting.
 
On console, I have no idea why Microsoft isn't using their Azure cloud to train AI upscaling specifically for every single first party game. It could give them a big quality advantage over Sony.
In the last DF interview with the Nvidia person in charge of DLSS, they did talk a little bit about why using real games as the training data was a big issue with DLSS 1.x.

Real games tend to be chaotic, and no two runs are exactly the same (maybe they could force that for an in-house engine, but physics engines will often tend to be a bit random). So if the game code does not allow rendering the high-quality and low-quality images into two different buffers at the same time, you will never train the low-to-high model on the exact same frame; particles, fire and smoke effects, etc. are extreme examples of bad training data. And games can often be limited in how good the high-quality side will be: do they get a bit strange at the 8-10K or 16K resolutions used, do the textures and models even look that good? Nvidia trains on Pixar-level renders at super high resolution.

According to the leaked documents, ML-trained supersampling and other such features will be a focus for the 2028 Xbox, but I am not sure if they will use games for the training. Maybe you can throw them into the mix if the game engine was made with it in mind from the start, able to generate a superb image to train on, as well as a lower-resolution one, in different buffers at the same time, using the exact same data from the current run.
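That "same state, two buffers" point is the crux. Here is a minimal sketch of what capturing aligned training pairs could look like; the render() and simulate_effects() functions are hypothetical stand-ins for engine hooks, and the shared seed is what keeps stochastic content (particles, smoke) identical in both renders:

```python
import numpy as np

def simulate_effects(seed):
    """Stand-in for the stochastic part of a frame (particle positions, etc.).
    Seeding it makes the 'random' content identical for both renders of the frame."""
    rng = np.random.default_rng(seed)
    return rng.uniform(0.0, 1.0, size=(32, 2))  # 32 particles in normalized screen space

def render(state, particles, width, height):
    """Stand-in for an engine render call; a real engine would rasterize/trace here."""
    img = np.full((height, width), state["sun_angle"] % 1.0, dtype=np.float32)
    for x, y in particles:  # splat each particle as a bright pixel
        img[int(y * (height - 1)), int(x * (width - 1))] = 1.0
    return img

def capture_training_pair(state, frame_idx):
    """Render the *same* frame (same state, same particles) at two resolutions into
    two separate buffers, so the low/high pair is perfectly aligned."""
    particles = simulate_effects(seed=frame_idx)
    hi = render(state, particles, 1280, 720)  # "ground truth" buffer (small sizes keep the sketch light)
    lo = render(state, particles, 640, 360)   # network-input buffer
    return lo, hi

pairs = [capture_training_pair({"sun_angle": 0.1 * i}, i) for i in range(4)]
print([(lo.shape, hi.shape) for lo, hi in pairs])
```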
 
chromatic aberration

Yeah that's another one - I leave it on if it looks good and you can't really notice it

In the titles where they make it look overdone like those old Red/Blue 3D glasses images I turn it off
 
No thanks I have an irrational fear I'm gonna drop a CRT on my foot and die like some old Microsoft guy did

And I don't even own any CRTs anymore
I have a 32” Citizen 480p TV but it’s starting to fail, I can’t find parts and when it’s gone it’s gone. I doubt I have the energy to source a donor let alone deal with the logistics of getting one shipped to me.

Old content looks far better on a CRT than an LCD. I’ll be a little sad when it finally packs it in.
 
I almost never use any upscaling or AA of any variety, be it 1080p(laptop) or 4k(desktop). No motion blur, no film grain, no chromatic aberration nonsense, etc.

The only time I do is to test out Cyberpunk with RT (DLSS) but I actually play without it.



Lots of people use DLSS to fix pixel shimmering and clean up aliasing. But for whatever reason it looks blurry to me. I've tried the whole sharpening thing at multiple levels and it's just a no for me. Movement with it on just puts me off; it's annoying as all hell because I'm spoiled from playing PC games in the 1990s and early 2000s with high refresh rate CRTs. My wife and kids don't care at all about 30fps or those things, but for me it's off-putting.
DLSS looks "soft," aka fuzzy, for sure, but that looks one hell of a lot better than potato mode.

Depending on the panel you're using, it's hard to tell where the DLSS ends and the $399 TLC off-brand screen begins.

Using a good screen it's obvious; using a house-brand TV or display it takes some work to figure out, and in motion, when you're having fun just playing, I'm not sure you notice it at all.
 
IMHO it depends on the type of game. Upscaling is here to stay for AAA single player RPGs. CyberPunk 2077, Starfield, etc. These sorts of games are a bling war to make the best looking game ever.

You don't need AA if your pixel size is small enough. I've needed it for quite a while, but once upon a time I had a Sony GDM-FW900 24" CRT. I saw no need for AA running at 2304x1440. At that res pixels would just smear together on their own. No need to waste GPU power on AA. I'd bet it'd be about the same on a smaller 4k screen, especially if it's non-glossy with an AG coating. That'll add some smear. I run a 43" 4k screen and have it set up at programming distance to maximize usable real estate, and I need some sort of AA. DLSS quality often looks better than native with no AA.

I had a 24" 4k ips panel once... Still needed anti aliasing or I'd see shimmer and moire patterning.

The engines basically expect it to be present at this point. They all have effects that are intended to converge over multiple frames - when this isn't happening, they're going to look grainy, noisy, or fizzle.

And with PBR and actual shiny materials, shader aliasing becomes a problem. MSAA can't save you here because you're getting aliasing from something that isn't a geometric edge. You will get weird sparkles, fireflies, or shimmer.
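A toy illustration of the "converge over multiple frames" point: many of these effects take one noisy sample per pixel per frame and lean on a temporal accumulation pass (TAA/DLSS-style history blending) to average it out; remove that pass and the raw per-frame noise shows up as grain or fizzle. The numbers below are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

true_value = 0.6   # the value an effect is trying to resolve (e.g. soft-shadow coverage)
frames = true_value + 0.3 * rng.standard_normal(64)   # one noisy sample per frame

alpha = 0.1        # history blend weight, similar in spirit to a TAA accumulation factor
history = frames[0]
for sample in frames[1:]:
    history = alpha * sample + (1 - alpha) * history  # exponential moving average over frames

print(f"single-frame error ~{abs(frames[-1] - true_value):.2f}, "
      f"accumulated error ~{abs(history - true_value):.2f}")
```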
 
MSAA can't save you here because you're getting aliasing from something that isn't a geometric edge. You will get weird sparkles, fireflies, or shimmer.
Yep, this was true back when I had the 24" panel back in early 2014. Nowadays it's a given.
 
Well, shareholders should study some basic economic theory.

It is the purpose of any business to satisfy the needs of its customers at the most efficient price possible.

If the price is not efficient, then there has been a breakdown of competition, and if there is no competition, you don't have a free market, so the whole capitalist model falls apart.

When businesses try to "shape" the marketplace to their advantage, it is really sketchy, and probably should be illegal.

Businesses' role is to be neutral observers of consumer desires and then try to fill those desires in ways that out-compete their opponents. That is it.
I hate to say this but we're "sheep."

The reason why businesses don't bend to our demands is because, at the end of the day, none of us are truly going to vote with our wallet if we want the absolute best option, even if that option goes against our principles and beliefs. At the end of the day, PC gamers are going to fork over the money to have the best, even if they have to do so kicking and screaming, hence why Nvidia is selling these GPUs at these prices. I looked at my options when I was upgrading, and they were either spend $500 and get a 25-30% boost in performance, or spend $800 and get a damn near 70% boost in performance plus next-gen features. When I upgrade my GPU I like to take big leaps, as it's the most expensive part I'd invest in, so in my mind go as big as I can or don't bother at all. So I begrudgingly spent $800+ on a new GPU, because I love PC gaming and tinkering with my PC from time to time. A lot of people have that mindset as well; they're willing to fork over the cost, because just like iPhone users, people want the latest and greatest, even though more financially sound and functionally viable purchasing options exist.

People say "vote with your wallet," but no one is actually going to do it in the end. Sure, they might hold out for a few years, but then they'll get that itch to upgrade and will drop the cash on that expensive new GPU/CPU because they want the latest.

I hate having to type all that because, as a working-class individual that's had to stay in the mid-range the entire time he's built PCs, it sucks that for me to ever have the best I'd have to sell my firstborn or a kidney just to do so. Unfortunately there are those out there, wallets open, ready to do so, and Nvidia and AMD are willing to oblige, which is why prices are where they are: people showed they were willing to pay inflated prices during the shortage, and now it's come back to bite us in the ass.
 
My response is basically "speak for yourself," because I've been more than happy to buy discounted, second-hand, one-plus-generation-old hardware for at least 10 years.

I would actually say that the reasons to upgrade, if all you want to do is game, are declining. A majority of the market is buying video cards that cost less than $500. Here at the [H], among mega enthusiasts, is the only place you're going to see a cross-section skewed toward buying top-end video cards, because it's their hardcore hobby.

But if you look at actual sales rates and percentage of ownership, even "just" the Steam survey, it's all 1660s, RX 580s, 3060/Tis, etc. Most people are happy to continue to game at 1080p. And they're also willing to wait over 4 years for an upgrade in that same low-mid to mid range that is compelling enough to be worth it. Which is likely why cards such as the 7700 XT and 7800 XT will sell well to buyers looking for a reasonable swing if they're still on a 2000-series or earlier card from Nvidia, or a 5700 XT or RX 580 from AMD.

To get a solid midrange card is in a sense cheaper than ever before (if you include the second hand market) if all you want is 1080p medium and “over 60fps”. And that is most of the market.

Not satisfied with the current market? That’s fair, I don’t think anyone is. But we know that cards are stacking up in warehouses. And there is nothing forcing basically anyone to have to buy anything new. Vote with your wallet really does work. Be patient. And don’t care what everyone else has.
 
Some might argue this is because there is insufficient competition in the marketplace.

It shouldn't be a matter of voting with your wallet by abstaining. It should be a matter of rejecting one player and buying from another instead, constantly pitting them against each other to our benefit.
 
Found this by accident and it's adjacently related.

Microsoft GitHub update for an ML-based upscaler. Looks like they have been working on it for a while.

https://github.com/microsoft/Direct...perResolution/DirectMLSuperResolution.vcxproj

The functionality is now mentioned in the Microsoft DirectML introduction page as well.

https://learn.microsoft.com/en-us/windows/ai/directml/dml-intro

Denoising and super-resolution, for example, allow you to achieve impressive raytraced effects with fewer rays per pixel.

And looking into that led me here.
https://docs.unrealengine.com/5.2/en-US/temporal-super-resolution-in-unreal-engine/
Looks like UE5.2 has its own integrated upscaler that is being worked into their tool set.
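On that "fewer rays per pixel" line from the DirectML page, here is a toy sketch of the underlying trade-off: Monte Carlo noise falls off roughly as 1/sqrt(rays per pixel), so a denoiser (a crude box filter below, standing in for a real ML denoiser) lets you get away with far fewer rays. Everything here is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def shade_pixel(rays_per_pixel):
    """Monte Carlo estimate of one pixel's lighting: the mean of noisy ray samples."""
    samples = 0.5 + 0.5 * rng.standard_normal(rays_per_pixel)  # fake per-ray radiance
    return samples.mean()

def render(n, rays_per_pixel):
    return np.array([[shade_pixel(rays_per_pixel) for _ in range(n)] for _ in range(n)])

def box_denoise(img):
    """Crude 3x3 box filter as a stand-in for a real (ML) denoiser."""
    padded = np.pad(img, 1, mode="edge")
    return sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0

noisy_1rpp = render(32, 1)     # 1 ray per pixel: cheap but noisy
clean_64rpp = render(32, 64)   # 64 rays per pixel: ~8x less noise, 64x the cost
print("noise @ 1 rpp:          ", round(noisy_1rpp.std(), 3))
print("noise @ 64 rpp:         ", round(clean_64rpp.std(), 3))
print("noise @ 1 rpp + denoise:", round(box_denoise(noisy_1rpp).std(), 3))
```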
 