Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay

Not so sure of that - Microsoft's Hyper-V-based backwards-compatibility emulation was pretty amazing last console gen and was one of the first signs I noticed of how important SW was becoming (yes, even MS has their moments, even with Windows 11 standing right there in front of all of us). Edit: and look at the HW it had to run on

Microsoft also has plenty of experience now with Windows (which Xbox OS is a derivative of) on ARM

Sony's no slouch either; they could pull it off if they wanted, plus $ buys everything, even talent. But regardless of who wins whatever console gen, or whether whatever game is poorly optimized or not, Microsoft usually has the better console SW IMO (UI is subjective) - and they kinda should, I would hope, right? SW is their day job, after all.
I'm not worried about ARM, that's easy, but going from a modified RDNA architecture with a modified instruction library to a completely different Nvidia one is a very different matter.
I have absolutely no doubt that it could be done, I know it is 100% possible, but who is paying for that development time? Microsoft and Sony won't want to foot that bill unless Nvidia can bring something to the table that completely offsets it, and Nvidia wouldn't want to put in that work with no guaranteed payoff.
x86 to ARM has been forced by external market factors: ARM is gaining in popularity, and Microsoft knows they can't ignore it, certainly not in the data center. Consumer options for it increase yearly, and Microsoft leadership this go-around seems to actually have a head on its shoulders and is working to stay ahead of things for once, not be completely reactionary.
The ARM translation libraries exist and lots of people are contributing to those, but I doubt anybody is working on an RDNA 1.5 to Lovelace translation layer.
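For a sense of what "translation layer" means at the instruction level, here's a deliberately toy Python sketch - the opcode names are invented, not real RDNA or Lovelace instructions - just to show why the easy 1:1 mappings aren't the problem; everything without a cheap mapping is where the unfunded engineering lives:

# Toy illustration only: invented opcodes, not any real ISA.
FAKE_RDNA_TO_FAKE_NV = {
    "v_add_f32": "FADD",   # simple ALU ops translate almost mechanically
    "v_mul_f32": "FMUL",
    "v_fma_f32": "FFMA",
}

def translate(shader):
    translated = []
    for op, *operands in shader:
        if op not in FAKE_RDNA_TO_FAKE_NV:
            # Wave size, memory layout, scheduling quirks, scalar/vector split...
            # this branch is the part nobody wants to pay for.
            raise NotImplementedError(f"no cheap mapping for {op}")
        translated.append((FAKE_RDNA_TO_FAKE_NV[op], *operands))
    return translated

print(translate([("v_add_f32", "v0", "v1", "v2")]))  # [('FADD', 'v0', 'v1', 'v2')]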
 
I'm not worried about ARM, that's easy, but going from a modified RDNA architecture with a modified instruction library to a completely different Nvidia one is a very different matter.

They were locally SW-emulating 360 systems on Jaguar cores; they can do it, have faith

https://www.gamesradar.com/how-xbox-one-x-emulation-technology-is-helping-to-resurrect-the-last-two-generations-of-xbox-classics-in-4k/#:~:text=“In order to make this,the Xbox icon's YouTube show.
 
But is 1000 Hz practically better than 800 Hz?
Sure, 60-120 or 120-240, yeah, those are significant, but 240-480, or 480 to 960? At what point is the gain, while still measurable, essentially irrelevant?
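Just to put numbers on it, the absolute frame-time savings shrink fast with each doubling (quick arithmetic):

for hz in (60, 120, 240, 480, 960):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 60->120 saves ~8.3 ms per frame; 240->480 saves ~2.1 ms; 480->960 saves only ~1 ms,
# which is why the subjective gain flattens out even though it's still measurable.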
From his non-NDA comments, it sounds like it is beneficial up to 1000 Hz and beyond, per the test groups. I can't answer personally for obvious reasons :).
 
Well, we can dream, but I don't see AMD giving up that market without a fight, because if they did lose the consoles, they might as well give up on consumer gaming cards: Nvidia would be guaranteed to use every proprietary software trick in their utility belt - DLSS, Tensor Cores, PhysX, accelerated CUDA libraries out the wazoo - and Intel and AMD might as well just pack up and go home.

I don't think I like that dream.
 
Well, we can dream, but I don't see AMD giving up that market without a fight, because if they did lose the consoles, they might as well give up on consumer gaming cards: Nvidia would be guaranteed to use every proprietary software trick in their utility belt - DLSS, Tensor Cores, PhysX, accelerated CUDA libraries out the wazoo - and Intel and AMD might as well just pack up and go home.

I don't think I like that dream.

AMD should get its shit together like I said, instead of relying on "we also exist" :/
 
AMD should get its shit together like I said, instead of relying on "we also exist" :/
And unless Nvidia plays major hardball, they will continue to. Do I believe Nvidia could choose to drop the price of the 4060 Ti to $299 and still make a profit? Yes. Could they put together a badass console SoC and sell it to Microsoft and Sony for peanuts? You bet.
Do they want to bother with that sort of margin game on high-volume, low-profit parts while they already can't keep pace with their high-volume, stupidly-high-margin parts? I doubt it.
If there suddenly existed an excess of idle fab time where they could start that venture, and it wouldn't be better served by upping the volume on a different lineup, sure, they could make a very bad day for AMD, and Jensen would ruin his family reunions for generations to come with a smile; hell, I could see him sending Lisa a custom embroidered leather jacket as a consolation prize.
But the reality is Nvidia is too busy to care about it right now. If AMD and Intel manage to pull Nvidia's desktop lead down closer to 70%, then they may reprioritize, but to do that AMD and Intel would have to compete better on features, software, and performance, which means they would have stepped way the fuck up, and then it's a moot point.

Consoles aren't a space Nvidia wants to bother with right now; hell, it wasn't a month ago that everybody was all "Nvidia is going to abandon the gaming market because AI" blah blah blah.

Do we want AMD to do better? F-yeah! Do we want Intel to bring the heat like a folded chair from the top rope? For sure. AMD isn't doing badly, they just aren't #1, and coming second in a 2-player game isn't really losing, unless you've lost to the point where you have to go home, and they are far from that spot.

Though if Intel passes them for performance and features, then oh boy do they have a day of reckoning ahead of them, because Intel will have fab space to spare and a relatively good negotiating position with Intel Foundry Services. Intel does volume like nobody else, and they could bring console SoC pricing down to rock bottom for kicks.
 
Consoles aren't a space Nvidia wants to bother with right now

This is Nvidia remember


If I told you to put $5 on the most 'cut-throat, wants-it motherfucker' out of Intel, AMD, or Nvidia, like 'em or not, who is that? Who are you putting your money on?


Though if Intel passes them for performance and features, then oh boy do they have a day of reckoning ahead of them

But also, this and that are two of how many things they need to improve yesterday, as pointed out - they need to get better at both AI and SW to stay competitive in the DC. Again, do you (hypothetically) want them to rely on just their CPUs? I hope not; that sounds too familiar to someone else resting on 4 cores for so long. And again, the boogeyman Nvidia is somewhere here in the DC with us too.



How much money AMD makes from one affects the other; these things aren't siloed off, and gamers (again, not you) miss the forest for the trees on this a lot. Money supplies, fab bottlenecks at TSMC, increasing competition apparently from Intel, and a lack of improvement in areas people are too happy and quick to dismiss keep adding up more and more - and compound more and more as time goes on

People just wanna act like "things are fine, we got the consoles, 7xxx should have been $50 cheaper, we'll get them next time, relax" - look out the window, man
 
Too lazy to read through the thread, but they'd better fix the crappy DLSS implementations in several games before acting like it is a win-win situation. I guess it's lucky for Nvidia that probably half the people out there are blind as fuck and don't even see all the issues in those games.
 
I think a lot of people are missing what they are saying, or just didn't watch the video: the era of massive transistor scaling is over. We aren't just going to keep getting new nodes that allow us to stuff a ton more transistors onto a wafer and hit higher clocks while we do it. This has been true for a while now, and the scaling is only going to get worse. Each incremental improvement in lithography is going to cost more and won't be as big a jump. Yet we still aren't at photorealistic graphics in realtime. So what do we do?

Well one option is just to say "Ok, graphics are basically as good as they are going to get, any improvement from now on will be pretty incremental." The other is to get smarter, to figure out more clever ways to cut down on the amount of work that has to be done to give an image that looks better, and THAT'S what things like DLSS and FSR are all about. Have the GPU do less work to render a given image.
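To put rough numbers on "less work": the commonly cited internal render scales for the upscaler quality modes (treat these as ballpark figures, they vary by implementation and mode) mean the GPU is shading a fraction of the native pixels, e.g. at 4K:

native = 3840 * 2160  # ~8.3 million pixels
for mode, scale in [("Quality", 0.67), ("Balanced", 0.58), ("Performance", 0.50)]:
    rendered = int(3840 * scale) * int(2160 * scale)
    print(f"{mode}: ~{rendered / 1e6:.1f} MP rendered, {100 * rendered / native:.0f}% of native")
# Roughly: Quality ~45%, Balanced ~34%, Performance ~25% of the native pixel count,
# with the reconstruction filling in the rest from motion vectors and frame history.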

This is also, by the way, not new. There are TONS of hacks in graphics. A good example is bump mapping/displacement mapping. If you are doing things "properly" you render an extremely high-polygon model where all the details, even the small ones, are actual geometry. Ya well, our shit can't handle that, so what we do is have the artist make a high-poly (or NURBS, usually) model, scale that down to a level that is more reasonable, and calculate a map of the fine details to apply on it. Doesn't look as good, but it looks pretty damn good, and we can render that in realtime.
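For anyone curious what "calculate a map of the fine details" can look like in practice, here's a bare-bones numpy sketch that turns a height (bump) map into a tangent-space normal map via central differences; real bakers ray-cast a low-poly cage against the high-poly mesh, but the idea is the same - store the detail as a texture instead of geometry:

import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Approximate a tangent-space normal map from a 2D height field."""
    # Central differences give the surface slope along x and y.
    dx = (np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)) * 0.5 * strength
    dy = (np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)) * 0.5 * strength
    # The normal leans against the slope; z points out of the surface.
    n = np.stack([-dx, -dy, np.ones_like(height)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    return n * 0.5 + 0.5  # remap from [-1, 1] to [0, 1] like a normal-map texture

bumps = np.random.rand(256, 256)            # stand-in for baked high-poly detail
normal_map = height_to_normal_map(bumps)    # shape (256, 256, 3), ready to sample in a shader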

Things like neural upscaling and so on are just more ways to do more with less, which is what we have to do to keep pushing boundaries. Telling GPU companies to "just make more powerful GPUs" isn't reasonable. Price aside, high end GPUs are already extremely massive and power hungry. They can't just make them 10x the size and 10x the power consumption, because few people would be able to use them even if they could afford them.

While you can argue they should cost less, you really can't shit on them for not making them more performant.
 
I don't want to use something that can add ghosting, flickering or other issues in some games just to have playable framerates. I use it in most games since it is usually fine but good grief there are lots of games with issues that will likely never be fixed.
 
I don't want to use something that can add ghosting, flickering or other issues in some games just to have playable framerates. I use it in most games since it is usually fine but good grief there are lots of games with issues that will likely never be fixed.
Nah, just keep trotting out the dead horse that is CP2077 and pretend like everything else works just as good.
 
Shifting focus to relying on DLSS for increased performance does nothing for me.

Out of the various games that collectively comprise 99% of the time I spend playing games, not a single one of them supports DLSS, not even older forms of DLSS. I think that the only game I even own that has DLSS support is Diablo 4, which doesn't even need it, and I rarely play it.

So... if they are going to go this route, then they need to seriously commit to helping bring DLSS support to more games, including older games that are still popular and can still benefit from it.

PhysX, Ray Tracing, DLSS... they all mean nothing when your games don't use them. I don't care to hear any more marketing at this point. Where's the Beef?
 
This is Nvidia remember
Exactly, so Jensen is going to walk into an investor call and be all like, "Hey, I gave up the opportunity to make you a couple dozen billion to make you a few hundred million instead. And now we might be facing some sort of regulatory backlash for being the provider of the world's AI, and now the sole source for all the gaming consoles too."
Because Nvidia is under the regulatory eye already; their AI "near monopoly" has a lot of spotlights on it. Add a gaming one in there too and they may not like the attention.
If I told you put $5 on the most 'cut-throat wants it motherfucker' out of Intel AMD or Nvidia like 'em or not who is that? Who are you putting your money on?
I'd put that on Intel, because they are prepared to bankroll red to make this happen. Their future depends on getting their GPUs good, because the x86 CPU future has an end date and they need to be prepared for it; they are losing their juiciest contracts to AMD, Nvidia, and ARM, and they are angry and backed into a bit of a corner. They also know a few things about fighting a court battle for playing dirty, and I can see them putting that past experience to work.
But also, this and that are two of how many things right now they need to improve yesterday as pointed out - they need to get better with both AI and SW each to stay competitive in DC - again you (hypothetical) want them to rely on just their CPUs? I hope not sounds too familiar to someone else resting on 4 core for so long. And again the boogeyman Nvidia is somewhere here in the DC with us too like this
I want them to do better, but they aren't by any measure doing badly. An AMD relying only on x86 CPU sales is an entity waiting to die. Most of AMD's best is coming from TSMC's lead in fabrication, a lead that is soon to come to an end.
AMD needs to form some partnerships, because doing all this solo is not going to get them anywhere, not fast at least. They are a very long way behind in the AI space, both hardware and software, and they don't have the luxury of taking their time to bring themselves up to a competitive level. And if they blow all their resources trying to catch Nvidia, they may let something else pass them by in the process. If anything, this is the time they should be doubling down on their consumer GPU efforts, because Nvidia honestly phoned this generation in; this should have been an easy market grab for them.
How much money AMD makes from one affects the other; these things aren't siloed off, and gamers (again, not you) miss the forest for the trees on this a lot. Money supplies, fab bottlenecks at TSMC, increasing competition apparently from Intel, and a lack of improvement in areas people are too happy and quick to dismiss keep adding up more and more - and compound more and more as time goes on

People just wanna act like "things are fine, we got the consoles, 7xxx should have been $50 cheaper, we'll get them next time, relax" - look out the window, man
To me, AMD feels like they have a bit of ADD brain going on: they have managed to get proficient in 30 things, with enough silicon to actually supply 10, but only enough time at TSMC to do 3 of them really well.
They are 100% reliant on TSMC; TSMC could declare that they wanted to be the Taiwanese Intel and buy AMD, and absolutely nothing would change other than the logo. It would still be red, but maybe with a little yellow mixed in to fill the gaps.
The consoles are AMD's lifeline, and as long as they have them they will be fine, but that doesn't mean they can sit on their ass, because having a lifeline is good, relying on it is not.
I said earlier that Nvidia would have to put up one hell of an offer to Microsoft or Sony to get themselves into a console, and I stand by that, but if that something is double AMD's proposed GPU performance and a CPU to match at the same price, that would do it; emulation is a bitch, but with enough power you can brute force it.
I don't think Nvidia would spend the resources to court Microsoft or Sony actively at this stage, but if either of them asked Nvidia for a treatment, I don't see Nvidia turning them away either.
But if suddenly Nvidia was the world's provider of AI tech, as well as what would quickly become 90+% of the gaming tech too, oh yeah, they would land themselves in some legal hot water, not because they would have done anything illegal, but because there is too much money there for nobody to be taking their shot.
Maybe in this timeline, we don't get the Bell riots, we get the Jensen ones.
 
Yeah, I'd say for sure that Nvidia's thinking is future-facing, and what they're really getting at here is 8K+ resolutions.
There's just no feasible way to make it easy to run a year 2024+ modern game at 8K or up without DLSS type technology.

I also wouldn't be surprised if upscaling from 4K to 8K or 16K is their true goal. I highly doubt that they're obsessed with short-term results. They are going long, and they want the next decade, the RTX 5000 series and RTX 6000 series, to be 16K compatible.
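The raw pixel counts back that up - every step up is roughly 4x the shading work before you even touch RT:

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160),
                     "8K": (7680, 4320), "16K": (15360, 8640)}.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")
# 1080p 2.1, 4K 8.3, 8K 33.2, 16K 132.7 - brute-forcing 16K natively is a non-starter,
# so upscaling from an internal 4K-ish render is the only plausible path.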
There's an infinite number of more important improvements we need than 8K/16K capability.
 
I don't want to use something that can add ghosting, flickering or other issues in some games just to have playable framerates. I use it in most games since it is usually fine but good grief there are lots of games with issues that will likely never be fixed.
That's true for a lot of shit games, though; it's currently a new tool, and soon it won't be new, it will be required.
Developers either aim for something that runs and looks good on hardware around an RTX 2070, or they push past that and add the upscalers, frame generators, and whatever else comes along. Because the hardware that's out there is the hardware running the stuff they want to sell, they either get better at implementing it or watch their stuff go to the Steam $5 bin.
 
WAIT GUYS

What if AI figures out how to do SLI?
They did, but you have to move like 900 GBps to make it work, and you have to code for it at a low level.
I'd be happier if they started making secondary cards.
Raster on one card, tensor on another, fuck let's bring back sound cards while we're at it.

Let us mix and match to get what we want using components built to a base standard.
 
Exactly, so Jensen is going to walk into an investor call and be all like, "Hey, I gave up the opportunity to make you a couple dozen billion to make you a few hundred million instead. And now we might be facing some sort of regulatory backlash for being the provider of the world's AI, and now the sole source for all the gaming consoles too."
Because Nvidia is under the regulatory eye already; their AI "near monopoly" has a lot of spotlights on it. Add a gaming one in there too and they may not like the attention.


If you told me Jensen would go after consoles just to eliminate a competitor and dig that knife in deeper, I'd 100% believe it. Beyond pointing out how that could possibly play out in one of many ways, that first part is really what makes it believable, and people shouldn't so easily dismiss the idea, is all.

One Nvidia is now worth 3 Intels and 3 AMDs combined; they could 'waste' the money on such an endeavor.

Also, natural monopolies aren't illegal. If everyone uses your product(s) for everything just because you have the best, and it's what they all agree they should buy (whether manufacturers or consumers), that's just the way it can go sometimes - as long as nothing illegal took place for you to gain or maintain that monopoly.

That's why just 'not being Nvidia' isn't the shield many think it is, and it won't hold up in the face of "but Nvidia's is a lot better" for long.
 
How many sales did Sony lose on the PS5 because AMD could not supply enough chips? Microsoft? I don't see AMD as a sure provider for later-gen consoles.

Intel's fab success may dictate more of the success for Nvidia, Sony, Microsoft, and Intel. What if Apple starts using Intel fabs? Anyway, those that can supply are the ones that get the business.

DLSS can be good, crap, and everything in between, depending on the application, or even on parts of an application. I just don't see DLSS as the long-term solution; it is only an Nvidia solution for now. Very rarely have I found DLSS the preferred option to use in games overall. It can be useful, and it's just another option to use if it gives a better experience than the other options.

FSR, basically avoid like the plague. As for frame generation, I won't know until I've tried it. What is also cool about frame generation, to my mind, is that you can go native, no upscaling, and maybe achieve good enough frame rates for smooth motion. Except everything I've seen seems to have artifacting.

As for 1000 Hz displays, lol, we'll just have to wait and see. 8K for me is almost useless and not significantly better than 4K at the screen sizes and distances I use. Yeah, maybe if I had a wall-size display, but I don't, and I don't care to have one at this time.

Time to play more games, relax, have some fun.
 
I see DLSS as an option to push graphics beyond what is possible by just using native resolution rendering.
Given the choice of worse graphics with no DLSS or better graphics with some upscaling that's for the most part undetectable by the naked eye, who would choose the former?

And those who have raised their hand defiantly: Good news, you already have that, turn off DLSS and lower graphics settings, turn off RT, voila.
 
I see DLSS as an option to push graphics beyond what is possible by just using native resolution rendering.
Given the choice of worse graphics with no DLSS or better graphics with some upscaling that's for the most part undetectable by the naked eye, who would choose the former?

And those who have raised their hand defiantly: Good news, you already have that, turn off DLSS and lower graphics settings, turn off RT, voila.
Turning on DLSS is lowering graphics 😁 (more times than not)

Other options can include lower resolution but not scaled with max settings. Some games I preferred the 21:9 ratio over the 4K 16:9. With better fps to boot!

AMD's Radeon Boost is also a neat trick - it drops resolution during quick movement and turns for lower latency - and is not as noticeable as having DLSS on all the time.

Other options are dynamic scaling.

When performance is not where you want it, there are a lot of viable options; DLSS is just another option for me, while FSR so far has not been that viable.
 
Turning on DLSS is lowering graphics 😁 (more times than not)
No, turning on DLSS is lowering resolution and then upscaling it back to native with AI that was trained to do just that.
Other options can include lower resolution but not scaled with max settings.
Meaning you just let your monitor scale it up for you, probably without any filtering.

I've seen many bad takes in my time, but saying DLSS is worse than simply playing at lower resolution is at least Top 3 material.
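If anyone wants to convince themselves, a crude Pillow experiment (nothing like DLSS, just single-frame filters, and the filename is a placeholder) already shows the gap between a filtered upscale and the raw "monitor stretch" - and DLSS adds motion vectors and frame history on top of that:

from PIL import Image

src = Image.open("screenshot_native.png")  # placeholder: any native-res screenshot
low = src.resize((src.width // 2, src.height // 2), Image.BILINEAR)  # pretend internal render

stretched = low.resize(src.size, Image.NEAREST)   # dumb scaling: blocky edges, shimmering
filtered = low.resize(src.size, Image.LANCZOS)    # decent spatial filter, still no temporal data

stretched.save("stretched.png")
filtered.save("filtered.png")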
 
Am I the only one who likes native resolution with 0 post effects? The first thing I do is set it to native full screen, 120 Hz cap, turn on vsync, then turn off anything blur-related and those weird effects that just muddy the image or put a filter over it.
 
DLSS will always be a compromise.

Not going to lie, Nvidia's DLAA filter is pretty awesome, and I would like to see more games offer it at native resolution, as it is by far the best AA I have used in terms of the balance between performance, effectiveness at removing aliasing, and sharpness.

Scaling the resolution up - however - is always a compromise. I don't for a second buy their claim that DLSS looks better than native, because - well - I have eyes. I have seen what it looks like in game.

This can go one of two ways though:

A) Developers can go the way of Bethesda and make games like Starfield which look mediocre, but still demand outrageous system resources, and just put scaling on it as a band-aid,

--OR--

B) They could actually make games that have higher polygon counts, more RT, and really look quite awesome, but where at the time of launch the hardware to run them natively doesn't exist yet. It would be sort of as if, when Crysis launched, you could run the outrageously heavy game with scaling on lesser hardware. It wouldn't be as good as native, but you would understand why, because the game really looks great, and you would be willing to put up with it.


Unfortunately, doing a shit job and using scaling as a band-aid seems more up the alley of most developers, so that is probably what we'll get.
DLAA is native resolution. Instead of rendering the game at a lower resolution and then upscaling it like DLSS, DLAA renders the game at native resolution and uses AI to reconstruct it at native resolution to clean up the image.
Am I the only one who likes native resolution with 0 post effects? The first thing I do is set it to native full screen, 120 Hz cap, turn on vsync, then turn off anything blur-related and those weird effects that just muddy the image or put a filter over it.
There is no such thing as a game with no "post effects" after around 2002.
 
They did, but you have to move like 900 GBps to make it work, and you have to code for it at a low level.
I'd be happier if they started making secondary cards.
Raster on one card, tensor on another, fuck let's bring back sound cards while we're at it.

Let us mix and match to get what we want using components built to a base standard.

There's obviously way more to it than I would like to think, but I'm going to blame Nvidia for talking out of both sides of their face regarding these technologies.
Frame generation (to smooth-brained intellectuals like myself) seems like repurposed SLI tech.
 
In other words, they've given up on improving actual performance.
In other words, you'll get less VRAM and like it.
Can we really blame them when Devs keep releasing games with PS4 graphics and worse performance over and over?

I have no doubt that they will continue to lead the pack in performance regardless of DLSS on or off. But I can't blame them for pushing DLSS so heavily when devs keep releasing games with dog-shit performance and last-gen graphics. We have people touting BS claims like the 7900 XTX keeping pace with and even outperforming the 4090, because a lot of these games seem to be hampering performance on the competitor's HW. It's fucking wild, man.
The problem with your claims is that you aren't a programmer, and you don't have access to their source code. Your claims are pulled from the ether.
Guilty until proven innocent, right? We don't need proof. We have rumors and tweets.
Jedi Survivor and Starfield both got DLSS. You would think that's enough to shut people up about AMD paying devs to not implement DLSS. Apparently these people are playing 3D chess because their thought process is beyond comprehension. It just works... out that AMD is still paying devs not to include DLSS.
 
Jedi Survivor and Starfield both got DLSS. You would think that's enough to shut people up about AMD paying devs to not implement DLSS. Apparently these people are playing 3D chess because their thought process is beyond comprehension. It just works... out that AMD is still paying devs not to include DLSS.
Jedi Survivor got DLSS 5 months after the game launched, and Starfield doesn't have it yet.
 
Bring on all the DLSS 3.5+ technology... I love fully maxing out games at 4K with all the eye candy. The article has a lot of good points in favor of their focus, and I tend to agree with most of them.

RT + DLSS + FG is going to be the future. The point that stuck out to me is rasterization is a huge hack too and just as fake. At least with Path Tracing and AI regeneration you are getting an even more realistic image. I have been using DLSS since it came out, and it has improved A LOT. It still has issues every so often depending on the game, but overall, at 4K, my eyes tend to think DLSS looks better than native with TAA. If you have the horsepower to use DLAA, it looks even better.
 
Jedi Survivor got DLSS 5 months after the game launched, and Starfield doesn't have it yet.
But Jedi did get it, and Starfield will get it, right? You can't honestly expect every game to have DLSS day 1. It's clear that DLSS isn't as easy to implement properly as some modder made it look. To get DLSS working properly, the devs need to make sure the game is stable and there are no bugs or artifacts. Assassin's Creed Mirage will only have XeSS support, so do you think Intel paid devs not to include FSR and DLSS? Game devs have only so much time to work on games, so it'll likely just be one upscaler for now. I'm sure Assassin's Creed Mirage will get DLSS at some point, with FSR being a maybe, since XeSS also works on AMD hardware as well as Nvidia's.
 
The point that stuck out to me is rasterization is a huge hack too and just as fake.

Made that point here on these forums a while ago too, with regards to mipmaps, anti-aliasing, and normal maps, when everyone was screaming "no, DLSS is fake, and ray tracing is unimportant because of the cost for so little visual gain; only raster matters and ever should and ever will, and only cheaters would try anything else."

The future comes for believers and unbelievers all the same :)
 
That's true for a lot of shit games, though; it's currently a new tool, and soon it won't be new, it will be required.
Developers either aim for something that runs and looks good on hardware around an RTX 2070, or they push past that and add the upscalers, frame generators, and whatever else comes along. Because the hardware that's out there is the hardware running the stuff they want to sell, they either get better at implementing it or watch their stuff go to the Steam $5 bin.
They can't get it right even in some of the biggest games released. The Dead Space remake has ghosting, and DLSS and DLAA are both garbage in The Last of Us Part 1. And when they added it to Shadow of the Tomb Raider, it introduced lots of issues, such as breaking some water animations and causing flicker on some of her outfits. Those are ridiculous things that shouldn't happen in any game, and yet none of it was ever fixed.
 
If AMD wants to win the AI market on the consumer side, all they need to do is release cards with more VRAM. That's it. 32 GB, 48 GB, even 64 GB or more, and immediately raw performance would take second place to the ability to actually load in huge data models.
But don't they have workstation cards they want to sell with high VRAM like that? They probably don't want desktop cards 'competing' in that market. IMHO, what they could do is stop the 8 GB, 16 GB, 20 GB & 24 GB choices and just limit it to 20 GB and 24 GB cards across all ranges - that way they are 'cheap' ML/AI alternatives to Nvidia for the 'budget' consumer who wants to use a GPU for gaming, productivity, and AI. Supposedly, AMD did make some advancements in the AI field, but whether these assertions can be proven true and accurate in the real world remains to be seen.
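For a sense of scale on the VRAM point (napkin math, weights only, ignoring KV cache and framework overhead):

def weight_gb(params_billions, bits_per_weight):
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9  # GB for the weights alone

for params in (7, 13, 70):
    for bits in (16, 4):
        print(f"{params}B params @ {bits}-bit: ~{weight_gb(params, bits):.1f} GB")
# 7B @ 16-bit ~14 GB, 70B @ 4-bit ~35 GB: a 32-48 GB consumer card genuinely changes
# which models fit locally, independent of raw compute.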
 
I'm sorry, but this is just depressing. Nearing the end of a hobby for me. What the industry is telling me is we're an afterthought now while this tech has better things to do at higher margins. It was a fun 25 years or so.
 
Supposedly, AMD did make some advancements in the AI field, but whether these assertions can be proven true and accurate in the real world remains to be seen.

Yeah, besides waiting to see if any of the claims and real-world impact of the MI300 hold true, they're also having issues getting it out the door after talking it up, à la FSR3 frame generation, and then the newly announced problems at TSMC, caused by TSMC, could be yet another roadblock here

All while people are still using and clamoring for any and all Nvidia A- or H-class AI GPUs they can get their hands on (plus even consumer GPUs down to the 4070 non-Ti are finding use in mom-and-pop-shop AI setups - not even talking about guy-in-a-garage level here, just slightly bigger - as you can see in the new MLID)

Like I said, there are like 5 perfect storms going on simultaneously outside if you look out the window
 
Jedi Survivor got DLSS 5 months after the game launched, and Starfield doesn't have it yet.

From what I've read, the DLSS implementation in Jedi Survivor was somehow horribly botched by the developer, as well. To the point where the mod might be better? lol
They can't get it right even in some of the biggest games released. The Dead Space remake has ghosting, and DLSS and DLAA are both garbage in The Last of Us Part 1. And when they added it to Shadow of the Tomb Raider, it introduced lots of issues, such as breaking some water animations and causing flicker on some of her outfits. Those are ridiculous things that shouldn't happen in any game, and yet none of it was ever fixed.

Crappy implementations are always going to happen for any tech. It's kind of funny you mention that, because I'm using FSR in Starfield. After reading up on some of the issues with FSR vs the DLSS mod in that game, I can't unsee the issues FSR has. Some of the flickering ones are especially bad. The fact is, if a modder can implement it shortly after release, developers have very little excuse, outside of sheer incompetence, not to do so in other games.

Now, if you want to see something amazing, try playing No Man's Sky. First with native resolution and some standard antialiasing implementation. Then turn on DLAA in it. The entire game will look like it has basically jumped forward a few game and GPU generations.

Done wrong, anything is bad. Done right, DLAA and DLSS, pound for pound, easily kick FSR and all of the other competing technologies to the curb. In some games, DLSS looks better than native, and DLAA beats competing AA options while being less expensive. In Starfield, the DLSS mod has advantages over the native FSR implementation... which AMD helped bake in themselves. That's pretty bad.

And that's a problem. Lakados has a point in saying that we're not quite at a monopoly, because AMD is keeping a grip on the console space and still somewhat keeping up in the ray tracing space (though they're a gen behind; I guess in hindsight a gen isn't quite terrible considering where they were at last gen). And Nvidia is apparently harder for devs to work with. So, they have time... but they really need to somehow start catching up. Otherwise, Nvidia might start offering low-end cards that have enough AI crap baked in that they might as well be high-end cards for all anyone in the casual (high-volume) space actually cares, and I am sure console developers will notice that, and then game developers will, and then it cascades from there. We don't need a monopoly.
 