Nvidia would like to correct the record on VRAM it seems lol

And in came the AMD fanboys....

I'll curse Nvidia the day my 4070Ti actually runs into performance issues due to VRAM limitations, but chances are that won't happen for at least another two years, when graphical progression means my GPU just won't be able to do Ultra settings at 1440p anymore. Even then I'll happily drop them down to High, which is still FAR superior to anything a console has to offer, and if Frame Gen is available, I'll just enjoy the nice, smooth experience it provides.

I'll be interested to see how AMD's Frame Gen knock-off holds up in a couple of years, since by then, just as DLSS did, Frame Gen will have improved on the Nvidia side.

Oh, and to the people putting it on Nvidia when it comes to these broken-ass ports coming out: no. Neither Nvidia nor AMD should give these game publishers any more reason to release broken products. Brute-forcing isn't a viable solution; the issue will only compound.

Also, to those same people: if publishers care about the revenue generated by PC ports, they should release games that actually work. Nvidia owns the majority of the PC graphics market, and the majority of those users are on GPUs with 8GB or less, so if publishers want the market penetration and the sales numbers, they need to work with the hardware manufacturers, not against them. Anyone who honestly expects hardware manufacturers to cater to game developers obviously doesn't see how that would end up hurting the PC gaming industry, because shit would get expensive real quick. If you think the 40-series is vastly overpriced, imagine Nvidia and AMD having to release GPUs with a 24GB baseline and 48GB at the top end. Pair that with TSMC's ever-rising costs and the 40-series would look like a discount.
 
Intel can put 16GB there because they want people to buy their card as a prosumer workstation card. Intel doesn't need to deal with product segmentation the way AMD and Nvidia do, so they can throw everything at the wall and see who buys it. They want adoption from users, consumers, and enterprises, so they're trying to offer a low-to-mid-range option for each, something both AMD and Nvidia have failed to do. That's their foot in the door.
Yes, that's called competition. This is what that looks like. Doing stuff your competitors won't do. Makes sense.
The developers are busting their asses; "laziness" isn't the right word. Upper management is pinching them: they can either pay an artist to create the asset, or pay an artist to compress and tweak an asset so it works properly at the desired resolutions while still looking good, but they can't do both, because accounting already only approved $1,200 for what should have been $3,000.
This has always been the case.
If games were looking better at 1080p or 1440p with more assets and better quality textures that would be one thing, but if you look at the textures, and the assets, you will see the opposite
Nope. I know it's a nVidia/Digital Foundry talking point that all current games look like trash, but no. They don't. Atomic Heart maxed out looks stunning for instance and no it does not look worse than Doom Eternal.
, they have gotten worse and they are leveraging FSR and DLSS to cover the gap.
FSR and DLSS more specifically came about because of the large amount of data required to do ray tracing, and to deal with increased texture sizes. It's not a conspiracy. More textures are required in order to deliver realism. It's how you go from repeated textures where everything looks the same (aka fake) to unique characteristics in rocks, grass, etc. All of that requires more data. That amount of data is outstripping bandwidth and local storage.
Good art departments are expensive, and making a texture work under heavy compression from 720p to 4K while still looking good takes talent, talent you have to pay for. So a common trend right now is to use a 4K texture, apply a canned compression, and let TAA/FSR/DLSS handle the rescaling, where before there would have been a human working those files for optimal results. The canned approach does not perform nearly as well, but it cuts millions out of the art budget for a AAA title. Between memory efficiency and FPS, or a few million a year off the budget, I'll let you guess which way upper management is going to swing.
Yes, software development will look for the cheaper way out. This isn't a trend; this has ALWAYS been the case. Also, "a human working those files for optimal results" was not because they really wanted to do a good job. It's because games were delivered on disc. That's why compression was a big deal. It's been a big deal since the PS1 and the Sega Saturn.

With everything being downloadable, game sizes have gone up, but you are receiving something for it.
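To put rough numbers on the compression point above, here's a quick back-of-the-envelope sketch (the formats, sizes, and mip overhead are my own illustrative assumptions, not anything from the posts): a single 4K texture costs wildly different amounts of VRAM depending on whether it ships uncompressed or block-compressed.

# Rough VRAM cost of one 4096x4096 texture, with a full mip chain (~1/3 overhead).
# Assumptions: RGBA8 = 4 bytes/texel, BC7 = 1 byte/texel, BC1 = 0.5 bytes/texel.
def texture_mib(width, height, bytes_per_texel, with_mips=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if with_mips else base  # mip chain: 1 + 1/4 + 1/16 + ... ~= 4/3
    return total / (1024 ** 2)

for name, bpt in [("RGBA8 uncompressed", 4), ("BC7", 1), ("BC1", 0.5)]:
    print(f"4K texture, {name}: {texture_mib(4096, 4096, bpt):.1f} MiB")
# -> ~85.3 MiB, ~21.3 MiB, ~10.7 MiB respectively

Multiply that by thousands of unique textures and it's easy to see both why careful compression work matters and why "just ship the 4K source asset and let the upscaler sort it out" eats an 8GB card alive.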
 
It didn't make my eyes glaze over. The TL;DR is "faster and more efficient: with more L2, data moves through fast enough that we don't need as much bus width, and bandwidth is effectively 2x as fast (or something)."

But like I said above, I just found it funny they even felt the need to address it; didn't expect them to ¯\_(ツ)_/¯
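For what it's worth, the arithmetic behind that TL;DR is easy to sketch. Assuming (my assumption, not Nvidia's published numbers) that L2 hits are essentially free and only misses touch the GDDR bus, effective bandwidth scales with the miss rate:

# Toy model of "bigger L2 means less raw memory bandwidth is needed".
# Hit rates and GB/s figures below are made-up placeholders, not official specs.
def effective_bandwidth(raw_gddr_gbs, l2_hit_rate):
    miss_rate = 1.0 - l2_hit_rate
    return raw_gddr_gbs / miss_rate  # only misses consume bus bandwidth

print(effective_bandwidth(288, 0.50))  # hypothetical 128-bit card, large L2 -> ~576 GB/s effective
print(effective_bandwidth(448, 0.10))  # hypothetical 256-bit card, small L2 -> ~498 GB/s effective

Of course, none of that extra cache adds a single byte of capacity, which is what this thread is actually complaining about.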

I didn't read it but have a general observation:

Recently AMD & Nvidia have increased cache sizes, thus reducing the need for bus width. This design keeps memory transfers fast while saving die space.

But unfortunately bus width is tied to VRAM capacity. If a game exceeds that capacity, the result is stuttering / poor 1% lows.

From a recent video by TechYesCity:
6GB VRAM for 1080p low
8GB VRAM for 4K low
12GB VRAM for 1080p ultra
?? for 1440p ultra
16GB VRAM for 4K ultra
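To illustrate the "bus width is tied to VRAM capacity" point above: each 32-bit channel gets its own GDDR6 chip (or two in clamshell mode), and current chips come in 1GB and 2GB densities, so the bus width largely dictates which capacities are even possible. A small sketch using those general GDDR6 constraints as assumptions (not anything from the video):

# Possible VRAM capacities for a given bus width, assuming one 32-bit channel per
# GDDR6 chip, 1GB/2GB chip densities, and optional clamshell (two chips per channel).
def vram_options_gb(bus_width_bits):
    channels = bus_width_bits // 32
    options = set()
    for density_gb in (1, 2):
        options.add(channels * density_gb)      # one chip per channel
        options.add(channels * density_gb * 2)  # clamshell: two chips per channel
    return sorted(options)

for bus in (128, 192, 256):
    print(f"{bus}-bit -> {vram_options_gb(bus)} GB")
# 128-bit -> [4, 8, 16]   (hence an 8GB or 16GB 4060Ti)
# 192-bit -> [6, 12, 24]  (hence a 12GB 4070/4070Ti)
# 256-bit -> [8, 16, 32]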
 
Just as Nvidia doesn't care that developers want to shave hundreds of man-hours per week by choosing inferior, cheaper toolsets or outsourced labor. It is a systemic problem.
If we had so much as a second player in the desktop GPU market then this could all have been avoided.
I'm likely playing out the life of my current system with backlog and noping out of this mess. I just don't see a light at the end of the tunnel. Maybe Intel saves us but I'm not sure since they always seem to be a gen behind.
 
And in came the AMD fanboys....
Please stop that shit. This isn't a fanboy war like with Apple users who will defend their products. This is a simple matter of less is less and more is more. You want to defend 8GB of VRAM for what reason? You don't benefit from this.
defending billon dollar company.jpeg

I'll curse Nvidia the day my 4070Ti actually runs into performance issues due to VRAM limitations, but chances are that won't happen for at least another two years, when graphical progression means my GPU just won't be able to do Ultra settings at 1440p anymore. Even then I'll happily drop them down to High, which is still FAR superior to anything a console has to offer, and if Frame Gen is available, I'll just enjoy the nice, smooth experience it provides.
The problem is that new games are starting to have problems at 1080p, let alone 1440p. Games like The Last of Us, Hogwarts Legacy, RE4, etc. aren't even playable with 8GB. It's either too stuttery or too low a frame rate. Your 4070Ti has 12GB of VRAM, so of course it won't have problems for many years; this is mostly a console-generation thing, and now we're seeing games make extremely bad use of PS5 and Xbox Series hardware.
Oh, and to the people putting it on Nvidia when it comes to these broken-ass ports coming out: no. Neither Nvidia nor AMD should give these game publishers any more reason to release broken products. Brute-forcing isn't a viable solution; the issue will only compound.
As much as people want to blame the bad ports, that's just not gonna change. There's no reason to release a GPU in 2023 with 8GB of VRAM, other than greed. This goes for AMD and Nvidia but Nvidia is the worst at this.
Anyone who honestly expects hardware manufacturers to cater to game developers obviously doesn't see how that would end up hurting the PC gaming industry, because shit would get expensive real quick.
When GPU prices went to the moon, that did more harm to the PC gaming industry than anything else, but you don't see anyone caring, because they made great short-term profits and ignored the long-term damage. People still think mid-range cards are $1k, because they aren't tech nerds like us who eat, shit, and sleep tech. They went and got a PS5 and won't look at PC gaming for another 5 years. This is a repeat of the mid-2000s, when the GeForce 7800 GT cost $800 in 2005 money and so everyone went and bought an Xbox 360 or PS3. Remember the "But can it run Crysis?" meme? That's why nobody could run the game: GPU prices were stupid then. What do you think a $400 RTX 4060Ti with 8GB of VRAM is gonna do to the game industry when you can't play games smoothly at 1080p because of the lack of VRAM?

If you think the 40-series is vastly overpriced, imagine Nvidia and AMD having to release GPUs with a 24GB baseline and 48GB at the top end. Pair that with TSMC's ever-rising costs and the 40-series would look like a discount.
Not our problem. The only reason they have odd numbers like 24GB and 48GB is the fuckery with the memory bus. Instead of 256-bit, they have 128-bit and 192-bit, which gets you 6GB, 8GB, 12GB, 16GB, etc. VRAM is cheap; put it on the cards. The bare minimum should be 12GB, with 8GB for the $150 cards.
Intel can put 16GB there because they want people to buy their card as a prosumer workstation card. Intel doesn't need to deal with product segmentation the way AMD and Nvidia do, so they can throw everything at the wall and see who buys it. They want adoption from users, consumers, and enterprises, so they're trying to offer a low-to-mid-range option for each, something both AMD and Nvidia have failed to do. That's their foot in the door.
That's called competition and it's good.
 
Help me understand why gamers are so determined to defend Nvidia on their objectively bad decision when it comes to VRAM capacity. I genuinely want to know.
What I am pointing out is that it is a midrange card. I'm not saying more VRAM is bad. I had a 3090 with 24GB for 2 years, and the most VRAM usage I ever saw was 15GB, on one rare occasion. Usually it was around 12GB or less, and that was with the most demanding game I had at the time, Cyberpunk 2077. Maybe if I had been a miner the rest would have been useful... maybe it will give the card longer legs. But 24GB wasn't really necessary. I'm glad I had the VRAM, and I sure as hell paid for it.

Now consider the 3070: it's a midrange card. Even with only 8GB of VRAM, it can play any game. You might have to drop your settings from "Ultra" to "High". This is expected. People are still using cards with 4GB of VRAM like 980s; they just do not run at Ultra settings. People are probably using even older cards...

Then along comes one shitty console port (and it was a shitty port; those happen all too often), and all of a sudden the game, when played with 8GB, is having issues. Some swear it's the VRAM, while ignoring all evidence to the contrary. Said game has since been patched and the issues addressed (I don't own that game, so how much it has improved, I don't know). How the hell did they pull off that magic if it was the VRAM?! Oh, maybe it wasn't after all.

****

Of course games are evolving and some newer ones will make use of more VRAM. More VRAM will be better. Those changes happen over many years. As cards get older, you need to start turning down the settings in newer, more demanding titles. This is the normal lifecycle of a video card.

Defending the 8GB VRAM choice? Were consumers misled as to the VRAM quantity on the card they were purchasing? No, this is all upfront knowledge. If you are buying a midrange or low-end card, less VRAM than the high-end cards come with is going to be typical. Why was 8GB of VRAM chosen? I expect it was to save money. People seem to forget that the 2xxx-series launch price sticker shock was massive, and with the 3xxx series the prices were lower in response. It's a complete guess, but the VRAM was probably part of making that happen. That's nVidia's business decision; I don't think it needs defending, just understanding. And the card can in fact play everything, so it appears it was a sound business and engineering choice.
If you bought one and are upset that you only got 8GB of VRAM but have played games for 3 years without issues, I don't think it is worth getting upset over. Turn your settings down a notch when you need to; it's still a good card.
 
What I am pointing out is that it is a midrange card. I'm not saying more VRAM is bad. I had a 3090 with 24GB for 2 years, and the most VRAM usage I ever saw was 15GB, on one rare occasion. Usually it was around 12GB or less, and that was with the most demanding game I had at the time, Cyberpunk 2077. Maybe if I had been a miner the rest would have been useful... maybe it will give the card longer legs. But 24GB wasn't really necessary. I'm glad I had the VRAM, and I sure as hell paid for it.

Now consider the 3070: it's a midrange card. Even with only 8GB of VRAM, it can play any game. You might have to drop your settings from "Ultra" to "High". This is expected. People are still using cards with 4GB of VRAM like 980s; they just do not run at Ultra settings. People are probably using even older cards...

Then along comes one shitty console port (and it was a shitty port; those happen all too often), and all of a sudden the game, when played with 8GB, is having issues. Some swear it's the VRAM, while ignoring all evidence to the contrary. Said game has since been patched and the issues addressed (I don't own that game, so how much it has improved, I don't know). How the hell did they pull off that magic if it was the VRAM?! Oh, maybe it wasn't after all.

****

Of course games are evolving and some newer ones will make use of more VRAM. More VRAM will be better. Those changes happen over many years. As cards get older, you need to start turning down the settings in newer, more demanding titles. This is the normal lifecycle of a video card.

Defending the 8GB VRAM choice? Were consumers misled as to the VRAM quantity on the card they were purchasing? No, this is all upfront knowledge. If you are buying a midrange or low-end card, less VRAM than the high-end cards come with is going to be typical. Why was 8GB of VRAM chosen? I expect it was to save money. People seem to forget that the 2xxx-series launch price sticker shock was massive, and with the 3xxx series the prices were lower in response. It's a complete guess, but the VRAM was probably part of making that happen. That's nVidia's business decision; I don't think it needs defending, just understanding. And the card can in fact play everything, so it appears it was a sound business and engineering choice.
If you bought one and are upset that you only got 8GB of VRAM but have played games for 3 years without issues, I don't think it is worth getting upset over. Turn your settings down a notch when you need to; it's still a good card.

The problem is, it isn't "1 shitty console port". I play Caliber from time to time, and I've seen it using up to 10GB of VRAM. I'll HAPPILY play every game I have installed and then some and log my VRAM usage at 1440p ultra. I can assure you, at 1440p anyway (what I play at), 8GB doesn't cut it anymore for a LOT of titles. And I'm sorry, people don't buy a $600+ card to "turn settings down".
 
The problem is, it isn't "1 shitty console port". I play Caliber from time to time, and I've seen it using up to 10GB of VRAM. I'll HAPPILY play every game I have installed and then some and log my VRAM usage at 1440p ultra. I can assure you, at 1440p anyway (what I play at), 8GB doesn't cut it anymore for a LOT of titles. And I'm sorry, people don't buy a $600+ card to "turn settings down".
Diablo IV was using 23.5GB of the 24GB on my 4090 during the server slam. I don't get your point.
 
We have seen multiple situations where graphics cards running the same fidelity settings with similar benchmark performance look different, because one card is loading low-quality textures and has pop-in issues.

You don't see a big performance tank in the benchmark because the game uses low-quality textures when it must and changes draw distances dynamically to handle the lack of VRAM.
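That dynamic fallback is basically a texture-streaming budget: the engine tracks how much VRAM its resident mips are using and quietly drops texture resolution (or pulls in draw distance) when it goes over. A minimal sketch of the idea, not any particular engine's implementation:

# Minimal sketch of a texture-streaming budget heuristic (illustrative only).
def mip_size_mb(base_mb, mip_level):
    return base_mb / (4 ** mip_level)  # each mip level is a quarter of the previous

def fit_to_budget(base_sizes_mb, budget_mb):
    """Pick a mip level per texture so the resident set stays under the VRAM budget."""
    mips = {name: 0 for name in base_sizes_mb}  # start everything at full resolution
    def total():
        return sum(mip_size_mb(base_sizes_mb[n], m) for n, m in mips.items())
    while total() > budget_mb:
        # Demote the texture currently costing the most -> lower quality, pop-in, etc.
        worst = max(mips, key=lambda n: mip_size_mb(base_sizes_mb[n], mips[n]))
        mips[worst] += 1
    return mips

textures = {"terrain": 1024, "characters": 512, "props": 768}  # made-up base sizes in MB
print(fit_to_budget(textures, budget_mb=1500))  # tight budget: mips get dropped
print(fit_to_budget(textures, budget_mb=2500))  # roomier budget: everything stays full-res

The average FPS barely moves either way, which is exactly why this doesn't show up in a bar chart.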
 
Diablo IV was using 23.5GB of the 24GB on my 4090 during the server slam. I don't get your point.
My point is, if a PC-only game from a tiny Russian developer is using over 10GB, then GoodBoy's argument of "shitty console port" is bunk. Your experience with Diablo 4 further reinforces that.
 
What I am pointing out is that it is a midrange card. I'm not saying more VRAM is bad. I had a 3090 with 24GB for 2 years, and the most VRAM usage I ever saw was 15GB, on one rare occasion. Usually it was around 12GB or less, and that was with the most demanding game I had at the time, Cyberpunk 2077. Maybe if I had been a miner the rest would have been useful... maybe it will give the card longer legs. But 24GB wasn't really necessary. I'm glad I had the VRAM, and I sure as hell paid for it.

Now consider the 3070: it's a midrange card. Even with only 8GB of VRAM, it can play any game. You might have to drop your settings from "Ultra" to "High". This is expected. People are still using cards with 4GB of VRAM like 980s; they just do not run at Ultra settings. People are probably using even older cards...

Then along comes one shitty console port (and it was a shitty port; those happen all too often), and all of a sudden the game, when played with 8GB, is having issues. Some swear it's the VRAM, while ignoring all evidence to the contrary. Said game has since been patched and the issues addressed (I don't own that game, so how much it has improved, I don't know). How the hell did they pull off that magic if it was the VRAM?! Oh, maybe it wasn't after all.

****

Of course games are evolving and some newer ones will make use of more VRAM. More VRAM will be better. Those changes happen over many years. As cards get older, you need to start turning down the settings in newer, more demanding titles. This is the normal lifecycle of a video card.

Defending the 8GB VRAM choice? Were consumers misled as to the VRAM quantity on the card they were purchasing? No, this is all upfront knowledge. If you are buying a midrange or low-end card, less VRAM than the high-end cards come with is going to be typical. Why was 8GB of VRAM chosen? I expect it was to save money. People seem to forget that the 2xxx-series launch price sticker shock was massive, and with the 3xxx series the prices were lower in response. It's a complete guess, but the VRAM was probably part of making that happen. That's nVidia's business decision; I don't think it needs defending, just understanding. And the card can in fact play everything, so it appears it was a sound business and engineering choice.
If you bought one and are upset that you only got 8GB of VRAM but have played games for 3 years without issues, I don't think it is worth getting upset over. Turn your settings down a notch when you need to; it's still a good card.

The problem with that justification is that Nvidia is telling you it's a midrange card but charging you a high-end price for it. 12GB at $800 is simply not acceptable. Frankly, putting 16GB on the 4060Ti is an open admission of defeat on Nvidia's part, and they had to produce this article from their marketing department to find a way to gaslight customers into believing that the 12GB design decision made total sense. It never made sense, especially given that AMD has been offering customers more VRAM at comparable price points, and that Nvidia's marketing has been stressing RTX as a must-have feature you should pay for, only to have that feature's performance hobbled by an insufficient VRAM buffer.

Even in the case of the 3070, modders adding 8GB of VRAM to turn it into a 16GB card have shown it leads to a measurable increase in performance in some titles.

If Nvidia wants to use the "it's a midrange card, get over it" marketing BS, then the 4070Ti really shouldn't be more than $600, which was the upper-mid range price two years ago. Perhaps $650 given inflation. Certainly not $800+. That's not what a mid-range card should cost in today's market and, based on sales, most consumers agree.

Again, I don't know why some people seem intent on justifying it for Nvidia. It almost seems like a coping mechanism at this point since the cards they paid $800+ for are already showing their performance is being hobbled THIS SAME YEAR by new titles coming out, forcing reductions in settings that should not be necessary.
 
I play at 1440p and have seen VRAM usage as high as 14GB in Hogwarts Legacy. Without RT.
During one of the D4 betas there was apparently a bug with VRAM leakage (or something with a similar effect), enough to bring my RX 6800 @1440p to a crawl.
 
My point is, if a PC-only game from a tiny Russian developer is using over 10GB, then GoodBoy's argument of "shitty console port" is bunk. Your experience with Diablo 4 further reinforces that.
Could it also be that the game simply loads assets into VRAM whether it needs them immediately or not? The fact that the VRAM is used doesn't necessarily mean that it is needed; it's the whole "load it into RAM now or later" argument.
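Worth noting that the tools people quote numbers from have the same limitation: NVML (what nvidia-smi and most overlays sit on top of) reports how much of the frame buffer is currently allocated, not how much the game would actually suffer without. A quick way to log it yourself, assuming an Nvidia card and the optional pynvml package (my example, not something anyone in the thread used):

# Logs total/used VRAM via NVML. "used" here means allocated frame-buffer memory,
# which is an upper bound on what the game actually needs at that moment.
# Assumes an Nvidia driver and the pynvml package (pip install pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{info.used / 2**30:.1f} GiB allocated of {info.total / 2**30:.1f} GiB")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()

Per-process overlay counters (RivaTuner/Afterburner) carry the same caveat.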
 
The problem is, it isn't "1 shitty console port". I play Caliber from time to time, and I've seen it using up to 10GB of VRAM. I'll HAPPILY play every game I have installed and then some and log my VRAM usage at 1440p ultra. I can assure you, at 1440p anyway (what I play at), 8GB doesn't cut it anymore for a LOT of titles. And I'm sorry, people don't buy a $600+ card to "turn settings down".
Guess what! PC gaming is expensive. If you're expecting a two-year-old mid-range GPU to be able to handle AAA titles, especially ones that are broken at release… then you must be new to it. Graphics get progressively better, and with that improvement they tend to need more powerful hardware. At the time the 3070 was released, 8GB was fine for mid-range Ultra-settings gaming; now it seems like 12GB is probably going to be the new bar if you're looking at Ultra settings for mid-range. That's called progression.

Does it suck for some people? Yes, but when you stick to the mid-range like I have for 21 years you get used to it, and at the end of the day, if you're upset you're having to upgrade more often, you only have yourself to blame for not buying higher end. I personally don't mind upgrading every 2-3 years; for me, upgrading my computer is fun.

The problem with that justification is that Nvidia is telling you it's a midrange card but charging you a high-end price for it. 12GB at $800 is simply not acceptable. Frankly, putting 16GB on the 4060Ti is an open admission of defeat on Nvidia's part, and they had to produce this article from their marketing department to find a way to gaslight customers into believing that the 12GB design decision made total sense. It never made sense, especially given that AMD has been offering customers more VRAM at comparable price points, and that Nvidia's marketing has been stressing RTX as a must-have feature you should pay for, only to have that feature's performance hobbled by an insufficient VRAM buffer.
Nvidia’s going to price based on demand. The 4070Ti outsold the 7900 series by itself, so why should Nvidia charge less? Because people on the internet feel it’s overpriced? Regardless of what the consensus was—it sold, and it being a luxury item means it wasn’t a requirement, it was a choice, and people made that choice to pay $800 for it.

As for AMD cards having more VRAM, it doesn't matter. You think in 1-2 years' time the 6800XT and 6950XT are going to magically outperform the 4070Ti because of VRAM? What about when UE5 becomes more common and RT effects start getting implemented more and more? You think the VRAM alone will be the saving grace? No. Performance will still be king, and properly coded games only allocate what your system has available. By the time things have progressed far enough that even your system doesn't have enough resources to run games, this generation of GPUs will be old.

As for the 12GB of VRAM, it's been perfectly fine with every game I've thrown at it, like Hogwarts Legacy, RE4, Forspoken, Redfall, TW: Warhammer 3, CP2077 with PT on, and Witcher 3 with the update. Any issues I've run into were common ones, like when loading into completely new areas where all-new textures have to be loaded into VRAM, but even then those issues last maybe 1-2 seconds. My usage in those games, per RivaTuner's total and per-process readings, averaged around 10GB.
Even in the case of the 3070, modders adding 8GB of VRAM to turn it into a 16GB card have shown it leads to a measurable increase in performance in some titles.
When the 3070 was released, games in its targeted bracket and settings weren't using 8GB of VRAM unless modded, and anything pushing more wouldn't have been playable anyway since the GPU didn't have the horsepower for it, e.g. CP2077. Of course giving it 16GB of VRAM would give some gains, but eventually you'll hit a wall; the performance will fall off once you start using graphical features that push VRAM usage way up.

If Nvidia wants to use the "it's a midrange card, get over it" marketing BS, then the 4070Ti really shouldn't be more than $600, which was the upper-mid range price two years ago. Perhaps $650 given inflation. Certainly not $800+. That's not what a mid-range card should cost in today's market and, based on sales, most consumers agree.
Define "most." It seems that the 4070/4070Ti alone are outselling all of AMD's offerings. Yes, not everyone and their mother is going to upgrade, because most don't consider PC hardware a hobby, so to them upgrading every five to six years is normal, not every two. Especially when you have so many people rocking 30-series cards; a lot of them don't see the value, since what they have is good enough for what they use it for.
Again, I don't know why some people seem intent on justifying it for Nvidia. It almost seems like a coping mechanism at this point since the cards they paid $800+ for are already showing their performance is being hobbled THIS SAME YEAR by new titles coming out, forcing reductions in settings that should not be necessary.
What games are hobbling the 4070Ti? Jedi Survivor at 4K ultra settings? TLOU at 4K ultra? Again, this card wasn't designed with 4K gaming in mind; Nvidia has even said that. As I said above, no game I've thrown at this GPU has "hobbled" it.

I don't feel the need to justify Nvidia's pricing, and I don't think anyone has. I think it stems from people's expectations of what should be, instead of actual reality. The 4070Ti, for instance, outclasses a 3090, a card that launched at $1,500, offering massive gains over its predecessor the 3070Ti, but because it's viewed as a high-mid-range card, people are stuck in the mindset that the price should be the same as its predecessor's, despite manufacturing prices going up over the last gen.
 
The problem is, it isn't "1 shitty console port". I play Caliber from time to time, and I've seen it using up to 10GB of VRAM. I'll HAPPILY play every game I have installed and then some and log my VRAM usage at 1440p ultra.
That's a good experiment, but many games will "allocate" VRAM without necessarily using it. I don't think Afterburner shows actual use; it shows what's allocated. It'd be cool if someone figured out how to measure VRAM usage with accuracy.
I can assure you, at 1440p anyway (what I play at), 8GB doesn't cut it anymore for a LOT of titles. And I'm sorry, people don't buy a $600+ card to "turn settings down".
I play at 1440p as well. What $600 card? The 3070 was $500. It does suck that midrange card prices have crept upwards, but they all have.
The problem with that justification is that Nvidia is telling you it's a midrange card but charging you a high-end price for it. 12GB at $800 is simply not acceptable.
I can agree with the sentiment. But those cards sell at those prices, so it apparently is acceptable. But it is a midrange card regardless of the price.
Frankly, putting 16GB on the 4060Ti is an open admission of defeat on Nvidia's part,
I think giving users the choice of either 8GB or 16GB is awesome. It could be in response to the HU noise, or it could have been planned from the beginning. Those launches are so close, and the VRAM debate so recent, that I suspect it was already planned and not reactionary. That decision was probably made 4 or 5 months ago, if not longer.
and they had to produce this article from their marketing department to find a way to gaslight customers into believing that the 12GB design decision made total sense. It never made sense, especially given that AMD has been offering customers more VRAM at comparable price points, and that Nvidia's marketing has been stressing RTX as a must-have feature you should pay for, only to have that feature's performance hobbled by an insufficient VRAM buffer.
AMD offering more VRAM is their marketing and PR guys doing their jobs, and doing them well. Of course they will try to play that up as an advantage, and in some cases it might very well be advantageous to them. They can hope that the extra VRAM allows their cards to reach performance parity with the nVidia counterpart product; it probably does in a few cases, but I doubt it will in all.
I see that as AMD doing what they have to do to help sell their cards and help shrink the performance gap between them and the nVidia cards.
It's called responding to the competition.
Even in the case of the 3070, modders adding 8GB of VRAM to turn it into a 16GB card have shown it leads to a measurable increase in performance in some titles.
And that was a cool experiment. But what titles showed improvement, and by how much? Did it turn it into a 3090? Let me search on YouTube... and find: lol, nope. In most titles there was very little difference:
He tests a modded 3070 with 16GB of VRAM and compares it with an 8GB 3070:

[benchmark screenshots]

And at 4K Ultra settings and Psycho ray tracing:
[screenshot]

LOL!
[more screenshots]

And he even tested The Last of Us!!
[screenshot]


This result shows that HU was incorrect in their assumption that 8GB of VRAM was the cause of the performance issues they had with The Last of Us, which is what this whole fucking thread and other threads are about. Hahahaha....
He did say that the 8GB card was more stuttery in TLOU:
[screenshots]

(Naughty Dog's patch notes say they are still working on the stuttering some users have; they have already made significant progress.)
And finally, a game that shows a decent performance difference!
[screenshot]

The biggest difference I spotted was 70fps vs 97fps:
[screenshot]

But even then it's still playable!! Hahaha, wtf!

He tested several other games, but those all showed under 8GB of VRAM usage and the same performance.
In the games where there actually was a difference, it ranged from less than 1% to 8% in one game, and maybe about 35% in RE4. He would need to show the average FPS in RE4 to accurately compare that title.

It's worth noting that the 16GB 3070 in the above tests was a triple-fan card, and the losing 8GB 3070 was a Dell OEM with only two fans, which could be responsible for some portion of the performance differences.
If Nvidia wants to use the "it's a midrange card, get over it" marketing BS, then the 4070Ti really shouldn't be more than $600, which was the upper-mid range price two years ago. Perhaps $650 given inflation. Certainly not $800+. That's not what a mid-range card should cost in today's market and, based on sales, most consumers agree.
Everyone wants lower-priced cards, including me. But wishing it were true isn't going to make it so.
Again, I don't know why some people seem intent on justifying it for Nvidia.
That's not really what's happening. It's calling out inaccurate bullshit. But the above comparisons seem to indicate that the 8GB card had enough VRAM at the time it was new, over 3 years ago. I think that is enough justification. And look at that, it can still play games!
It almost seems like a coping mechanism at this point since the cards they paid $800+ for are already showing their performance is being hobbled THIS SAME YEAR by new titles coming out, forcing reductions in settings that should not be necessary.
lol, maybe for some. My card has 24GB, so that isn't it in my case.
 
My problem, and what forced me to upgrade to a 4070, is the optimization problem at play here, specifically with recent Resident Evil titles.

If the game wants to use more than 8GB of VRAM, fine, start spilling into system RAM and run slow as shit at 1 FPS or whatever. Don't just crash to desktop. Let me play at 1 FPS at least.

Can't even fucking do that.

The answer is 'lol fuck you, upgrade' but the problem is optimization (or lack thereof) at the heart of it.
 
The PS5 has 16GB, so they reserve 12GB for the graphics VRAM buffer. They will optimize to keep it under 12GB, but they'll spend their finite time making the game better rather than optimizing to keep it under 8GB while still looking its best. Instead, they'll just turn the textures down to potato quality to keep it under 8GB.

Probably, but the fact is many people still have 8GB cards and will for a long time. If your sequel looks uglier than the previous entry from 3 years ago, that may cause a dip in sales. Of course some people will upgrade, but not everyone can. With the new Nvidia and AMD offerings at $250+ still having 8GB, game developers will have to work with it.
 
How the fuck does the 4060 have more VRAM than the 4070 & 4070ti?!
Because the GPU isn't fast enough to handle the I/O, so if they went smaller they would run into buffering issues; this gives them buffer room… :cool:
 
Good read. Developer responding to nVidia's VRAM tools.
It's pretty easy to go "devs just being lazy". Sure, I'm sure that's part of it. But you know what? The hardware is getting pretty damn lazy, too. The 4090 looks like the only actual effort and it just gets worse and worse down the stack. And before someone cries "Oh, the AMD fans are here!", no, the 7600 is getting flack too.
 
It's pretty easy to go "devs just being lazy". Sure, I'm sure that's part of it. But you know what? The hardware is getting pretty damn lazy, too. The 4090 looks like the only actual effort and it just gets worse and worse down the stack. And before someone cries "Oh, the AMD fans are here!", no, the 7600 is getting flack too.
Oh, I'm waiting for AMD to ship the 7600 XT variant with 8GB. I'll be right there criticizing them. My beef is not AMD vs. nVidia; it's that we have a few cards that happen to come from nVidia that are woefully underprepared in the VRAM area. The 3070 is a high-end part, and it's already going to be obsolete in one generation literally because of VRAM, not processing power or even bandwidth. That's absurd. The 3080 is also going to barely let you get by without turning stuff down.
 
Oh, I'm waiting for AMD to ship the 7600 XT variant with 8GB. I'll be right there criticizing them. My beef is not AMD vs. nVidia; it's that we have a few cards that happen to come from nVidia that are woefully underprepared in the VRAM area. The 3070 is a high-end part, and it's already going to be obsolete in one generation literally because of VRAM, not processing power or even bandwidth. That's absurd. The 3080 is also going to barely let you get by without turning stuff down.
All well and good, but what reason does Nvidia have to compete? So far, they have had nothing contested in any meaningful way. As a corporation, they are acting exactly as they should, reducing cost while increasing profits. And it's as obvious as shit that AMD doesn't give a fuck and just wants to ride the gravy train by offering a second-rate product at first-rate prices, just as much as Nvidia. The only thing keeping Nvidia ahead in those tiers is literally their brand recognition and their superior ecosystem, regardless of how anyone here feels about those.

The smart ones here are just waiting and hoping that Intel comes in to take their mid-tier lunch money and forces these two to actually compete.
 
All well and good, but what reason does Nvidia have to compete? So far, they have had nothing contested in any meaningful way. As a corporation, they are acting exactly as they should, reducing cost while increasing profits. And it's as obvious as shit that AMD doesn't give a fuck and just wants to ride the gravy train by offering a second-rate product at first-rate prices, just as much as Nvidia. The only thing keeping Nvidia ahead in those tiers is literally their brand recognition and their superior ecosystem, regardless of how anyone here feels about those.

The smart ones here are just waiting and hoping that Intel comes in to take their mid-tier lunch money and forces these two to actually compete.
Well, I don't think it's just AMD that can apply pressure. Gamers are a loud bunch and can effect change too. The problem is we have far too many people not asking for more from nVidia when they should be. Instead we are literally seeing people ask developers to keep shoehorning games into 8GB, which is mental. I've never seen that before. People are telling GPU makers "no no, 8GB is just fine, you don't have to give us more." WTF?! We're on an enthusiast site. When in your memory have we asked manufacturers to stay the same? I can't remember us ever doing this.
 
Last edited:
Well, I don't think it's just AMD that can apply pressure. Gamers are a loud bunch and can effect change too. The problem is we have far too many people not asking for more from nVidia when they should be. Instead we are literally seeing people ask developers to keep shoehorning games into 8GB, which is mental. I've never seen that before. People are telling GPU makers "no no, 8GB is just fine, you don't have to give us more." WTF?! We're on an enthusiast site. When in your memory have we asked manufacturers to stay the same? I can't remember us ever doing this.
The current GPU landscape is not easy, that is for sure. You either have Nvidia offering good performance with mid-tier specs at high-tier prices, AMD offering acceptable performance with mid-tier specs at high-tier prices, or Intel offering good low-tier performance at good prices but playing catch-up.

It feels like you're damned if you do and damned if you don't, no matter what you go with. The only purchase that can be seen as somewhat sane is the 4090, which is outside the vast majority of normal gamers' reach. It certainly feels like we are in bizarro world when it comes to gaming GPUs, that's for sure.
 
The current GPU landscape is not easy, that is for sure. You either have Nvidia offering good performance with mid-tier specs at high-tier prices, AMD offering acceptable performance with mid-tier specs at high-tier prices, or Intel offering good low-tier performance at good prices but playing catch-up.

It feels like you're damned if you do and damned if you don't, no matter what you go with. The only purchase that can be seen as somewhat sane is the 4090, which is outside the vast majority of normal gamers' reach. It certainly feels like we are in bizarro world when it comes to gaming GPUs, that's for sure.
The 4090 somehow does seem reasonable. I've been close numerous times to pulling that trigger. I still might, but I'm going to wait for the refresh to see what's good.
 
So it is the texture streaming rate. Since that is a new parameter, I would say that it is not a case of bad optimization but of customization for cards with less VRAM.

What about the part where they created an entirely new set of (actual) medium mip maps that look like they actually tried this time around, while using less VRAM and delivering higher fidelity to boot?
 
What about the part where they created an entirely new set of (actual) medium mip maps that look like they actually tried this time around, while using less VRAM and delivering higher fidelity to boot?
You mean they optimized it? No fucking way! We all know that's an Nvidia fanboy excuse. 👀
 
What about the part where they created an entirely new set of (actual) medium mip maps that look like they actually tried this time around, while using less VRAM and delivering higher fidelity to boot?
Two separate issues. One is dealing with fidelity at a specific setting. The other is an optimization specifically to deal with low VRAM situations that the original target spec didn't have a problem with.
 
...My beef is ... that we have a few cards that happen to come from nVidia that are woefully underprepared in the VRAM area. The 3070 is a high-end part...
The 3070 came out nearly 3 years ago. The only cards sold with 8GB now are 4060s. Those are low-end, so 8GB is appropriate, because people who buy those want to spend the least possible. 4050s will probably come with 8GB as well.

The 3070 is not a high-end part.

3050 - very low end
3060 - low end
3070 - mid tier
3080 - high end
3090 - Halo

I don't get why you are bitching about the last-gen mid-range card... now...
You are about 3 years late.
 
The 3070 came out nearly 3 years ago. The only cards sold with 8GB now are 4060s. Those are low-end, so 8GB is appropriate, because people who buy those want to spend the least possible. 4050s will probably come with 8GB as well.

The 3070 is not a high-end part.

3050 - very low end
3060 - low end
3070 - mid tier
3080 - high end
3090 - Halo

I don't get why you are bitching about the last-gen mid-range card... now...
You are about 3 years late.
The **70 series has NEVER been mid range. Not ever.
 
It has literally been in the middle of Nvidia's stack for 8 years. The fuck you mean it's not mid-tier? 🤣
OK we'll play your game. Nvidia is putting 16GB on a "low end" card. How stupid does one have to be to keep telling people 8GB is all you need?
 
If you can't beat the competition's shit, you call up engineering and marketing to baffle the consumer with technological terms, fancy graphics, and other bullshit. I was going to get an RTX 4080, but I just choke on any video card that costs more than a grand. Bought an XFX Radeon RX 7900 XT reference card for $810 a few months ago instead. Nvidia is at the top of their game, whereas AMD is just getting started. That means Nvidia < AMD in the long run this cycle.
 