The Slowing Growth of VRAM in Games

Methinks this has been a tad overthunk... all you had to say is:

If you have enough VRAM to run the "Can it run Crysis" app at reasonable quality, then you're all good, at least for a little while... :p /s

Now don't get me wrong, I think charts & graphs & spreadsheets have their uses (I use them a lot for work), but the simple fact is that they can't actually tell you what your gaming experience will be like on YOUR machine on YOUR monitor...

Having said that, I think that if you NEED someone like the OP to do all that work just to figure out which games to play or which ones you will be able to play, then you might wanna consider anutha hobby :D
 
Shiiit. I am sad (and furious) to see my 3070 becoming outdated this quickly, and only because of VRAM, not GPU performance itself. Not that there is much help from the 4000 series; Nvidia is still being stingy with their VRAM amounts.

I am surprised how much VRAM games use these days. When the 3070 was released you could play Doom Eternal at 4K at almost max settings and it ran really fast. This was without ray tracing, though the game still looks stunning, and 8GB was (barely) enough. Now the 3070 struggles at 1440p even with ray tracing disabled. 😭 *Edit* Struggling as in barely capable of a stable 60fps.
Don't stress over it.

Change settings from Ultra to High, problem solved. It will likely be just fine for years. In 4 years you might need to change settings to Medium.

This is the lifepath of all video cards.
 
2080 Super for 1080p medium & 3080 Ti for 1440p high!?

So many questions.

https://twitter.com/HardwareUnboxed/status/1649401870249771008?s=20




Also, AMD should probably release a Navi 33 16GB version for $375-$400 along with a regular Navi 33 8GB for $300-$325.

AMD Radeon RX 7600S RDNA3 laptop GPU tested, 6% slower than RTX 4060 mobile without raytracing

https://videocardz.com/newz/amd-rad...ted-6-slower-than-rtx-4060-without-raytracing

German tech site ComputerBase has one of the first reviews of the discrete RDNA3 graphics for laptops.

(From the conclusion page: the 8GB 28 CU mobile GPU — full Navi 33 has 32 CU — runs into VRAM limits when tested at 1080p, and textures have to be dialled down.)

Specs for the 4060 mobile (full AD107) in the link below (approx. half of a desktop 4070):

https://www.tomshardware.com/news/nvidia-ad106-and-ad107-gpus-pictured
 
Just started watching this video

He claims 4GB of VRAM is insufficient for 1080p low in these four modern games:
  1. Modern Warfare 2
  2. Hogwarts Legacy
  3. The Last of Us Part I
  4. Spider-Man: Miles Morales
(Will update for 8GB/12GB when I finish watching this)

EDIT:
Minimum VRAM requirements going forward:
6GB — 1080p low
8GB — 4K low
12GB — 1080p ultra
16GB — 4K ultra with RT

The question remains:
Will 12GB be sufficient for 1440p ultra with RT!?
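Those tiers are just the video's claims, but sanity-checking a card against them is a trivial lookup. A hypothetical Python sketch (the tier numbers are the summary above, not a universal rule):

```python
# Hedged sketch: encodes the VRAM tiers claimed in the video above.
# These numbers are the poster's summary of one video, not a universal rule.
VRAM_TIERS_GB = {
    ("1080p", "low"): 6,
    ("4k", "low"): 8,
    ("1080p", "ultra"): 12,
    ("4k", "ultra+rt"): 16,
}

def meets_claimed_minimum(vram_gb: int, resolution: str, preset: str) -> bool:
    """Return True if a card's VRAM meets the video's claimed minimum."""
    required = VRAM_TIERS_GB.get((resolution.lower(), preset.lower()))
    if required is None:
        raise ValueError(f"No claimed tier for {resolution}/{preset}")
    return vram_gb >= required

# The open question in the post: 1440p ultra + RT falls between the
# 12GB (1080p ultra) and 16GB (4K ultra + RT) tiers, so the table
# alone can't answer it.
print(meets_claimed_minimum(12, "1080p", "ultra"))  # True
```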

 
2080 Super for 1080p medium & 3080 Ti for 1440p high!?

So many questions.

https://twitter.com/HardwareUnboxed/status/1649401870249771008?s=20



Also, AMD should probably release a Navi 33 16GB version for $375-$400 along with a regular Navi 33 8GB for $300-$325.
First question is why do we debate these fucked up charts made by low IQ twats?
To the point: the 2080 Super is, tried and true, the superior GPU vs. the 5700 XT. Unless this is a massively AMD-swayed title, it makes no sense. A quick exam of the other tiers shows that this stat is an outlier.
I have yet to see a "recommended spec list" that I wasn't pissed about. The stupidity is great. Often the CPU is fucked, or the GPU is in this case, but it seems that Ned, the bumpkin intern, is given full control. And the goof is just a cosplayer!
 
Just started watching this video

He claims 4GB of VRAM is insufficient for 1080p low in these four modern games:
  1. Modern Warfare 2
  2. Hogwarts Legacy
  3. The Last of Us Part I
  4. Spider-Man: Miles Morales
(Will update for 8GB/12GB when I finish watching this)

EDIT:
Minimum VRAM requirements going forward:
6GB — 1080p low
8GB — 4K low
12GB — 1080p ultra
16GB — 4K ultra with RT

The question remains:
Will 12GB be sufficient for 1440p ultra with RT!?


4GB is not enough for an RTX 4090 or a 7900 XTX?

Genius-tier analysis.
 
The question remains:
Will 12GB be sufficient for 1440p ultra with RT!?

Ignore the garbage-tier video you just watched. The amount of VRAM you "need" depends on 3 things:
1.) Resolution/settings — 1440p with RT in this example
2.) The game
3.) How much performance the GPU has to push those settings

So... if you are playing CoD 7 (2025) on an RTX 6090 Ti at 1440p with RT, then no, 12GB is not enough.

For an RTX 4070, it likely would not be enough, though that doesn't matter, as it doesn't have enough horsepower anyway.

For current games at 1440p with RT, and future games like CoD 7 running max PLAYABLE settings, 12GB is likely enough.
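That three-factor argument boils down to a small decision function. A purely illustrative sketch (no real benchmark data; all inputs are hypothetical):

```python
# Illustrative only: which factor limits you at a given res/settings combo?
def limiting_factor(card_vram_gb: float,
                    game_vram_need_gb: float,
                    gpu_fast_enough: bool) -> str:
    """Per the post: res/settings and the game set the VRAM need;
    GPU horsepower decides whether those settings are even playable."""
    if not gpu_fast_enough:
        return "gpu horsepower"  # the RTX 4070 case above: VRAM is moot
    if card_vram_gb < game_vram_need_gb:
        return "vram"            # e.g. an 8GB card in a 10GB scene
    return "neither"

# A hypothetical future title at 1440p + RT needing ~13GB:
print(limiting_factor(12, 13, gpu_fast_enough=False))  # gpu horsepower
print(limiting_factor(12, 13, gpu_fast_enough=True))   # vram
```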
 
Ignore the garbage-tier video you just watched. The amount of VRAM you "need" depends on 3 things:
1.) Resolution/settings — 1440p with RT in this example
2.) The game
3.) How much performance the GPU has to push those settings

So... if you are playing CoD 7 (2025) on an RTX 6090 Ti at 1440p with RT, then no, 12GB is not enough.

For an RTX 4070, it likely would not be enough, though that doesn't matter, as it doesn't have enough horsepower anyway.

For current games at 1440p with RT, and future games like CoD 7 running max PLAYABLE settings, 12GB is likely enough.
You know games are usually made for the lowest common denominator, not the highest; they actually want to sell a lot of copies, not a couple hundred or thousand.
 
One more:

16GB RTX 3070 Mod Shows Impressive Performance Gains


https://www.tomshardware.com/news/3...flow&utm_source=twitter.com&utm_medium=social

YouTuber Paulo Gomes recently published a video showing how he modified a customer's RTX 3070, which used to be one of Nvidia's best graphics cards, with 16GB of GDDR6 memory. The modification resulted in serious performance improvements in the highly memory-intensive Resident Evil 4, where the 16GB mod was performing 9x better than the 8GB version in the 1% lows.
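Worth noting what "1% lows" measures, since that's the metric where the mod shines: one common definition averages the slowest 1% of frames. A quick sketch, assuming per-frame FPS samples:

```python
# "1% low" FPS, using one common definition: average of the slowest 1% of frames.
def one_percent_low(fps_samples: list[float]) -> float:
    worst = sorted(fps_samples)[:max(1, len(fps_samples) // 100)]
    return sum(worst) / len(worst)

# A mostly-smooth run with a single hitch: the average barely moves,
# but the 1% low collapses, which is why VRAM stutter shows up here.
frames = [60.0] * 99 + [8.0]
print(sum(frames) / len(frames))  # ~59.5 average FPS
print(one_percent_low(frames))    # 8.0: the stutter dominates the metric
```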
 
One more:

16GB RTX 3070 Mod Shows Impressive Performance Gains


https://www.tomshardware.com/news/3...flow&utm_source=twitter.com&utm_medium=social

YouTuber Paulo Gomes recently published a video showing how he modified a customer's RTX 3070, which used to be one of Nvidia's best graphics cards, with 16GB of GDDR6 memory. The modification resulted in serious performance improvements in the highly memory-intensive Resident Evil 4, where the 16GB mod was performing 9x better than the 8GB version in the 1% lows.

More discussion here:

Love this kind of stuff

“As said, the recent mod is a bit more complicated than the earlier one done on some earlier graphics cards, as some resistors needed to be grounded in order to support higher-capacity memory ICs, and the modded graphics card had to be set to high-performance mode in the NVIDIA Control Panel, in order to fix flickering.

AMD marketing has recently called out NVIDIA and pulled the VRAM card, but with NVIDIA launching the GeForce RTX 4070 with 12 GB of VRAM, it appears this won't change anytime soon. These mods show that there is definitely the need for more VRAM, at least in some games.”



Source: https://www.techpowerup.com/307724/...b-of-vram-shows-impressive-performance-uplift
 
8GB 3070 vs 16GB 3070 in modern games

👇
Wow, that's annoying for 3070 owners. I still don't think the 12GB 4070 will have a VRAM-to-performance ratio that is as bad as the 8GB 3070's. It's also better than the 10GB 3080's.

As far as the "lowest common denominator" for future midrange performance goes, in regards to this ratio, the 4070 is not it.
 
As Steve mentions, I would be much more apprehensive about 8GB 4060 cards than 12GB 4070 cards.

This is what will really screw up midrange graphics settings in future game development, including those designed for the PS5/Series X.
 
But, but, it's all these unoptimized games that are the reason why 8GB isn't enough, right?

Joking aside, wow, I didn't even think about showing this kind of comparison. This goes to show you that 12GB is not going to be enough sooner rather than later.

Maybe if Nvidia had doubled the memory on the 4070/4070 Ti to 24GB they might have sold better.
 
I think it's painfully clear that 12GB is going to be for 2.5K and below at medium settings very soon. And as the 3070 videos demonstrate, it's the VRAM size that will be choking whatever graphics card it's attached to, not that the GPU itself isn't fast enough to perform given the correct amount of VRAM. 16GB is becoming the minimum if you want High/Ultra settings at 2.5K, and ideally 20GB.

It's laughable that a 6800 XT ever performs better in RT than a 3070, and it comes down to VRAM.
 
But, but, it's all these unoptimized games that are the reason why 8GB isn't enough, right?

Joking aside, wow, I didn't even think about showing this kind of comparison. This goes to show you that 12GB is not going to be enough sooner rather than later.

Maybe if Nvidia had doubled the memory on the 4070/4070 Ti to 24GB they might have sold better.
That's why TLoU has already had like 5 or 6 patches, including a 25GB patch; it's completely unoptimized lol, and they're still working on releasing more patches as well. I agree 8GB is becoming borderline in select titles, but to say it's obsolete is disingenuous. At 4K it is, but that resolution is pointless anyway for PC gaming; you're better off with a console for 4K, considering the cost of good performance for AAA games at 4K on PC makes it simply a fool's errand right now, unless you have money burning a hole in your pocket, that is.

So overall, yes almost all of the games they’re using to make this argument aren’t well optimized.
 
That's why TLoU has already had like 5 or 6 patches, including a 25GB patch; it's completely unoptimized lol, and they're still working on releasing more patches as well. I agree 8GB is becoming borderline in select titles, but to say it's obsolete is disingenuous. At 4K it is, but that resolution is pointless anyway for PC gaming; you're better off with a console for 4K, considering the cost of good performance for AAA games at 4K on PC makes it simply a fool's errand right now, unless you have money burning a hole in your pocket, that is.
That's just one game... lol
 
I think it's painfully clear that 12GB is going to be for 2.5K and below at medium settings very soon. And as the 3070 videos demonstrate, it's the VRAM size that will be choking whatever graphics card it's attached to, not that the GPU itself isn't fast enough to perform given the correct amount of VRAM. 16GB is becoming the minimum if you want High/Ultra settings at 2.5K, and ideally 20GB.

It's laughable that a 6800 XT ever performs better in RT than a 3070, and it comes down to VRAM.
I think you're drinking the Kool-Aid if you think 12GB of VRAM is going to be relegated to 1440p medium settings only, in a very short period of time, let's say within the next 6-18 months. Maybe in some very specific niche cases, but those will be exceptions, not standards, if it happens at all. It's overall a push to sell AMD and make them look better for having more VRAM on their cards, even though AMD has worse VRAM utilization. I still think Nvidia dropped the ball on the 4070 Ti and 4070 and somehow should've managed 16GB as a minimum just to be safe, especially at the asking MSRP.
 
And people frequently complain about low texture resolution. 8GB is enough if you're satisfied with mediocrity, I guess. Frankly, with the push for 8K by the display industry, I'm surprised we haven't seen 16GB cards across the consumer space yet.
What people forget is that as consumer TV screen sizes increase, the DPI decreases, hence the need for 8K. Most screens above 34 inches at 4K have the DPI of 1080p screens. A 50-inch 4K screen is 88 DPI; a 100-inch 4K screen is 44 DPI.
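The math behind those numbers is just pixel density from resolution and diagonal size. A quick sketch to check the figures:

```python
# Pixel density (PPI) from resolution and diagonal screen size.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch = diagonal pixel count / diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 50)))   # ~88, the 50" 4K figure above
print(round(ppi(3840, 2160, 100)))  # ~44, the 100" 4K figure above
print(round(ppi(1920, 1080, 25)))   # ~88, a 25" 1080p monitor for comparison
```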
 
What people forget is that as consumer TV screen sizes increase, the DPI decreases, hence the need for 8K. Most screens above 34 inches at 4K have the DPI of 1080p screens. A 50-inch 4K screen is 88 DPI; a 100-inch 4K screen is 44 DPI.
In relation to what size 1080p screen? That matters as well. That's why I prefer gaming on a monitor; I personally prefer 34" 21:9 1440p right now as the sweet spot, outside of competitive gaming, which doesn't interest me very much. Who's gaming on a 100" TV, lol; at that size I would get a projector, and viewing distance is also the other factor, as NightReaver pointed out.
 
Forspoken is obviously unoptimized; Hogwarts Legacy is unoptimized in some locations and generally an AMD-favored title. I guess RE 4 may be the one case. That's one game. Compared to how many games that are very playable?
So since games don't run great on someone's hardware, they are all unoptimized?

Sounds like either it's the user, older hardware, or... possibly running out of VRAM, which has been proven to be the case when someone puts 16GB of memory on a 3070, lol.
 
So since games don't run great on someone's hardware, they are all unoptimized?

Sounds like either it's the user, older hardware, or... possibly running out of VRAM, which has been proven to be the case when someone puts 16GB of memory on a 3070, lol.
No, just a misunderstanding. Higher-res textures; more objects, each with specific textures; shaders; bigger game worlds; higher polygon counts; add in BVH and RT, where you can't cull as much and have to have more objects being rendered: it all eats VRAM quickly.

If one wants better-looking, better-playing games, you'd better have way more VRAM than the consoles. Console VRAM is not as limiting either, due to how fast and direct the RAM is to the GPU, CPU and SSD. 12GB is questionable for newer AAA games coming out without some limitations. 16GB and more seems to be the right answer to not have VRAM issues.
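For a rough sense of why textures alone eat VRAM so fast, here is a back-of-envelope sketch. The numbers are illustrative only; real engines use block compression (often 4-8x smaller) and streaming, which change the totals:

```python
# Back-of-envelope texture memory; illustrative numbers only.

def texture_bytes(width: int, height: int, bytes_per_texel: float,
                  mipmaps: bool = True) -> float:
    """Uncompressed texture size; a full mip chain adds ~1/3 on top."""
    base = width * height * bytes_per_texel
    return base * 4 / 3 if mipmaps else base

# One 4K (4096x4096) RGBA8 texture, mipmapped:
one = texture_bytes(4096, 4096, 4)
print(f"{one / 2**20:.0f} MiB each")          # ~85 MiB

# A scene streaming, say, 100 such textures (hypothetical count):
print(f"{100 * one / 2**30:.1f} GiB total")   # ~8.3 GiB, before geometry/BVH
```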
 
In relation to what size 1080p screen? That matters as well. That's why I prefer gaming on a monitor; I personally prefer 34" 21:9 1440p right now as the sweet spot, outside of competitive gaming, which doesn't interest me very much. Who's gaming on a 100" TV, lol; at that size I would get a projector, and viewing distance is also the other factor, as NightReaver pointed out.
Funny, when we quote a custom gaming PC, if it is over 800 USD the first remark made is "how much can I get a console for?" Most of our customer base owns TVs above 65"; this is the start of the PC gaming decline, IMHO.
 
So since games don't run great on someone's hardware, they are all unoptimized?

Sounds like either it's the user, older hardware, or... possibly running out of VRAM, which has been proven to be the case when someone puts 16GB of memory on a 3070, lol.
Yes, a game requiring more VRAM than it should can be due to poor optimization... there are many examples of this, and it's not like those games are vastly more detailed than the top graphical games that have already been out. PC is more difficult and more time-consuming to optimize for due to the number of variables involved.
 
PC ports are just by nature going to be less optimized than versions made for specific hardware.
Yes, 12GB will most likely be enough until the next-gen consoles, as the current consoles only have about 11-12GB of VRAM allocated for games. Of course, the more optimized a game is for PC hardware, the better it will perform. Don't get me wrong, I still think it's messed up that the 3070/Ti were gimped with 8GB; they would perform better, and have greater longevity, with 12/16GB, which shouldn't have been an issue for Nvidia to provide to its customers. It's not acceptable, but it's also overblown. Both can be true at the same time.
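Rough arithmetic behind that 11-12GB figure (all numbers approximate; the consoles use one unified pool, so the GPU/CPU split here is a hypothetical illustration, not a published spec):

```python
# Approximate console memory budget; unified pool, so the split is illustrative.
total_gb       = 16.0  # PS5 / Series X unified memory
os_reserved_gb = 2.5   # roughly what the system software keeps (approximate)
cpu_side_gb    = 2.0   # hypothetical share for game code / CPU-side data

available_to_game = total_gb - os_reserved_gb        # ~13.5 GB for the game
gpu_ish_budget    = available_to_game - cpu_side_gb  # ~11.5 GB "VRAM-like"
print(available_to_game, gpu_ish_budget)
```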
 
Funny, when we quote a custom gaming PC, if it is over 800 USD the first remark made is "how much can I get a console for?" Most of our customer base owns TVs above 65"; this is the start of the PC gaming decline, IMHO.
There are always market cycles; they said similar things when the 360 and PS3 launched, when they were graphically ahead of PC hardware, especially for the money for the average consumer. Only time will tell; I don't think PC gaming is going away, personally. There are always going to be people willing to pay more for their preference. If supply chains get back to where they were, inflation comes down, and they learn to limit scalpers, decent prices will naturally come back. Otherwise, these pricing trends will only continue in this market climate. Here's hoping for an amazing 50 series / RDNA 4 launch cycle!
 
There are always market cycles; they said similar things when the 360 and PS3 launched, when they were graphically ahead of PC hardware, especially for the money for the average consumer. Only time will tell; I don't think PC gaming is going away, personally. There are always going to be people willing to pay more for their preference. If supply chains get back to where they were, inflation comes down, and they learn to limit scalpers, decent prices will naturally come back. Otherwise, these pricing trends will only continue in this market climate. Here's hoping for an amazing 50 series / RDNA 4 launch cycle!
It didn't last long, but when it was the X1800 XT and XTX and the 7800 GTX against the Xbox 360, it wasn't much of a contest at the time. As soon as the 8800 GT launched for 250 dollars, it was no longer a contest between PC and consoles anymore (PC was pretty significantly ahead), and it was also reasonably affordable.

I'm not even sure RDNA 4 and Blackwell will get us out of this mess, considering how bad pricing is compared to consoles. Maybe if a 5070 matches 4090 levels of performance and the price comes back down to 500-550, the low end will actually give people some value.

I think the main issue is software at the moment, though; it's not really a hardware issue. Developers need more powerful plugins to handle things like texture streaming to compete with consoles; traditional streaming methods are okay if implemented well, but they're really not good enough anymore. They also need time to properly multithread their games (Jedi Survivor is almost single-threaded; it pegs two threads and everything else idles).

For AAA games, there may need to be NVMe hardware requirements that need to be met to run the game without stuttering as well. People should still be allowed to run the game on lesser hardware, but make it known they're going to have stuttering. Drives are so cheap right now, I don't see why we don't push for NVMe as a standard.
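To make the texture-streaming point concrete, here is a toy residency-budget sketch (hypothetical structure, not any engine's real API): the streamer keeps recently used textures resident and evicts the least-recently-used ones when the VRAM budget is exceeded, which is roughly the kind of bookkeeping fast console I/O makes cheap.

```python
# Toy texture streamer: illustrative only, not any real engine's API.
from collections import OrderedDict

class TextureStreamer:
    """Keep recently used textures resident; evict LRU past the budget."""
    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.resident: OrderedDict[str, int] = OrderedDict()  # name -> size
        self.used = 0

    def touch(self, name: str, size: int) -> None:
        """Mark a texture as needed this frame, loading it if absent."""
        if name in self.resident:
            self.resident.move_to_end(name)  # most recently used
            return
        self.resident[name] = size
        self.used += size
        while self.used > self.budget:       # evict coldest textures
            victim, vsize = self.resident.popitem(last=False)
            self.used -= vsize               # (would free GPU memory here)

# An 8 GiB budget vs. a frame touching many large textures:
s = TextureStreamer(8 * 2**30)
for i in range(120):
    s.touch(f"tex_{i}", 85 * 2**20)          # ~85 MiB each, hypothetical
print(f"resident: {len(s.resident)}, used {s.used / 2**30:.1f} GiB")
```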
 
It didn't last long, but when it was the X1800 XT and XTX and the 7800 GTX against the Xbox 360, it wasn't much of a contest at the time. As soon as the 8800 GT launched for 250 dollars, it was no longer a contest between PC and consoles anymore (PC was pretty significantly ahead), and it was also reasonably affordable.

I'm not even sure RDNA 4 and Blackwell will get us out of this mess, considering how bad pricing is compared to consoles. Maybe if a 5070 matches 4090 levels of performance and the price comes back down to 500-550, the low end will actually give people some value.

I think the main issue is software at the moment, though; it's not really a hardware issue. Developers need more powerful plugins to handle things like texture streaming to compete with consoles; traditional streaming methods are okay if implemented well, but they're really not good enough anymore. They also need time to properly multithread their games (Jedi Survivor is almost single-threaded; it pegs two threads and everything else idles).

For AAA games, there may need to be NVMe hardware requirements that need to be met to run the game without stuttering as well. People should still be allowed to run the game on lesser hardware, but make it known they're going to have stuttering. Drives are so cheap right now, I don't see why we don't push for NVMe as a standard.
When I installed my MW2 on an NVMe drive, voilà, no stuttering!
 
So, what's better for gaming/VR, a 16GB 4080 or a 20GB 7900 XT? Does it depend on whether the game is optimized for AMD or Nvidia?
 
I would suspect that is the case. Any current or recent benchmarks for VR performance? I also suspect he meant the 7900 XTX vice the XT.
Well, the XTX also has 24GB instead of 20GB of VRAM. I'm sure some people now are going to assume more VRAM = better GPU. The 4080 is definitely better than the 7900 XT, especially when it comes to VR and ray tracing etc., while having 4GB less VRAM. The 4080's price is bad though, basically $1,300 after tax at minimum.
 
Well, the XTX also has 24GB instead of 20GB of VRAM. I'm sure some people now are going to assume more VRAM = better GPU. The 4080 is definitely better than the 7900 XT, especially when it comes to VR and ray tracing etc., while having 4GB less VRAM. The 4080's price is bad though, basically $1,300 after tax at minimum.
Nothing to do with this ATM. The AMD 7000 series has an issue with VR. This could be hardware or drivers, but I think if it wasn't hardware it would have been corrected by now.
Personally, I think there is something causing micro-latency, and VR applications are suffering because of it.
The 6000 series, as good as they are, always had a performance drop at 4K/VR resolutions due to the architecture.
 