NVIDIA Readying 10GB, 12GB, and 16GB Variants of RTX 4070

erek

Pretty nice

"The billion-dollar question hence would be exactly how NVIDIA segments the three SKUs—whether it stops at just memory size and memory bus width; or whether it will also tinker with core-configuration, and possibly even base one of the three SKUs on a physically smaller silicon, such as the AD106. The "RTX 4070" rumored to be a single SKU until now, was expected to be carved out of the AD104, with 5,888 out of 7,680 CUDA cores being enabled. NVIDIA faced stiff criticism from the media and gamers for such segmentation for its RTX 4080 16 GB and RTX 4080 12 GB, with the company being forced to cancel the market-release of the latter, and relaunch it under the name RTX 4070 Ti."

Source: https://www.techpowerup.com/304976/...-16gb-variants-of-rtx-4070-gigabyte-thinks-so
 
Not a good thing, honestly. We already have a 4070 Ti. The 4070 should be 12GB, not 10GB. If they're planning both, the 10GB will be a crappy stripped-down version that is priced like a regular **70, yet performs worse. The regular **70 needs 12GB; it has been on 8GB since the GTX 1070.

As for a hypothetical 16GB model, it will still probably be slower than the 4070 Ti, which has 12GB. So kind of pointless.

If this is true, then this is just Nvidia trying to bump the price up for the type of performance we expect from the $400-500 range.
 
There are some very, very small use cases where you would want 16GB on a 4070, and they exist almost solely for people using the creator drivers rather than the game-ready ones: workloads where you really need the additional VRAM but not necessarily that much GPU processing. They are niche AF, but such a card would certainly be cheaper than whatever the equivalent workstation card would be, and if you don't need the certified drivers it's a viable option. Nobody whose only goal is gaming should give them a thought.
 
This is definitely good for us, but ONLY if Intel can step up their game, or a fourth rival emerges as well. I would genuinely like to know whether there are any fourth or fifth competitors in the GPU space using more modern, optimized methods to quickly build effective graphics architectures native to the most advanced fabrication nodes.

As for all this VRAM, it's clearly more about machine learning or Stable Diffusion type workloads than about gaming; a gamer would really only want the 8GB or 10GB model. It's only with applications that genuinely use vast amounts of VRAM, such as art creation, 3D modeling, or CAD engineering, that more VRAM translates into higher resolution and more detail in your creations.

But once you've filled up the VRAM with all the data you need, such as a video game's graphics, then it's not like more will affect anything. With something like a video game, having more VRAM doesn't mean the game will use it. It's got what it needs.

After that, only speed matters.
 
The biggest game of the year so far is showing the limitations of 8-10gb, even at 1440p. So yes, what I have been told are """enthusiast""" grade cards should probably have more than that.
 
For AI image creation I've easily gone over the limits of the 12GB of VRAM in my 3080, forcing me to use workarounds like creating images at a smaller size (sub 1024x1024) and upscaling them in post.
My next card is gonna be 16GB or, hopefully, 24GB.

So reasons do exist.
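For anyone curious what that workaround looks like in practice, here's a minimal sketch assuming the Hugging Face diffusers library and Stable Diffusion 1.5; the model ID, sizes, and the basic Lanczos upscale are just illustrative, and a dedicated upscaler would do a better job in post:

# Sketch only: render below the VRAM ceiling, then upscale afterwards.
# Assumes torch, diffusers, and Pillow are installed; model ID is illustrative.
import torch
from diffusers import StableDiffusionPipeline
from PIL import Image

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,      # half precision roughly halves VRAM use
).to("cuda")
pipe.enable_attention_slicing()     # lower peak VRAM at a small speed cost

# Generate small enough to stay inside the card's VRAM budget...
image = pipe("a castle on a cliff at sunset", height=768, width=768).images[0]

# ...then upscale in post.
image.resize((1536, 1536), Image.LANCZOS).save("castle_1536.png")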
 
Defending 8gb of vram on $500+ cards last generation was bad enough. Now we're getting ready to do it this gen, too. This time with a gimped bus to go along with it, lol. Going to have to rely on dlss3/frame gen to do everything.
 
As for all this VRAM, it's clearly more about machine learning or Stable Diffusion type workloads than about gaming; a gamer would really only want the 8GB or 10GB model.
I think VR will change this. Of course, that's a complex hardware game, but the bottom line is that where the average gamer wouldn't need more than a 4K monitor, a "mainstream" VR headset is going to need roughly 8K of panel just to do 4K in each eye. I've thrown 8K textures down in VR and they do make a difference if you have a quality headset.
Not saying we are there yet, but it's one of those areas, like power supplies, where the "traditional" limits are getting tossed out the window a bit sooner than one might expect.
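Rough back-of-the-envelope pixel math for that point; the ~1.4x per-axis render-target scale is just a common rule of thumb for lens-distortion compensation, not the spec of any particular headset:

# Pixel counts: 4K monitor vs. "4K per eye" VR, plus a typical render-target scale.
def megapixels(w: int, h: int) -> float:
    return w * h / 1e6

mon_4k    = megapixels(3840, 2160)                              # ~8.3 MP
vr_panels = 2 * megapixels(3840, 2160)                          # ~16.6 MP (two eyes)
vr_render = 2 * megapixels(int(3840 * 1.4), int(2160 * 1.4))    # ~32.5 MP rendered

print(f"4K monitor:       {mon_4k:.1f} MP")
print(f"Dual 4K panels:   {vr_panels:.1f} MP")
print(f"VR render target: {vr_render:.1f} MP (~{vr_render / mon_4k:.1f}x a 4K monitor)")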
 
Hogwarts Legacy. 8GB 3070s aren't cutting it very well; 10GB sort of is on 3080s and up. There's some speculation that it's just a buggy game, but higher-VRAM cards are handling the game much better in general.
Ah yes that game has sold very well! I can't believe that I forgot about it ;) Thanks for reminding me.
 
It's not hard to miss if you literally have no interest. I would have if it wasn't for my wife, lol. Tbf, some of it seems to be a bias towards AMD cards which just happen to have higher vram, but the reported numbers of vram usage are up there. AMD control panel shows it drawing over 12gb (utilized, not allocated) @4k on my 6800xt.
 
The question we need to ask is why AMD can do 24GB on a 7900 XTX (and 20GB on the 7900 XT) while Ngreedia gimps us on memory?
 
More price brackets mean consumers can talk themselves into spending more for the next tier up, which would probably have been priced a tier down without the cheaper card holding it up. It's a strategy Apple has implemented for quite some time.
 
This blows, honestly. We can't just call these 4060/Ti or whatever. Nope, it definitely needs to be 4070 for pricing reasons…. It's the 1060 6GB/3GB, 2060 6GB/12GB, 3060 8GB/12GB shit again.
 
People accept it. Last gen it was "but the memory bus!". This gen is...idk. Dlss3 and frame gen, I guess. Not even hardware.
Well, the way Nvidia is doing GPUs means more VRAM is far more costly compared to the current AMD architecture. This is where I think AMD is going to really top Nvidia over the next few years, until Nvidia can devise a new architecture so more VRAM isn't so costly.

However, I'm not sure VRAM requirements will continue to go up that much. A lot of what you're seeing with these late-stage UE4 games using so much VRAM is really just the engine being pushed beyond reasonable limits. UE5 handles these large-scale scenes far better, and the way it handles textures means you don't hit these VRAM walls.
 
What good does it do to offer a SKU that's so splintered? I'm sure someone can explain the business rationale, but God, somebody needs to dethrone Nvidia.
 
The biggest game of the year so far is showing the limitations of 8-10gb, even at 1440p. So yes, what I have been told are """enthusiast""" grade cards should probably have more than that.
If you are referring to that Techspot article measuring memory usage, it's strange that 4K and 1080p use the same amount of VRAM.
This seems to be an issue with how texture streaming budgets are allocated, and it's unknown whether it is a game issue from the porting process or negligence. There is almost certainly a GPU memory leak in there, but users have found massive improvements in playability for both the Red and Green teams by modifying the Engine.ini file.
If you go to the very bottom and add:

[SystemSettings]
r.TextureStreaming=1
r.Streaming.PoolSize=X

Where X is your VRAM x 0.8, in MB (base 1000, not 1024).
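As a quick sanity check, a tiny sketch of that math (purely illustrative; the 0.8 factor is just the rule of thumb above):

# Compute the r.Streaming.PoolSize value described above: 80% of VRAM, base-1000 MB.
def streaming_pool_size(vram_gb: int, factor: float = 0.8) -> int:
    return int(vram_gb * 1000 * factor)

# e.g. a 10GB card -> r.Streaming.PoolSize=8000
for gb in (8, 10, 12, 16):
    print(f"{gb}GB card -> r.Streaming.PoolSize={streaming_pool_size(gb)}")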

It doesn’t fix the leak but it drastically improves playability for both.

TLDR;
The game is a shitty port job.
 
I'm only referencing my own observed performance. But I've also seen people say that those ini tweaks have not fixed the issues. At the end of the day the game still gobbles vram and 8gb was already pushing it on a few other games with high res textures.

*Edit* Not to mention I'm sure we'll run into buggy games in the future. If I can brute force good gameplay from having a bunch of vram, so be it. Beats waiting for fixes, honestly.
 
I'm only referencing my own observed performance. But I've also seen people say that those ini tweaks have not fixed the issues. At the end of the day the game still gobbles vram and 8gb was already pushing it on a few other games with high res textures.

*Edit* Not to mention I'm sure we'll run into buggy games in the future. If I can brute force good gameplay from having a bunch of vram, so be it. Beats waiting for fixes, honestly.
Yeah, it's not a guaranteed fix, but the fact that by default the game tries to load 100% of the game's textures into VRAM and then overflows into system RAM, with no prioritization of which textures are currently needed, is a big problem.

I suspect the game was straight up sabotaged by angry developers. Hopefully in a year or so when the hate train dies down and it has a few patches under its belt I can snag it on a Steam sale for peanuts.
 
Defending 8gb of vram on $500+ cards last generation was bad enough. Now we're getting ready to do it this gen, too. This time with a gimped bus to go along with it, lol. Going to have to rely on dlss3/frame gen to do everything.

That is how I feel as well. A number of games are pushing into the high 6GB-7GB range, with a few running out of VRAM for me, like FC6 and WDL. You can of course turn on DLSS, but sometimes the implementation isn't good. DLSS shouldn't start becoming a requirement for the latest games.
 
Going to have to rely on dlss3/frame gen to do everything.
I am not so sure how much DLSS 2, and even less DLSS 3, would help here if we're talking about VRAM. I am not sure how well tested (or testable) that is, and it could be a title-by-title, resolution-by-resolution affair:

[vram.png: VRAM usage chart]


Defending 8gb of vram on $500+ cards last generation was bad enough.
Well, even in Hogwarts a 3070 is beating a 2080 Ti at 1440p medium with TAA High, and that is already too high a setting:
https://www.techspot.com/review/2627-hogwarts-legacy-benchmark/

The 8GB 3060 Ti beats the 12GB 6700 XT, the 10GB 3080 beats the 16GB 6800 XT, etc.

The 8GB cards certainly lack VRAM at 4K high settings, but it would have been unplayable even if they were 16GB cards; the 12GB 6700 XT does not do better.

Is there yet any game where 8GB of VRAM on a 3070 has been an issue, i.e. where it would have been a nice 65-70 fps experience with high enough lows were it not for the 8GB of VRAM? The 2080 Ti does it, for example.

There are some scenarios where a 12GB 3070 would have done 40 fps instead of 25, but do they matter?
 
I like just about everything that techpowerup does but their vram usage charts are just nonsense compared to what I see in the majority of games. I typically blow well past the vram usage that they show sometimes even by two or three gigs.
 
Oh great, another shady Nvidia decision! I wonder if they'll be pushed like the "4080 12GB" was and name them differently/properly if there's enough backlash. Looking at the differences, not just in VRAM but also in the other hardware, between the "real" 4080 16GB and what is now the 4070 Ti 12GB, I am guessing these "4070" models will vary across the board. I see very little reason for a 16GB version outside of the aforementioned niche in some production/creative situations or extremely edge-case gaming (i.e. you're using massive texture pack mods), so I don't see the reason to create another confusing "Well, this one has more VRAM, but the Ti only has 12GB, which should I get?" situation. Leaving the regular 4070 on 12GB would be a step up from past generations at least, but I have to wonder just how badly the 10GB will be cut down vs the "real" 4070 12GB and the 4070 Ti 12GB. Will they be forced to pull yet another oops and move it down to a 4060 Ti or 4060? It's just getting exhausting, and I'm tired of Nvidia continually pulling this kind of garbage (not to mention all the proprietary software, specs, or whatever their focus is) while also being treated as the "default" GPU.

I admit I'm biased thanks to a 3090, but at least on the high end I can't imagine going down to a lower-VRAM card if you do even a hint of aggressive gaming (4K, VR, modded, etc.) plus a little creative or production work; just messing around with Stable Diffusion or encoding for OBS or PeerTube/OwnCast, it's nice to have the extra VRAM. I'm also quite glad that AMD answered smartly again this generation by ensuring the 7900 XTX has 24GB and even the XT is only down to 20GB. I know this sort of thing won't be required for gaming overall and it's somewhat niche, but especially if GPU manufacturers keep upping the prices, it seems insane to accept the kind of restricted VRAM, not to mention other components, that Nvidia seems focused on providing while naming the card at a similar tier.
 
I like just about everything that techpowerup does but their vram usage charts are just nonsense compared to what I see in the majority of games. I typically blow well past the vram usage that they show sometimes even by two or three gigs.
I think it's a scenario of too much control. If I only used a single monitor, with no other active programs, etc., I'm sure my VRAM usage would be similar. But that's not how I use my computer: I have 3 monitors, all 3 have stuff active, and one of the things active is usually a 4K video of some sort (I game and watch stuff simultaneously), which obviously uses WAY more VRAM than just a single focused game. Just sitting at my desktop with a YouTube video on my other screen I'm using a few gigabytes.
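If anyone wants to sanity-check their own setup, here's a minimal sketch that reads total VRAM currently in use across the whole desktop (game plus browser plus video), assuming the nvidia-ml-py / pynvml bindings are installed:

# Report overall VRAM in use on the first GPU, not just one game's allocation.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # GPU 0
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"VRAM in use: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GiB")
finally:
    pynvml.nvmlShutdown()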
 
Yeah that won't be confusing as fuckall, "why is that 4070 more expensive than the other, fuck it buy the cheaper one it's like $200 cheaper!"
 
Wonder what the excuse, and real reason, are for releasing so many

Like throwing spaghetti at the wall
 
So when people see the chipset numbers differ for all 3 versions, they'll simply rename them 4070, 4060 Ti and 4060 (for the 16GB, 12GB and 10GB respectively).
 
If they wanted to use roughly the same core but change the VRAM, I'm not sure why they don't just use Ti, Super, and plain 4070 as the naming convention.

Seems like they are trying harder to get people to spend more on a cut-down card by burying part of the specs.
 
So when people see the chipset numbers differ for all 3 versions, they'll simply rename them 4070, 4060 Ti and 4060 (for the 16GB, 12GB and 10GB respectively).
I am expecting something like 4070 12gb, 4060 ti 10gb, 4060 16gb
 
I'm only referencing my own observed performance. But I've also seen people say that those ini tweaks have not fixed the issues. At the end of the day the game still gobbles vram and 8gb was already pushing it on a few other games with high res textures.

*Edit* Not to mention I'm sure we'll run into buggy games in the future. If I can brute force good gameplay from having a bunch of vram, so be it. Beats waiting for fixes, honestly.
You're not the only one. Hardware Unboxed showed actual memory in use, and 8GB was a hindrance. That game used a ton of memory at release, probably because the console has 16GB on it. But when I got the game, 8GB was the absolute minimum; in some open-area spots it was very obvious I was going over.
 
Defending 8gb of vram on $500+ cards last generation was bad enough. Now we're getting ready to do it this gen, too. This time with a gimped bus to go along with it, lol. Going to have to rely on dlss3/frame gen to do everything.
I think that's partly the point: to get playable performance out of them. All these cards are dead on arrival as far as I'm concerned. The 40-series stack/pricing is a big shit sandwich - and just enough people are going to buy these things for NVDA to justify this as the new norm, so we all have to take a bite. $500 for a midrange GPU? HAH! It's now $900, suckers...

FFS, I might as well buy AMD Pro GPUs at these prices... *looks at system specs...*


Oh, wait...
 
*yawn*...

Get back to me when they release the RTX 4050 for $399...I'll get excited at that point



:ROFLMAO:
 