NVIDIA GeForce RTX 4070 Priced at $600

Nah, just write it off as "bad console port". Just write off every single game that Nvidia cards just happen to be worse at as a bad port or "AMD title". The trick is to act like the only games "worth" playing are heavily ray traced ones.
Heaven forbid they actually have a point, am I right? Have you even seen how disastrous that port is? Maybe, just maybe, people should try getting off the "Nvidia bad, leather jacket man bad" train and actually try to look at things objectively.
 
Gamers already have, but Nvidia is trying to play out the AI wank game to see if they can stay profitable while waiting to wear out gamers' patience. If Nvidia is profitable, then they have no reason to lower prices, but I have a feeling this AI gamble is going to lose big for Nvidia.

This logic is so bad I can't even. You're looking at it with tunnel vision, meaning you aren't looking past Nvidia and their own product launches. The RTX 4070 has 50% more RAM than who, themselves? The reason products get improved is that their competitors are improving. If you don't offer a better product for the price, then someone else will. The problem is that Nvidia has AMD and Intel as competitors, and most people here would write them off as a choice. AMD has been including more RAM than Nvidia for nearly a decade, and that has paid off, as anyone looking to play The Last of Us on Nvidia is going to have a bad time. Sure, newer, more expensive RTX cards have the VRAM, but this has been a thing for Nvidia forever. The RTX 4070 will have the same VRAM as AMD's 6700 XT.


That's why in that very same video the following happens:

At 1080p and 1440p Ultra, both showing over 13GB of VRAM used, the 4070Ti, a 12GB GPU, beat the 7900XT, a 20GB GPU, in both 1% lows and averages.

At 4K the 4070Ti, again a 12GB GPU, only narrowly loses to the 7900XT, but manages to stay ahead of 16GB cards like the 6950XT, 6800XT, etc., while posting identical lows to the 6950XT, showing that neither VRAM nor memory bus really made a big difference.

Point of my post is that AMD users continue to clutch their pearls on the issue of having more VRAM, and seem to forget that any game that would pull near their GPU's VRAM limit will ultimately tax the card itself. Any game coming close to actually using, not just allocating, 16-20GB of VRAM would absolutely cripple the 7900XT and XTX, since those scenarios would most likely involve massive texture mods and/or RT; in fact, they'd probably not run all that great on a 4090 either. The other games released this year that were also dubbed the "doom of the 4070Ti" have already been patched and/or were a non-issue, as you can see from how much VRAM those games use now versus when they released, at 1440p as well as 4K. The only saving grace for the 7900XT is its bigger memory bus, but in almost all instances those games run perfectly fine on the 4070Ti. Just look at CP2077 and The Witcher 3 with RT: those two favor the 4070Ti by quite a big margin, and you would think both games would require a fat memory bus or a ton of VRAM, yet CP2077 only utilizes 10.5-11GB at native 1440p with RT set to Psycho and everything else at max, and The Witcher 3 only uses 8.5-9GB of VRAM with everything maxed and RT on.

As I've said in my previous post, the VRAM debacle is way overblown. It's gotten to the point of being nonsensical, since equipping something like the 7900XT with 20GB of VRAM only serves as a gimmick and a talking point but doesn't actually provide any real benefit: games that naturally use, not just allocate, 12GB of VRAM or more already hurt its performance.
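For anyone who wants to sanity-check the "allocated vs. actually used" distinction themselves, here's a minimal sketch assuming the nvidia-ml-py package (imported as pynvml) is installed. Caveat: even these counters report memory reserved/allocated, not what a game actively touches each frame, so treat the numbers as an upper bound rather than proof of need.

```python
# Minimal VRAM sampling sketch (assumes nvidia-ml-py / pynvml is installed).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Device-level view: total memory currently reserved on the card.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB in use")

# Per-process view (a running game shows up here). usedGpuMemory can be
# unavailable on some driver/OS combinations, hence the fallback to 0.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_gib = (proc.usedGpuMemory or 0) / 2**30
    print(f"PID {proc.pid}: {used_gib:.1f} GiB")

pynvml.nvmlShutdown()
```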

Another game to look at that supposedly requires more VRAM than what the 4070Ti offers: RE4 Remake. The game runs at 4K Ultra and 1440p Ultra while posting nearly identical results to the 7900XT; in some cases where RT might be heavier, the 4070Ti wins out. The performance lines up even though the VRAM usage reported by the in-game meter is more than what the 4070Ti has.

To end my rant: the day we actually start using 12GB or more of VRAM on a consistent basis, the only cards that'll matter this generation are the 7900XTX (if RT isn't involved), the 4080, and the 4090, but by that time I'm sure the 50-series/8000-series GPUs will have already been out for a while. In other words, we're not at the point where properly optimized games are using near 12GB of VRAM without texture mods, or without running 4K Ultra settings with RT enabled, which even the 7900XTX would crash and burn on a majority of the time. I will say this though: the 4070Ti should have come equipped with more VRAM just so it could have had a wider bus, not because games actually need more VRAM.
 
As I've said in my previous post, the VRAM debacle is way overblown. It's gotten to the point of being nonsensical, since equipping something like the 7900XT with 20GB of VRAM only serves as a gimmick and a talking point but doesn't actually provide any real benefit: games that naturally use, not just allocate, 12GB of VRAM or more already hurt its performance.
As far as the R9 290, this has been the debate about VRAM. It's a gimmick until it isn't. We're now starting to see games that at 1080P will go above 8GB of VRAM with max settings and it can make a difference in playability.
To end my rant: the day we actually start using 12GB or more of VRAM on a consistent basis, the only cards that'll matter this generation are the 7900XTX (if RT isn't involved), the 4080, and the 4090, but by that time I'm sure the 50-series/8000-series GPUs will have already been out for a while. In other words, we're not at the point where properly optimized games are using near 12GB of VRAM without texture mods, or without running 4K Ultra settings with RT enabled, which even the 7900XTX would crash and burn on a majority of the time. I will say this though: the 4070Ti should have come equipped with more VRAM just so it could have had a wider bus, not because games actually need more VRAM.
The issue is that Nvidia has been playing games with VRAM for some time. Remember when the RTX 3060 was released with more VRAM than the RTX 3060 Ti? And BTW, Nvidia now has 3060s with 8GB, and it isn't just the VRAM Nvidia lowered on those models. Also, who cares about the RTX 50 series when we still don't have the full RTX 40 series out yet?
Well, that is a bad PC port. And the recent patch seems to have lowered VRAM usage by 1GB, at least according to someone's pre/post-patch benchmark; I'm not too sure how accurate that is. That doesn't mean other games won't start using more, though.
A bad PC port doesn't change the fact that games are now starting to demand more VRAM. Them lowering VRAM usage by 1GB just keeps 8GB cards from crying. It's safe to assume that games will start using more VRAM.
 
As far as the R9 290, this has been the debate about VRAM. It's a gimmick until it isn't. We're now starting to see games that at 1080P will go above 8GB of VRAM with max settings and it can make a difference in playability.

The issue is that Nvidia has been playing games with VRAM for some time. Remember when the RTX 3060 was released with more VRAM than the RTX 3060 Ti? And BTW, Nvidia now has 3060s with 8GB, and it isn't just the VRAM Nvidia lowered on those models. Also, who cares about the RTX 50 series when we still don't have the full RTX 40 series out yet?

A bad PC port doesn't change the fact that games are now starting to demand more VRAM. Them lowering VRAM usage by 1GB just keeps 8GB cards from crying. It's safe to assume that games will start using more VRAM.
No doubt games will continue to use more and more VRAM. It's just that we won't start seeing the usage every VRAM doomer is talking about until around the time the 50-series/8000-series come out. Right now the OGs of this movement, Forspoken and Hogwarts Legacy, are actually somewhat optimized and showing usage well below what was originally reported, and TLOU will be fixed, not because of the "8GB" cards, but because the game is actually broken right now. Point is that 12GB of VRAM is more than sufficient for 1440p, and for 4K on semi-older titles, basically anything released before or around mid-2022. Putting more VRAM on a GPU that couldn't utilize it is a waste, since by the time you get to those usage levels the card is going to need DLSS or FSR just to pull playable frame rates, which in turn brings the VRAM usage back down anyway.

I feel bad for people with 8GB GPUs, but it's not like they're assed out either; they can still enjoy newer AAA games at 1440p. They're just going to have to accept that games are progressing, and the day will soon come when 8GB is insufficient to run AAA games at 1440p Ultra quality, let alone with RT enabled, and they'll have to lean on DLSS if they choose to play at Ultra settings. This is nothing new: new games come out pushing the boundaries, and it's always been the case that you eventually have to start dialing the settings down on older hardware. In the case of the 4060, I honestly believe Nvidia is relying more on DLSS + FG to push it along. FG in itself is a great feature, and DLSS is, to me, leaps and bounds better than FSR, so I don't see this as necessarily a loss, but the fact that they couldn't up it to something like 10GB shows that they are, in fact, going to rely heavily on DLSS and FG to carry this card for people who want to run Ultra quality settings. Thing is though, a lot of these newer games look great even at High settings, which use fewer resources and still look better than anything the consoles can push out, but people want to use "Ultra" for some reason and then complain about VRAM usage--it's like, hey, use the High settings, you're going to get a better experience than anything the consoles have to offer, and your 8GB card will chug along just fine.

One thing that came to mind while I posted this--the PS5 and Xbox Series X are decently powerful consoles, and both have 16GB of shared RAM/VRAM, but how much RAM do the consoles put aside for the OS and other functionality, and how much is actually used for VRAM? My assumption is that these consoles probably won't utilize anything more than 8-10GB of VRAM in even the most extreme use case, i.e. 4K 30 FPS with RT enabled, and with consoles being the baseline for game development, it's probably safe to say that on the PC side we're not going to see anything greater than 12GB in most cases for at least another two or three years, unless we want 4K Ultra with RT enabled, and those settings are reserved for cards like the 4080/4090, and in some cases the 7900XTX.
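For a rough sense of the arithmetic, here's a hedged back-of-the-envelope sketch: the 16GB pool and the roughly 2.5GB OS reserve (13.5GB available to games) are Microsoft's published Series X figures, while the CPU-side game data number below is purely an assumption for illustration.

```python
# Back-of-the-envelope console memory budget, in GiB. The OS reserve is the
# publicly quoted Series X figure; the CPU-side game data number is a guess.
total_pool = 16.0          # shared GDDR6 pool on PS5 / Series X
os_reserve = 2.5           # Series X: ~13.5 GiB is left available to games
cpu_side_game_data = 3.0   # assumption: game code, audio, sim state, etc.

gpu_budget = total_pool - os_reserve - cpu_side_game_data
print(f"Rough VRAM-equivalent budget: {gpu_budget:.1f} GiB")  # ~10.5 GiB
```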
 
how much RAM do the consoles put aside for the OS and other functionality, and how much is actually used for VRAM?
Around 3.5GB for the OS and replay features. Then the game itself takes another few (or more) gigs of CPU-side data before what's left is available as VRAM. Your post is spot on, by the way, in general :).
 
Around 3.5GB for the OS and replay features. Then the game itself takes another few (or more) gigs of CPU-side data before what's left is available as VRAM. Your post is spot on, by the way, in general :).
You have a source, or are you just guessing? The Xbox Series X has 2.5GB reserved for the OS; the rest the GPU can use for whatever. The PS5 has an additional 512MB of DDR4 for the OS as well. Both the PS5 and the Xbox have fast transfer rates to and from the SSD, so loading textures, shaders, etc. on the fly is incredibly fast. A PC needs to use more VRAM until fast storage is used as effectively.

If 12GB cards that just released are already being pushed by console ports now, how will they perform in a year? Plus, one would want higher-res textures, more RT, and other features in a PC port for better-than-console visuals, all requiring more VRAM.
 
In the case of the 4060, I honestly believe Nvidia is relying more on DLSS + FG to push it along. FG in itself is a great feature, and DLSS is, to me, leaps and bounds better than FSR, so I don't see this as necessarily a loss, but the fact that they couldn't up it to something like 10GB shows that they are, in fact, going to rely heavily on DLSS and FG to carry this card for people who want to run Ultra quality settings.
This is why I never liked DLSS and FSR. It has gotten to the point where we're expected to use these features and like it, even though they lower image quality. I have always dismissed, and always will dismiss, DLSS and FSR in any new GPU review. These are meant to help with the performance issues of Ray-Tracing, and as a bonus for those using older GPUs--not a feature to lean on because manufacturers are including less VRAM than they should.
Thing is though, a lot of these newer games look great even at High settings, which use fewer resources and still look better than anything the consoles can push out, but people want to use "Ultra" for some reason and then complain about VRAM usage--it's like, hey, use the High settings, you're going to get a better experience than anything the consoles have to offer, and your 8GB card will chug along just fine.
The reason people complain is that they spent an ungodly amount of money only to find that VRAM is their limiting factor, because their 8GB card has the same VRAM as my Vega 56 from 2017.
One thing that came to mind while I posted this--the PS5 and Xbox Series X are decently powerful consoles, and both have 16GB of shared RAM/VRAM, but how much RAM do the consoles put aside for the OS and other functionality, and how much is actually used for VRAM? My assumption is that these consoles probably won't utilize anything more than 8-10GB of VRAM in even the most extreme use case, i.e. 4K 30 FPS with RT enabled, and with consoles being the baseline for game development, it's probably safe to say that on the PC side we're not going to see anything greater than 12GB in most cases for at least another two or three years, unless we want 4K Ultra with RT enabled, and those settings are reserved for cards like the 4080/4090, and in some cases the 7900XTX.
The whole reason this VRAM thing started was because GoldenTiger thinks charging $100 more for more performance and VRAM is perfectly justified, ignoring that AMD has already provided more VRAM and performance for less. Intel's A770 has 16GB of VRAM and nobody mentions them. I'm sure 12GB of VRAM ought to be enough for a while, but that's not the point. Nvidia charging $600 for the 4070 is a slap to their consumers. The 3070 was overpriced due to crypto. The 2070 was overpriced because of Ray-Tracing that doesn't work. The GTX 1070 sold a lot, and the GTX 970 sold like crazy, even being #1 on Steam for a while. Nobody is going to buy the RTX 4070 for $600 because it's better than the RTX 3070; of course it's better. GPUs have been overpriced for a long time, and we hoped that by now Nvidia would have seen the light and started lowering prices. Nvidia saw their stock value is what they saw.
 
Just to add onto this:
Moore's Law is Dead has talked to Unreal devs about the vRAM problem. And they confirm: it's real.
If you're on 8GB vRAM: "expect that you'll be stuck at 1080p gaming soon".


(This is covered inside of 3 minutes on this video).
 
Just to add onto this:
Moore's Law is Dead has talked to Unreal devs about the vRAM problem. And they confirm: it's real.
If you're on 8GB vRAM: "expect that you'll be stuck at 1080p gaming soon".


(This is covered inside of 3 minutes on this video).

Or turn down something a notch and you'll be fine with 1440p still ;). Your 8gb card probably wouldn't be powerful enough to run settings requiring more vram anyway. This whole issue is way overblown and has happened before with every major ram jump.

Also, lol at the video comparing nonexistent rumors of RDNA 4 vs RTX 5000 :ROFLMAO:.
 
Or turn down something a notch and you'll be fine with 1440p still ;). Your 8gb card probably wouldn't be powerful enough to run settings requiring more vram anyway.
The 4070, as discussed in this thread, has a real chance at being stuck at 1080p. I know that Jensen is your Jesus and you're good with plebs spending $600 on a card that can't even do 1440p. But for the rest of us that don't take that position, nVidia is ripping off its mid-range customers.
This whole issue is way overblown and has happened before with every major ram jump.
That there have been cards left behind and hamstrung every time this has happened? We agree.
Also, lol at the video comparing nonexistent rumors of RDNA 4 vs RTX 5000 :ROFLMAO:.
MLID operates entirely on industry contacts. If you don't want to listen, that's on you.
 
The 4070, as discussed in this thread, has a real chance at being stuck at 1080p. I know that Jensen is your Jesus and you're good with plebs spending $600 on a card that can't even do 1440p. But for the rest of us that don't take that position, nVidia is ripping off its mid-range customers.

That there have been cards left behind and hamstrung every time this has happened? We agree.

MLID operates entirely on industry contacts. If you don't want to listen, that's on you.
The 4070 has 12GB; it is in no danger of being unable to hold 1440p at Ultra settings for the usable lifespan of its GPU performance. Please keep the personal insults to yourself; there's no need to be rude. This isn't class warfare. I couldn't care less what you can or can't afford.

You're putting words in my mouth. These cards ultimately get hamstrung by GPU performance in most cases, not VRAM.

MLID is a well-known "throw everything at the wall and see what sticks" rumor channel and a sensationalist for "news". P.S. You don't know who I am; I personally am a game dev and know these game requirements first-hand. I don't take third-hand info and sensationalize it for YouTube clicks. ;)

EDIT: 12GB will be fine while the 4070 has enough GPU grunt to support it. Nvidia isn't just throwing random amounts of VRAM on cards to tick off customers.
 
The 4070 has 12GB; it is in no danger of being unable to hold 1440p at Ultra settings for the usable lifespan of its GPU performance.
If it does. We'll see.
Please keep the personal insults to yourself; there's no need to be rude. This isn't class warfare.
You constantly insult people in your assessment of them.
MLID is a well-known "throw everything at the wall and see what sticks" rumor channel and a sensationalist for "news". P.S. You don't know who I am; I personally am a game dev and know these game requirements first-hand. I don't take third-hand info and sensationalize it for YouTube clicks.
Like here for instance where you defame someone directly and claim to know them. Don't pretend you don't do personal insults. You just prefer to be indirect.
You're putting words in my mouth.
k.
 
If it does. We'll see.

You constantly insult people in your assessment of them.

Like here for instance where you defame someone directly and claim to know them. Don't pretend you don't do personal insults. You just prefer to be indirect.

k.
I don't think MLID, a major YouTube channel, needs you to white knight for him. Address any of the facts, or are you just going to keep sarcastically trolling?
 
I don't think MLID needs you to white knight for him.
I don't think nVidia needs you to white knight for them.
Address any of the facts,
*refers back to video* If you don't agree with MLID's contacts, I guess we're done?
or are you just going to insult me with more random stuff?
Nothing in there was an insult. Literally all factual. You could make an argument that I did for my first response, but not the second.
Though if you looked at the actual analogy contained therein, I'm not wrong there either. Jensen can do no wrong in your eyes. If you have ever taken a position other than defending literally everything nVidia does, I haven't seen it; I have never seen a single criticism in any post from you.
 
I don't think nVidia needs you to white knight for them.

*refers back to video*

Nothing in there was an insult. Literally all factual.
Lol, yeah, saying I'm a raging fanboy of Jensen and have him as my religious idol isn't an insult :rolleyes:. I'm not white knighting Nvidia by providing facts about their video cards and my own opinions. You're angry because I don't agree with you? Then rebut it with evidence.

As far as the video goes, no one in the know takes it any more seriously than Linus Tech Tips for high-level topics. This is hardforum; I think most people expect a higher level of discussion here.

You'll just get the thread locked with your continued uncivil discussion.

EDIT: Address the topic, not the poster.
 
Though if you looked at the actual analogy contained therein, I'm not wrong there either. Jensen can do no wrong in your eyes. If you have ever taken a position other than defending literally everything nVidia does, I haven't seen it; I have never seen a single criticism in any post from you.
Not sure how much you follow these boards, but I was a crusader over the 3.5GB GTX 970 thing. Kyle even gave me a custom title over it, which was since removed in the general title purge. Go search "GoldenTiger GTX 970" on Google. I'll wait ;).
 
This is why I never liked DLSS and FSR. It has gotten to the point where we're expected to use these features and like it, even though they lower image quality. I have always dismissed, and always will dismiss, DLSS and FSR in any new GPU review. These are meant to help with the performance issues of Ray-Tracing, and as a bonus for those using older GPUs--not a feature to lean on because manufacturers are including less VRAM than they should.

The reason people complain is that they spent an ungodly amount of money only to find that VRAM is their limiting factor, because their 8GB card has the same VRAM as my Vega 56 from 2017.

The whole reason this VRAM thing started was because GoldenTiger thinks charging $100 more for more performance and VRAM is perfectly justified, ignoring that AMD has already provided more VRAM and performance for less. Intel's A770 has 16GB of VRAM and nobody mentions them. I'm sure 12GB of VRAM ought to be enough for a while, but that's not the point. Nvidia charging $600 for the 4070 is a slap to their consumers. The 3070 was overpriced due to crypto. The 2070 was overpriced because of Ray-Tracing that doesn't work. The GTX 1070 sold a lot, and the GTX 970 sold like crazy, even being #1 on Steam for a while. Nobody is going to buy the RTX 4070 for $600 because it's better than the RTX 3070; of course it's better. GPUs have been overpriced for a long time, and we hoped that by now Nvidia would have seen the light and started lowering prices. Nvidia saw their stock value is what they saw.
A while back I made a post on Reddit regarding the rumored 7000-series line-up. From the looks of it, AMD isn't changing step--the 7700XT will still have the same 192-bit 12GB setup, and it's supposed to be in line with the 4070. The 7800XT is where things get spicy: it's a bit slower than the 4070Ti but offers 16GB of VRAM. The problem with that is, if a game is utilizing anywhere near that level of VRAM for that tier of card, it's going to be hobbled big time. Benchmarks have already shown that in some games that actually used 12GB of VRAM, neither the 7900XT with 20GB nor the 12GB 4070Ti could maintain playable frame rates, let alone the 7800XT.

I'm going to continue to stand by my assertion that this VRAM debacle is overblown. Yes, I know Nvidia could have slapped more VRAM onto their 40-series offerings, not going to deny it, but cards like the 4070 and 4070Ti are staying in their lane by not pretending that more VRAM would make them last longer than they actually will. Yes, the 1% lows will be better, but when a game like CP2077, which only uses around 10.5GB at 1440p and maybe 11-12GB at 4K with RT on, is already taxing the shit out of a 4090, what chance does any other card really have, regardless of how much VRAM it has? And what good are the 1% lows going to be when you're pulling <30 FPS?

As for the 8GB culprit known as the 4060... it's probably going to be aimed at 1080p high-refresh-rate gaming; that's the only realistic assumption I could come up with. I didn't find anything on the AMD side that's comparable to the 4060, outside of maybe the 7600XT, which is also going to have 8GB of VRAM. It would honestly make a bit of sense: just as the 4070Ti can do some 4K gaming but is really aimed at high-refresh-rate 1440p gaming, it's safe to assume the 4060, which might also be capable of 1440p gaming, is more of a 1080p high-refresh-rate GPU, especially when paired with Frame Generation.

This just came to mind--maybe Nvidia's reasoning behind reducing the VRAM has to do with the productivity side of these cards. I'm sure they don't want people doing productivity work like machine learning, or anything else that needs a lot of VRAM, gobbling up a card like the 4070/4070Ti at such a low price compared to actual productivity cards; they'd rather push people interested in that kind of work, along with gaming, to their more expensive 4080 or 4090 cards. It's just a theory, honestly.

As for how well these cards will sell--they'll probably still be successful and sell quite a lot, not nearly as many as the 30-series for sure, but the 30-series benefitted from the crypto-boom, along with people coming into tons of disposable income when the pandemic hit and the US government was shelling out gobs of money in the form of stimulus checks.

Addendum: Don't get me wrong, I like AMD's offerings. I think they're great and I'm happy to see AMD having success, but I truly wish they'd focus more on feature sets and better RT performance, which would go a long way toward helping their cards sell, rather than just slapping a ton of VRAM on cards that couldn't realistically utilize that much in any capacity. The main reason I opted for a lower-VRAM card in the 4070Ti over the 7900XT is things like DLSS, better RT performance, Shader Execution Reordering, Reflex, etc.
 
Golden Tiger is one of the most solid and unbiased posters on this board. I've been here since 2003 and have never seen him insult anyone who didn't have it coming.
Technically he ranks up there with the best of them when it comes to the quality of his posts, so you sure as heck better bring your A game if you wish to spar with him. 😉
 
This just came to mind--maybe Nvidia's reasoning behind reducing the VRAM has to do with the productivity side of these cards.
Considering they barred AIB vendors from using blower coolers because those are so easily repurposed for servers or workstations, I think you're right on with that. It's probably not the only reason, but they do want to protect their workstation card margins.
 
Golden Tiger is one of the most solid and unbiased posters on this board. I've been here since 2003 and have never seen him insult anyone who didn't have it coming.
Technically he ranks up there with the best of them when it comes to the quality of his posts, so you sure as heck better bring your A game if you wish to spar with him. 😉
I appreciate that :). I try not to delve too into it, but sometimes conversations can get heated.
 
Your 8gb card probably wouldn't be powerful enough to run settings requiring more vram anyway.
I've been seeing this argument for years, but with absolutely nothing to back it up, and IME it's simply not true. For instance, at the beginning of the last crypto craze my 8GB RX 480 died and all I could find near MSRP was a 5600 XT, which is noticeably more powerful but has 2GB less VRAM. I found that I had to lower settings in several games because performance had tanked; in other, less VRAM-demanding games I was able to turn settings up. The fact that performance tanks instead of gracefully tapering off is also why I try to avoid the issue.

Now, when I got that 480* most people were recommending the 4GB version, or the 3GB version of the 1060 in that price range, and they were using the same argument: that by the time the extra memory is needed, the cards will be too weak. What actually happened, though, was that the 3GB card ran into issues almost immediately and the 4GB not long after. More recently I saw all of those cards benched in a current release, and the 8GB 480 was still doing quite well at a lower resolution, the 6GB 1060 was playable but significantly slower at the same settings, and the 3GB and 4GB cards were both at single-digit framerates.

*I couldn't stomach the price on the 1080 so I went cheap as a stopgap, in hindsight I should have grabbed a 1080 with a good warranty.
 
Golden Tiger is one of the most solid and unbiased posters on this board. I've been here since 2003 and have never seen him insult anyone who didn't have it coming.
Technically he ranks up there with the best of them when it comes to the quality of his posts, so you sure as heck better bring your A game if you wish to spar with him. 😉
He is a lot of things, but unbiased towards nVidia he definitely is not. In 20 years I have not seen him side against them one time. You can only make that statement if you share his nVidia bias, not because he's unbiased.
 
Gamers already have, but Nvidia is trying to play out the AI wank game to see if they can stay profitable while waiting to wear out gamers' patience. If Nvidia is profitable, then they have no reason to lower prices, but I have a feeling this AI gamble is going to lose big for Nvidia.

This logic is so bad I can't even. You're looking at it with tunnel vision, meaning you aren't looking past Nvidia and their own product launches. The RTX 4070 has 50% more RAM than who, themselves? The reason products get improved is that their competitors are improving. If you don't offer a better product for the price, then someone else will. The problem is that Nvidia has AMD and Intel as competitors, and most people here would write them off as a choice. AMD has been including more RAM than Nvidia for nearly a decade, and that has paid off, as anyone looking to play The Last of Us on Nvidia is going to have a bad time. Sure, newer, more expensive RTX cards have the VRAM, but this has been a thing for Nvidia forever. The RTX 4070 will have the same VRAM as AMD's 6700 XT.


Not necessarily true about 8GB GPUs--check out this video and notice the huge difference between 16GB and 32GB of system RAM in The Last of Us. How many people are rocking 16GB of system RAM? I upgraded to 32GB when I swapped my CPU out for a 3800X; I had 16GB previously. Even the new 24GB RAM kits will be close to struggling.

Edit: Also look at Hogwarts Legacy, same issue with 16GB of RAM, stuttering.
 
Absolutely! I was burned out on PC gaming for the past few years until this purchase. The 4090 has transformed my gaming experience to a level I didn't think possible. Games that were just OK at 1440p on a 27in monitor are now much more enjoyable to me due to the added immersion of NV Surround and the high refresh rates afforded by the 4090. The best games are made even greater because everything is amplified to the 10th degree at 4320x2560@165Hz. It's gaming bliss and I wish I'd done it sooner.....
I've always found games to be the deciding factor of my gaming enjoyment, not frames per second or visual settings.
 
I've always found games to be the deciding factor of my gaming enjoyment, not frames per second or visual settings.
Good for you! You have your methods for your own enjoyment and I have mine, as it should be when it comes to a hobby that is so subjective.
 
Not necessarily true about 8GB GPUs--check out this video and notice the huge difference between 16GB and 32GB of system RAM in The Last of Us. How many people are rocking 16GB of system RAM? I upgraded to 32GB when I swapped my CPU out for a 3800X; I had 16GB previously. Even the new 24GB RAM kits will be close to struggling.

Edit: Also look at Hogwarts Legacy, same issue with 16GB of RAM, stuttering.
That's not shocking at all, especially if you need a ton of assets loaded into RAM to send to the GPU. I've been using 32GB for years for game dev. My next setup will definitely have 64GB.
 
Oh, it didn't. I just decided not to entertain the argument since it has racist connotations ;).
I thought I was the woke one, TIL. Pretty tame joke, tbh. Also, you don't want to entertain it because I'm not wrong.
 
A while back I made a post on Reddit regarding the rumored 7000-series line-up. From the looks of it, AMD isn't changing step--the 7700XT will still have the same 192-bit 12GB setup, and it's supposed to be in line with the 4070. The 7800XT is where things get spicy: it's a bit slower than the 4070Ti but offers 16GB of VRAM. The problem with that is, if a game is utilizing anywhere near that level of VRAM for that tier of card, it's going to be hobbled big time. Benchmarks have already shown that in some games that actually used 12GB of VRAM, neither the 7900XT with 20GB nor the 12GB 4070Ti could maintain playable frame rates, let alone the 7800XT.
That depends on the game; as I've said, it can make the difference between playable and not. If Intel can include 16GB of VRAM on their A770, which costs much less than $400, then Nvidia can at least include that much on their $600 cards.
I'm going to continue to stand by my assertion that this VRAM debacle is overblown. Yes, I know Nvidia could have slapped more VRAM onto their 40-series offerings, not going to deny it, but cards like the 4070 and 4070Ti are staying in their lane by not pretending that more VRAM would make them last longer than they actually will.
Nobody goes by VRAM to determine what a GPU can do. Nvidia is just creating planned obsolescence, because you either lose a significant amount of performance or lower your image quality settings to fit within the VRAM.
Yes, the 1% lows will be better, but when a game like CP2077, which only uses around 10.5GB at 1440p and maybe 11-12GB at 4K with RT on, is already taxing the shit out of a 4090, what chance does any other card really have, regardless of how much VRAM it has? And what good are the 1% lows going to be when you're pulling <30 FPS?
Cyberpunk 2077 is a three-year-old game, and many new games are far more demanding.
As for the 8GB culprit known as the 4060... it's probably going to be aimed at 1080p high-refresh-rate gaming; that's the only realistic assumption I could come up with.
A GTX 1060 is already fine for 1080p gaming. We don't need more GPUs that are only fine for 1080p gaming.
I didn't find anything on the AMD side that's comparable to the 4060, outside of maybe the 7600XT, which is also going to have 8GB of VRAM.
If they do, then I'll be here to put them down, just like I've done with Nvidia.
This just came to mind--maybe Nvidia's reasoning behind reducing the VRAM has to do with the productivity side of these cards. I'm sure they don't want people doing productivity work like machine learning, or anything else that needs a lot of VRAM, gobbling up a card like the 4070/4070Ti at such a low price compared to actual productivity cards; they'd rather push people interested in that kind of work, along with gaming, to their more expensive 4080 or 4090 cards. It's just a theory, honestly.
If Nvidia could sell more cards by adding VRAM, they would have done it already. Nvidia doesn't want to give customers longevity with their products.
As for how well these cards will sell--they'll probably still be successful and sell quite a lot, not nearly as many as the 30-series for sure, but the 30-series benefitted from the crypto-boom, along with people coming into tons of disposable income when the pandemic hit and the US government was shelling out gobs of money in the form of stimulus checks.
Other than the 4090, I doubt any 40-series card is selling well. The 30-series sold well because of crypto and people being locked at home.
Addendum: Don't get me wrong, I like AMD's offerings. I think they're great and I'm happy to see AMD having success, but I truly wish they'd focus more on feature sets and better RT performance, which would go a long way toward helping their cards sell, rather than just slapping a ton of VRAM on cards that couldn't realistically utilize that much in any capacity. The main reason I opted for a lower-VRAM card in the 4070Ti over the 7900XT is things like DLSS, better RT performance, Shader Execution Reordering, Reflex, etc.
I don't care what you buy. AMD needs to learn a financial lesson as well, but for different reasons. Personally I don't care for Ray-Tracing, just because it seems useless. Half of Nvidia's GPU die is used for Ray-Tracing, which is a huge waste of silicon--so much so that Nvidia has invented all sorts of gimmicks to make use of these useless features. We fake graphics so well that Ray-Tracing isn't really needed. The more demanding games get, the worse the Ray-Tracing performance will be. If I did care about Ray-Tracing, Intel does a really good job at it. Something like Shader Execution Reordering is just you reading off of a wiki. DLSS is neat, but really meant to fix bad Ray-Tracing performance.
Golden Tiger is one of the most solid and unbiased posters on this board. I've been here since 2003 and have never seen him insult anyone who didn't have it coming.
Technically he ranks up there with the best of them when it comes to the quality of his posts, so you sure as heck better bring your A game if you wish to spar with him. 😉
Everyone has a bias, and anyone who tells you they don't is biased.
 
I'm gonna be part of the problem and get this on launch - partly due to Nvidia stock rising so drastically that I might as well get something out of it, and partly due to all the 8GB VRAM problems we've seen lately.

I have a buyer for my 3060ti lined up (needs to upgrade from his 970FTW anyway, and also so he can render some AI art locally/offline), so that money put towards this + credit card cash back should be able to get this for ~$373 in the end

Edit: because this way I stick to my own rule of never paying more than $525 for a card :^)
 
So if you sell your car and pick up a 5090 on launch you're still not paying more than $525 for a card? :)

Yeah, but I really shouldn't have ever had the 3060ti to begin with.

I had my 2070 and would have held onto that until now/the 4070 like I normally do (I normally upgrade every other gen), but I got greedy/stupid and sold it in Aug 2020 thinking "Oh I'll just pick up a 3070 at launch". Because I was doing a whole system rebuild then (3570k > 5950x).

But PC Apocalypse came along and it was a tough enough time getting my 5950x @ MSRP - I wasn't paying over MSRP for a GPU or camping inventory streams again like I did for my 5950x (which was really the priority cause I needed to get 4K UHD encodes underway that were waiting and clogging up all my spare HDD space, plus with my 2070 I was noticing CPU bottlenecking in some games). I also missed out on being able to sell the 2070 for double what I did.

I had a friend lend me his 1080 FTW to tide me over until I was able to get the 3060ti for MSRP just last April (fucking lol). Aside from just wanting to get that back to him, I really wanted to play Control finally (w/DLSS and ray tracing) and the Resident Evil 2/3 ray tracing updates were about to hit.

The plan then was to keep the 3060ti until the 5k series, but now with the 8GB VRAM issue going around - and it's been affecting me in the RE games (though I stand by this being shit/lazy optimization; you should release games for the market that exists, even with Nvidia sharing some of the blame, as I've stated in the RE4 thread) - I feel my hand is forced, and I'll go with the 50% increase in VRAM and the 4070, as was the original-original plan.

And until we see how Murphy's Law mucks this one up, the new-new plan is to hold onto the 4070 until the 6k series -_- Back on track, in a way, I suppose.

However, I'm not selling/shipping the 3060ti until I have the 4070 in hand this time. Some lessons have been learned at least :)
 
thinking "Oh I'll just pick up a 3070 at launch".

:)
Yup, quite a few of us were leaning into this kind of mindset... how were we supposed to know that the fundamental rules of commerce would change so drastically during that time? Luckily for me I didn't sell my GPU; unluckily for me, I'm still stuck with a GTX 970 because I refuse to pay day-one pricing for 2.5-year-old technology that is already being replaced by next-gen stuff. I am kind of peeking into the AMD arena though, since those AIB partners seem a little more reasonable about reducing pricing on old cards.
 
I'm going to continue to stand by my assertion that this VRAM debacle is overblown. Yes, I know Nvidia could have slapped more VRAM onto their 40-series offerings, not going to deny it, but cards like the 4070 and 4070Ti are staying in their lane by not pretending that more VRAM would make them last longer than they actually will. Yes, the 1% lows will be better, but when a game like CP2077, which only uses around 10.5GB at 1440p and maybe 11-12GB at 4K with RT on, is already taxing the shit out of a 4090, what chance does any other card really have, regardless of how much VRAM it has? And what good are the 1% lows going to be when you're pulling <30 FPS?
Nvidia ended up giving the 3060 more RAM than the 3070 precisely because they were cutting things unnecessarily close with their RAM. So not really agreeing with the "overblown" part.
 
Nvidia ended up giving the 3060 more RAM than the 3070 precisely because they were cutting things unnecessarily close with their RAM. So not really agreeing with the "overblown" part.
It was more likely due to the bus width and not wanting to give it only 6GB.
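For context on how the bus width constrains the capacity choices, here's a hedged sketch of the usual arithmetic: each GDDR6 chip sits on a 32-bit channel, and the common densities of this era are 1GB or 2GB per chip, so a given bus width only leaves a couple of sensible options.

```python
# Rough VRAM options implied by bus width: one 32-bit channel per GDDR6 chip,
# with 1GB or 2GB chips as the common densities. Clamshell mounting (chips on
# both sides of the PCB) can double these figures, but is rarer on mid-range cards.
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32
    return [chips * d for d in densities_gb]

print(vram_options(192))  # 192-bit (3060 / 4070-class): [6, 12] GB
print(vram_options(256))  # 256-bit (3060 Ti / 3070-class): [8, 16] GB
print(vram_options(128))  # 128-bit (4060-class): [4, 8] GB
```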
 
So you just wouldn't use your pc without a 4090?
True. We are avid league bowlers, and they just opened a Main Event here; guess what bowling costs? $80. Insane--my son said it was cheaper to get an hour of arcade gaming. The cost of my 3080 purchase was $1,100; over 365 days, that's about $3 a day.
 