NVIDIA GeForce RTX 4070 Reviews

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,910
The reviews are live and it's looking promising?



NVIDIA GeForce RTX 4070 Founders Edition Review

ASUS GeForce RTX 4070 Dual Review

MSI GeForce RTX 4070 Ventus 3X Review

PNY GeForce RTX 4070 Review

Gainward GeForce RTX 4070 Ghost Review

 
If you buy a $700 card with 12gb on it... your stupid.
As Hardware Unboxed showed the other day... 8GB Nvidia 3000-series cards are getting handily beaten in newer titles by older AMD cards with 16GB of RAM in RAY TRACING performance. Even without RT enabled, some of the old 8GB cards are essentially unplayable at 1080p ultra and in some cases HIGH settings.

12GB right now is setting yourself up for tears. Game developers have hit a point where they have all said fuck it... 16GB is the target now; the consoles can handle it, and we are not spending a year of development trying to squeeze decent-looking texture packs into 8GB, or finding invisible load points to swap textures in and out.
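For a rough sense of why modern texture packs blow past these budgets, here's a back-of-envelope sketch. The numbers are purely illustrative, not measurements from any real game; it assumes BC7-style block compression at ~1 byte per texel and a full mip chain, which adds about a third on top:

```python
# Back-of-envelope VRAM math for streamed textures.
# Illustrative numbers only -- not measurements from any real game.

def texture_bytes(width, height, bytes_per_texel, mips=True):
    """Approximate VRAM for one 2D texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mips else base

# One 4096x4096 texture, BC7-compressed (~1 byte per texel), with mips:
one = texture_bytes(4096, 4096, 1)
print(f"{one / 2**20:.1f} MiB")        # ~21.3 MiB per texture

# A scene keeping ~500 such textures resident:
print(f"{500 * one / 2**30:.1f} GiB")  # ~10.4 GiB for textures alone
```

At that rate a few hundred high-resolution textures alone eat most of a 12GB card before geometry, render targets, and RT acceleration structures get a byte.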
 
16GB is not the target for the new consoles. ~4GB is reserved for OS use.
 
Idk. It just doesn't feel good. The 3070 made a big splash by being "2080ti performance, but for $500!"

Now you get not-quite-3080 performance for.....$600. Oh, but you get DLSS 3. I guess if you're going to make a ton of use out of that?
 
Whew, thank God I'm buying the $600 FE then, that almost applied to me 😅😮‍💨👍

Edit: It's you're* BTW 👍
If you have a Best Buy credit card, you should be able to use code mar24emob25 at checkout for 10% off. And you should also have an offer for 10% on any purchase made on your Best Buy credit card you can activate. So the FE and the other $600 models will be $540 + tax plus $50 in Best Buy rewards. That's effectively $490.
 

Damn I don't - I was gonna just use my PayPal card for 3% off when checking out via PayPal - I don't have any 5% category cards that could apply to the purchase ATM otherwise I'd use them

I could try applying today, but I just got 2 new cards last month so I doubt I'll get approved - won't hurt to try though, credit score will just bounce back eventually from taking the hit of applying

Thanks for the heads up 👍👍👍

Edit: I also mentioned in another thread I'm selling my 3060ti to cover some of the costs too - got a buyer already - so I'm effectively paying ~$373 IIRC with that money applied to the purchase after tax as is
 
16GB is not the target for the new consoles. ~4GB is reserved for OS use.

It's 2GB on the PS5 and 2.5GB on the Xbox. They both have RAM reserved for background tasks now.
I notice no one reviewing these today is using any newer titles.

Watch Hardware Unboxed's 8GB vs 16GB last-gen card comparison. In more recent games, AMD's older 16GB cards are using almost 15GB of textures even at 1080p ultra. I notice today no one is using any of the current crop of games that are forcing previously high-end 8GB cards to run at medium settings to remain playable. Only a few years on... and AMD's last-gen crap-RT cards are outperforming Nvidia's last-gen great-RT cards with RT enabled.


Anyway my only point is... 8GB has clearly become a no-go for ultra settings at this point, and even high and medium settings are an issue with some of the latest games. Nvidia has hobbled these 4070 cards with 12GB. It probably is fine right now... I don't believe 12GB is going to age very well.
 

even 10gb is starting to be an issue. I feel at this point you need 16gb or more to at least future proof yourself a bit. I'm sitting on a 10gb 3080 and I'm really wanting to upgrade just to get more vram.
 

They typically don't review new games though. They want stable games to compare against, where they can use historical data rather than re-testing, and they also don't want to use games like TLOU because of all its technical issues.
 

The technical issues don't exist on cards with a proper amount of VRAM.

For what both companies are asking for mid-range cards these days... I want to know it's going to actually be able to play a game released NOW. I don't care how well they play 4-year-old games like AC Odyssey; I mean, my 5700 XT can handle that game. I want to know how a new GPU handles games my 3-year-old cards are just now starting to stumble on. It's very easy to see multiple newer games that are pushing 15GB in use at 1080p. I wanna see those games reviewed before I would consider pulling the trigger on a 12GB card. (Even if they have to add a bunch of *s and list version numbers to the results.)
 
If you play at minimum graphics settings at 1080p and never use any user-made mods that increase graphics, this card will probably work OK for a normal period of time. Plan on going above 1080p at any point with it? Planning on turning up graphics at any point? Planning on adding user-made mods that include graphics uplifts? Anything less than a 16GB card is a useless waste of money.
 
That's pretty ridiculous. You're basically saying that if you go nvidia it's 4080/4090 or don't bother.
 
I hope that holds true... newer games are using 15GB of VRAM at 1440p, and the same at 1080p with RT enabled.
I don't play anything released in the last two years - do you mind listing off those more recent titles that are using 15GB of VRAM at 1440p? Not feeling ambitious enough to look them up myself. :whistle:
 
Yep, exactly. Nvidia is a waste of money for anything less than a 4080 if you're buying a card in 2023 at new prices, unless you plan to play under the conditions I mentioned. AMD (6800 or higher, last gen or this gen) is the better option if you can't afford a 4080. And the 7900 XTX is a better option than the 4080 if you are not using any of the features Nvidia dominates (VR, RT, DLSS, Reflex, G-Sync module). I don't like the prices on either side. If my current 2070 Super wasn't having problems, I would have waited yet another generation to see if the VRAM and price situations got any better. Instead I had to get a "discounted" RX 6800 from Micro Center - the very last one they had.
 

Hogwarts Legacy, The Callisto Protocol, The Last of Us Part I, the new Resident Evil (which Hardware Unboxed just showed an 8GB 3070 crashing on at 1080p), the new Plague Tale game. No doubt there are more... and many more on the way. Every one of those will use well over 12GB of VRAM at 1080p ultra and 1440p high.

I would not go so far as to call shenanigans on all the major reviewers today for not including games like Hogwarts, which is one of the most popular games going at the moment... but I do have to wonder why no one has covered any of this year's big eye-candy titles. I seem to remember a time when reviewers didn't ONLY look at 2+ year old games in a review of a brand-new GPU. Sure, perhaps they mention with an * "hey, we used patch X.X"... but to ignore all games released in the last two years seems odd to me.
 
Yet you can play Hogwarts Legacy fine on all lows on a GTX 750 Ti and still keep 30fps, and that only has 1GB of VRAM... 20GB of consumed system RAM though.
A 2060 6GB does fine too, keeping to the high 50s/low 60s on ultra, again with 20+ GB of consumed system RAM when playing Hogwarts.
I see a lot more need for system RAM there than GPU RAM.
 

You can play almost any game if you're willing to turn everything down to low, sure. :)
I hope people buying a card that is selling for north of a grand where I live... are going to be OK with having to turn games released at the same time as their new GPU down to medium settings.
I don't know, perhaps 12GB will be enough to handle medium settings in games coming out later this year and next. It's a gamble though.
 

Why are people trying to run ultra settings with 8GB cards anyway?
 
Yet you can play Hogwarts Legacy fine on all lows on a GTX 750 Ti and still keep 30fps, and that only has 1GB of VRAM... 20GB of consumed system RAM though.
A 2060 6GB does fine too, keeping to the high 50s/low 60s on ultra, again with 20+ GB of consumed system RAM when playing Hogwarts.
I see a lot more need for system RAM there than GPU RAM.
I have a 750 Ti. I believe it has 2GB. I should try it though... lol.
 