NVIDIA GeForce RTX 4070 Reviews

Don't believe big red's lies; the cards are based on the same die.
I have zero control over that. What I do have control over is my wallet. If I can find an XTX for a good price, which seems to be getting easier, I'll pull the trigger on it.
 
Need a fact check on this graphic, not sure where it originally comes from:

[attached graphic]
Not sure the percentages, but sounds about right.

970 was roughly equivalent to a 780 Ti - for $329
1070 was equivalent performance to a 980 Ti / Titan X - for $379
2070 was a bit worse than a 1080 Ti - for $499
2070 Super was roughly equivalent or beat a 1080 Ti - for $499
3070 was equivalent performance to a 2080 Ti - for $499
4070 is roughly equal to, but can still lose to a 3080 (regular, not Ti) - for $599

So not only is the 4070 not even matching the performance characteristics of previous 70-class GPUs, Nvidia also has the gall to charge at least $100 extra for it compared to those. For many generations now, x70 cards have always met or beaten the previous generation's flagship. Here it trades blows with a 3080. Big whoop.

4070 and 4070 Ti are misbranded. 4070 Ti should be the 4070 (for 4070 prices), and the 4070 is really a 4060 Ti...for $600.

EDIT: Just checked, and my local MC is filled to the brim with 4070s still. Hell, tons of 4090s readily available too.
 
Not sure the percentages, but sounds about right.



EDIT: Just checked, and my local MC is filled to the brim with 4070s still. Hell, tons of 4090s readily available too.
They should be; Nvidia front-loaded the launch, so that is probably the entire supply of them for the next calendar year. TSMC has had to do some weird things with the N4 production schedule to accommodate Apple: TSMC had to push back the N3 run to fix issues, so Apple's summer/fall production is happening now on N4, and they hope to move it back to N3 for fall, which screws up their numbers for the October announcements. But that's a whole other thing.

Point is all the 4070’s you see have to last well into 2024.
 
It's not just crashing on the 8GB 3070; it's also textures popping in and out in multiple games. It's not just one game.
Yes, of course you can just reduce the quality settings and use less VRAM. The point is... Nvidia is selling cards with just enough RAM to get them through 1-2 years of use at the quality settings people would expect to be able to use. Yes, a 3070 isn't a 90-class card... but Nvidia is selling these "mid" range cards for what we used to pay for halo products. Cards people are going to have to drop to medium settings in some games within a year of the GPU's launch. I don't know, I don't ever want to pay that much for a GPU that can't handle at least high settings a year out. But everyone will have to make their own value judgment on that one.

As for the CRAP PORT... shitty developer excuses. That is NOT the case. If you haven't been paying attention, games have taken a jump up in texture quality over the last year or so. Most Nvidia buyers, IMO, will expect the 4070s they paid a lot of money for to handle at least high settings for a year or so. When they can't, and they stutter, texture pop, or crash... sure, people will yell OPTIMIZE this and complain about bad ports, instead of getting annoyed with Nvidia, who is selling GPUs into the higher mid range knowing full well they are going to run into those problems. Nvidia works very closely with developers; they are fully aware 12GB is not enough anymore. 12GB belongs on 60-class cards, not 70-class. These should have had a minimum of 16GB. When a PS5 can access 15GB of texture RAM, why should game developers jump through a ton of hoops to squeeze their high-quality texture packs into 12GB? At the least, I hope more developers start making it very clear to gamers what to expect. They should put it in their minimum system requirements: high-quality textures require 12GB minimum, and ultra quality requires 16GB minimum. (Not that many gamers actually look at requirement lists.)
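For rough scale on how fast high-quality textures eat VRAM, here's a back-of-the-envelope sketch. The 4K texture size, 1 byte/texel BC7 compression rate, ~33% mip-chain overhead, and 500-texture resident count are my illustrative assumptions, not figures from any specific game:

```python
# Rough, illustrative VRAM estimate for a modern texture set.
# Assumptions: 4K (4096x4096) textures, BC7 block compression at
# ~1 byte per texel, and a full mip chain adding ~33% on top.

def texture_mib(size_px: int, bytes_per_texel: float = 1.0,
                mip_overhead: float = 1.0 / 3.0) -> float:
    """Approximate size of one compressed texture, mips included, in MiB."""
    base = size_px * size_px * bytes_per_texel
    return base * (1.0 + mip_overhead) / (1024 ** 2)

per_texture = texture_mib(4096)          # ~21.3 MiB each
budget_gib = 500 * per_texture / 1024    # 500 resident textures
print(f"{per_texture:.1f} MiB per texture, {budget_gib:.1f} GiB for 500")
```

That lands around 10.4 GiB for textures alone, before framebuffers, geometry, and BVH data, which is how a 12GB card runs out of headroom at high settings.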
Texture loading/unloading is one of the core functions of a game engine. 'Shitty developer excuses'... uh, I would say that is a proven fact. Case in point: the majority of games function just fine on X amount of VRAM. Cyberpunk 2077 is a 100GB install; they sure as hell can't load everything into VRAM all at once!! Oh no! GPU obsolete and performance woes, oh my!

(not)

Developers/engines know how to do this; fitting the game into the available hardware (which can be 1,000 different configurations on a PC, compared to a console) is a requirement, or you get shit performance, just like we've seen.

I like how you completely ignored that the issue with The Last of Us was fixed; 8GB cards play fine now.

The PS5 has 16GB total RAM; a PC with a 3080 has 8GB of higher-speed VRAM plus 16 to 64GB of system RAM. It's MORE POWERFUL with MORE RAM, but your dev can't make it work because reasons?? Why are you making excuses for them?
To be fair, the vast majority of games having these VRAM issues are AMD-sponsored titles (RE4 remake, TLoU Part 1, etc.). You don't see these issues much with Nvidia-sponsored games (which are the vast majority) or games that aren't sponsored by either.
How many games even have this issue??

If that's true, it sounds like AMD is purposely hurting competitor performance with whatever the AMD sponsored-game program is at the moment... the same thing Nvidia was accused of with HairWorks.

Someone fact check that. If true, forums will have a meltdown...

Seriously though, games can run on 8GB just fine, especially with ReBAR, which makes for more efficient RAM-to-VRAM transfers. "But the PS5 doesn't have ReBAR, so I can't expect games developed for consoles to play for shit without just having all the RAM, for the things..." Lame.
 
Texture loading/unloading is one of the core functions of a game engine. 'Shitty developer excuses'... uh, I would say that is a proven fact. Case in point: the majority of games function just fine on X amount of VRAM. Cyberpunk 2077 is a 100GB install; they sure as hell can't load everything into VRAM all at once!! Oh no! GPU obsolete and performance woes, oh my!

(not)

Developers/engines know how to do this; fitting the game into the available hardware (which can be 1,000 different configurations on a PC, compared to a console) is a requirement, or you get shit performance, just like we've seen.

I like how you completely ignored that the issue with The Last of Us was fixed; 8GB cards play fine now.

The PS5 has 16GB total RAM; a PC with a 3080 has 8GB of higher-speed VRAM plus 16 to 64GB of system RAM. It's MORE POWERFUL with MORE RAM, but your dev can't make it work because reasons?? Why are you making excuses for them?

How many games even have this issue??

If that's true, it sounds like AMD is purposely hurting competitor performance with whatever the AMD sponsored-game program is at the moment... the same thing Nvidia was accused of with HairWorks.

Someone fact check that. If true, forums will have a meltdown...

Seriously though, games can run on 8GB just fine, especially with ReBAR, which makes for more efficient RAM-to-VRAM transfers. "But the PS5 doesn't have ReBAR, so I can't expect games developed for consoles to play for shit without just having all the RAM, for the things..." Lame.

So you are saying they've had a patch in the last 9 days... perhaps Hardware Unboxed should update if that is the case. Did they patch the exact same issue in Hogwarts Legacy that was apparently in the game 9 days ago? As well as Forspoken, The Callisto Protocol, and A Plague Tale?


You completely misunderstand how the PS5 works. I am not suggesting it's more "powerful." It does, however, have more VRAM than an 8GB card, and more than a 12GB card as well. Consoles are not PCs, even though they use some parts that are close. The PS5 has 512MB of RAM that runs the OS. The PS5 also has dedicated decompression hardware that PCs don't have. It's my opinion, as a layperson and not a developer, that it may be that bit that causes some of the issues. ReBAR isn't needed on a console, as consoles have a completely different method of texture decompression and loading. The console can load textures faster than a PC... I know, sacrilege, but it's true at the moment. It's my understanding the PS5 has approximately 15GB of usable VRAM.

Over the next few years, cards that can load enough texture data into VRAM early are going to have an advantage over cards that have to rely on ReBAR. Developers have always targeted consoles, we all know that, and right now a console has a superior VRAM setup versus any card with 12GB or less. It's just a sad fact, thanks to Nvidia dicking around with VRAM pools to save a few bucks.
 
So you are saying they've had a patch in the last 9 days... perhaps Hardware Unboxed should update if that is the case. Did they patch the exact same issue in Hogwarts Legacy that was apparently in the game 9 days ago? As well as Forspoken, The Callisto Protocol, and A Plague Tale?


You completely misunderstand how the PS5 works. I am not suggesting it's more "powerful." It does, however, have more VRAM than an 8GB card, and more than a 12GB card as well. Consoles are not PCs, even though they use some parts that are close. The PS5 has 512MB of RAM that runs the OS. The PS5 also has dedicated decompression hardware that PCs don't have. It's my opinion, as a layperson and not a developer, that it may be that bit that causes some of the issues. ReBAR isn't needed on a console, as consoles have a completely different method of texture decompression and loading. The console can load textures faster than a PC... I know, sacrilege, but it's true at the moment. It's my understanding the PS5 has approximately 15GB of usable VRAM.

Over the next few years, cards that can load enough texture data into VRAM early are going to have an advantage over cards that have to rely on ReBAR. Developers have always targeted consoles, we all know that, and right now a console has a superior VRAM setup versus any card with 12GB or less. It's just a sad fact, thanks to Nvidia dicking around with VRAM pools to save a few bucks.

As somebody with a PS5 developer kit, I can tell you right now that you end up with at best a 4/12 split, though usually 6/10.

512MB runs the OS, but you still need memory to run the game. Even a PS4 title ends up being a 4/4 split.

The console has absolutely insane load speeds for textures; it is sad how much faster it is than a PC at loading them. And since the console uses pooled memory, the CPU and GPU don't need to play the "load to RAM, then pull from RAM to VRAM" game. This is where the Agility SDK comes in, but suffice to say the single memory pool cuts back on a lot of latency and simplifies the process of loading textures. (Google the Kraken decoder and Oodle Texture for details on how the PS5 deals with textures and decompression; it's fucked up how fast that is.)

Lots of the console titles to date aren't handling this aspect correctly in their PC ports; whether it's willful negligence, incompetence, or budget issues, I can't say.

But most of the games you are talking about here play on a Steam Deck, and look good in the process.

It should also be noted that the texture popping people associate with low VRAM also plagues the 7900 cards when playing in Proton, and there it was related to system RAM: 16GB is not enough, and you need closer to 32GB minimum to correct it. The Linux forums attribute all of these to severe memory leaks on both the GPU and system side, which points back to the previously mentioned porting problems.

That said, 10GB is the minimum acceptable VRAM amount for 1440p and up, and it will only grow from there. You can get away with less if drivers and developers are very on the ball, but recent examples show that at least one of them has fallen off.
 

Would be great if true, but nothing shows an immediate price cut. The article mentions specific models that were $600 still being $600 as a "price drop". Not exactly a big savings. MSI Ventus OC and ASUS Dual OC were $600 day 1.

If the ASUS Dual and similar do end up $550 that would be great, but the article is saying we have to wait until May and that might not even reflect actual retail prices. I suppose it depends on what stipulations Nvidia adds to the rebate. Maybe Nvidia will force them to lower some models to $550 in order to receive them? If they don't, I assume the AIBs will pocket some of the difference and lower prices to $570-580.
 
The PS5 has 16GB total RAM; a PC with a 3080 has 8GB of higher-speed VRAM plus 16 to 64GB of system RAM. It's MORE POWERFUL with MORE RAM, but your dev can't make it work because reasons?? Why are you making excuses for them?
Don't forget the PS5 is running games on low settings, with no or heavily gimped ray tracing that can run on two-generation-old AMD GPUs (so I'm guessing none at all).

Edit: and most likely using FSR/upscaling to hit HD resolutions.
 
Pair that with this:
https://www.digitimes.com/news/a20230420PD211/tsmc.html (paywalled)
https://www.reuters.com/technology/tsmc-q1-profit-rises-2-yy-beats-market-expectations-2023-04-20/

TL;DR:
TSMC revenue is down and their prices are up.
Apple, AMD, and Nvidia have cut their orders back significantly.
Apple paying through the nose for N3 is cited as shielding TSMC from the "broader market downturn."


Oof, bad time to be selling silicon, that's for sure.
Not a great time to need to buy it, either.
 


Fun little video to watch explaining why the 4070 isn’t actually overpriced just because.

He's wrong on the TSMC wafer costs. N5 was raised to just short of $17,000, but Nvidia is using a custom N4 node, so they are paying more than the base N5 cost for sure, though not as much as the $21,000 TSMC is charging for N3.
TSMC also let their customers know there is another price increase coming in the second half of 2023, by around 6%, to cover inflation. So come October, that same $17,000 wafer is going to be $18,000, not the $16,000 he used in his math. And since N4 sits between N5 and N3, let's split the pricing difference and say Nvidia's N4 wafers will cost them about $20,000 come July.
Nvidia would be using that number when pricing the cards, because they can't raise the MSRP 4 months after launching the card.

So if we ever needed a reason for Intel to get back in the game and on par with TSMC, it's this right here: TSMC's wafer costs have more than doubled since 2016.
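For a sense of what a wafer price like that means per chip, here's a rough sketch using the ~$20,000 N4 estimate above. The ~295mm² AD104 die size and the 90% yield are my illustrative assumptions, and the dies-per-wafer formula is the common edge-loss approximation, not TSMC's actual numbers:

```python
import math

# Rough per-die cost at an assumed ~$20,000 N4 wafer price.
# Die size (~295 mm^2, roughly AD104) and 90% yield are illustrative.

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Gross die estimate: wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

gross = dies_per_wafer(295.0)                # ~200 candidate dies per wafer
cost_per_good_die = 20_000 / (gross * 0.90)  # assumed 90% yield
print(f"~{gross} dies/wafer, ~${cost_per_good_die:.0f} per good die")
```

So even at those wafer prices, raw silicon is on the order of $100 per die; the rest of the card's cost is memory, board, cooler, margins, and everything else.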
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
Looks like anyone who bought an RTX 4070 on day one will get screwed over on pricing. Boy, this reminds me of when Nvidia had to refund people $50 for each card back in the day. I can't remember which card it was (GeForce 3000/4000 series back in the day?)
 
Looks like anyone who bought an RTX 4070 on day one will get screwed over on pricing. Boy, this reminds me of when Nvidia had to refund people $50 for each card back in the day. I can't remember which card it was (GeForce 3000/4000 series back in the day?)
We got money back on the 280s.
 
The PS5 has 16GB total RAM; a PC with a 3080 has 8GB of higher-speed VRAM plus 16 to 64GB of system RAM. It's MORE POWERFUL with MORE RAM, but your dev can't make it work because reasons?? Why are you making excuses for them?
They aren't making excuses; you're just flat-out wrong here. The memory is unified on a PS5; the memory on a PC is not. Going out to system memory on a PS5, you have 448GB/s of bandwidth to work with, whether it's being used as VRAM or by the engine. To make the same trip to system memory on a 13900K, you have 89.6GB/s of bandwidth. So when you fill up the 8GB of RAM on your video card and it has to go out to system memory, its bandwidth gets cut by roughly a factor of five, and you can't understand why this would pose a problem?
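To put numbers on that bandwidth cliff: 448GB/s is the PS5's published GDDR6 bandwidth, and the 89.6GB/s figure corresponds to dual-channel DDR5-5600 (my assumption for how that number was derived for the 13900K):

```python
# Back-of-envelope on the VRAM-spill bandwidth penalty described above.
# PS5 unified GDDR6: 448 GB/s. Dual-channel DDR5-5600 system memory:
# 2 channels x 8 bytes/transfer x 5600 MT/s = 89.6 GB/s.

ps5_unified_gbps = 448.0
ddr5_5600_gbps = 2 * 8 * 5600 / 1000   # 89.6 GB/s

penalty = ps5_unified_gbps / ddr5_5600_gbps
print(f"Spilling from VRAM to system RAM costs ~{penalty:.0f}x bandwidth")
```

The ratio works out to 5x, which is why textures that overflow an 8GB card stutter instead of just streaming a bit slower.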
 
They aren't making excuses; you're just flat-out wrong here. The memory is unified on a PS5; the memory on a PC is not. Going out to system memory on a PS5, you have 448GB/s of bandwidth to work with, whether it's being used as VRAM or by the engine. To make the same trip to system memory on a 13900K, you have 89.6GB/s of bandwidth. So when you fill up the 8GB of RAM on your video card and it has to go out to system memory, its bandwidth gets cut by roughly a factor of five, and you can't understand why this would pose a problem?
The PS5, and many but not all of its games, use the Kraken compression algorithms, which are lossless, with high compression ratios and "super fast" decode speeds. Those tools were developed, and licensed to Sony, by RAD Game Tools, which was purchased by Epic in 2021. Those algorithms and formats allow the PS5 to decompress textures on the fly from NVMe storage, so while it may only have a 5.5GB/s raw transfer speed, when loading textures it is closer to 17GB/s.

Nvidia has RTX IO, and AMD has Smart Access Storage, but even those top out around 7GB/s when dealing with a PCIe 4.0 NVMe drive, so less than half of the PS5's best.
The PS5's ability to move data into memory and clear it out is just insane. It is honestly why porting from it to PC is as difficult as it is: it is one of the few cases where the best gaming PCs just can't keep up.
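A quick sketch of the average compression ratio implied by the two figures quoted above (5.5GB/s raw NVMe reads expanding to ~17GB/s of usable data; the numbers are from the post, the arithmetic is just mine):

```python
# Implied compression ratio behind the PS5 streaming numbers above:
# ~5.5 GB/s raw NVMe throughput decompressing to ~17 GB/s effective.

raw_gbps = 5.5
effective_gbps = 17.0

ratio = effective_gbps / raw_gbps
print(f"Implied average compression ratio: ~{ratio:.1f}:1")
```

That ~3:1 ratio is plausible for lossless texture compression, which is the whole point of the Kraken/Oodle pipeline.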
 
They aren't making excuses; you're just flat-out wrong here. The memory is unified on a PS5; the memory on a PC is not. Going out to system memory on a PS5, you have 448GB/s of bandwidth to work with, whether it's being used as VRAM or by the engine. To make the same trip to system memory on a 13900K, you have 89.6GB/s of bandwidth. So when you fill up the 8GB of RAM on your video card and it has to go out to system memory, its bandwidth gets cut by roughly a factor of five, and you can't understand why this would pose a problem?
It's only a problem if you use more than the 8GB of VRAM on your card. If unified memory were an advantage, PCs would be using it. It's a cost-cutting method, and one that hurts overall performance. Again, the CPU and the GPU have different memory requirements beyond bandwidth: the CPU favors low latency, while the GPU favors bandwidth, and because more bandwidth usually comes with higher latency, we have both DDR and GDDR. This is why the PS5 performs more like a GTX 1060 to 1070 Ti at best than anything modern. As a reminder, the PS5 has a total of 16GB of memory, while your GPU has half that much just for graphics.

Nvidia has RTX IO, and AMD has Smart Access Storage, but even those top out around 7GB/s when dealing with a PCIe 4.0 NVMe drive, so less than half of the PS5's best.
The PS5's ability to move data into memory and clear it out is just insane. It is honestly why porting from it to PC is as difficult as it is: it is one of the few cases where the best gaming PCs just can't keep up.
Lies; as this game designer will tell you, these PS5 games could be done even on a PS3, let alone a PC. The whole PS5 NVMe speed advantage is hilariously overblown.
 
Looks like they are giving away the Steam cards for all 40 series, not just the 4070.

[attached screenshot of the promotion]
 
All the talk about ever-increasing VRAM needs for some current and likely next year's games has a lot to do with it also. If the 4070 were the same GPU with 16GB of VRAM, they would fly off the shelf at $599. Personally, that's where I'm at: I have a 16GB card now, and the only upgrade I would consider is one with 16GB or more. So 7900 XT, XTX, 4080, and 4090, which by most people's opinion are also too expensive. 12GB cards have to be $500 or less anymore for people to bite, IMO. Next gen, I'm thinking 12GB will be for starter cards like the 3060 now, 16GB will be mid, and 20GB+ will be high end. In fact, the price drops on all the AMD 16GB cards are the right moves at the right time. Nvidia has no compelling cards with lots of VRAM at compelling price points right now. The 4070 is a $450-500 card and not a dime more.
 
WOW, it's been a long time since I have seen Nvidia desperate to sell video cards... It's funny, because the older-generation 6950 XT is a way better deal than the 4070.
It's a Microcenter promotion; Microcenter is an independent retailer that virtually no one has near them, and it's not Nvidia doing the promotion. They're giving away DDR5 with AMD CPUs too as their own promo; that doesn't mean those aren't selling well either.
 
and is an independent retailer, not Nvidia doing the promotion.
This is true. However, it's the first time I've seen MC put a big promo on Nvidia cards, especially ones so new. Just like the AM5 parts, it shows that they're not moving off shelves nearly fast enough. This is a larger promo than even the ones on older 6950 XTs (currently $50 off with purchase of a CPU).
 
This is true. However, it's the first time I've seen MC put a big promo on Nvidia cards, especially ones so new. Just like the AM5 parts, it shows that they're not moving off shelves nearly fast enough. This is a larger promo than even the ones on older 6950 XTs (currently $50 off with purchase of a CPU).
Pre-pandemic, they weren't uncommon. It's just been a few years.
 
Pre-pandemic, they weren't uncommon. It's just been a few years.
I have no way to verify what sort of deals they had back then, but I don't remember any big "only Nvidia" ones. It doesn't change the fact that if these were selling well, they wouldn't be giving you $100 in Steam credit on top. They're following the AM5 pattern, realizing that a lot of people aren't willing to pay full price for the cards. It really just echoes the same sentiment seen here and elsewhere: "it should be $100+ less."

People can cope about how "Oh inflation" and "oh, value vs last gen flagship", but that's just how it is. Consumers aren't liking the price tag and MC is trying to do something about it. Same complaints with going full AM5 when it launched.

*Edit* Makes me doubly glad to have a MC nearby. They actually have to address low sales and drum up customer enthusiasm. Nvidia will just try to wait you out.
 
I have no way to verify what sort of deals they had back then, but I don't remember any big "only Nvidia" ones. It doesn't change the fact that if these were selling well, they wouldn't be giving you $100 in Steam credit on top. They're following the AM5 pattern, realizing that a lot of people aren't willing to pay full price for the cards. It really just echoes the same sentiment seen here and elsewhere: "it should be $100+ less."

People can cope about how "Oh inflation" and "oh, value vs last gen flagship", but that's just how it is. Consumers aren't liking the price tag and MC is trying to do something about it. Same complaints with going full AM5 when it launched.

*Edit* Makes me doubly glad to have a MC nearby. They actually have to address low sales and drum up customer enthusiasm. Nvidia will just try to wait you out.
Absolutely, nothing is selling well right now.
If it were, AMD, Nvidia, and Apple wouldn't have cut their TSMC orders back or stopped them altogether.
Big stores like Microcenter do not have the luxury of sitting on these, because their value will only get worse over time.
 
It's a Microcenter promotion; Microcenter is an independent retailer that virtually no one has near them, and it's not Nvidia doing the promotion. They're giving away DDR5 with AMD CPUs too as their own promo; that doesn't mean those aren't selling well either.

Microcenter always has good deals, but they are in-store only. Too bad the closest one is an 8-hour drive from me. The card can be selling poorly, but this isn't exactly unheard of for Microcenter.

If Best Buy, Newegg, Amazon, etc. match this deal with something similar, that would be great.
 