NVIDIA GeForce RTX 4070 Reviews

You can play almost any game if you're willing to turn everything down to low, sure. :)
I hope people buying a card that is selling for north of a grand where I live are going to be OK with having to turn games released at the same time as their new GPU down to medium settings.
I don't know, perhaps 12GB will be enough to handle medium settings in games coming out later this year and next. It's a gamble, though.
8GB will be fine for ultra at 1440p and below until some time in 2027; that's when we can expect the next generation of engines and feature sets to become an actual thing.
Most of the huge VRAM usage we have been seeing lately at launch comes from the vast differences in how the consoles and the PC load assets into memory and how that memory is managed.
Microsoft's Agility SDK will help normalize this by taking that job away from the developers and emulating the console memory heaps on the PC.
But right now usage is mostly a bad-porting problem, plus issues with direct access (or the lack of it) that cause many titles to panic-load assets, resulting in memory leaks, over-allocation, and so on. It's a mess, mostly caused by the reliance on automated tool sets in the dev kits.
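To put numbers on the PC side of that: unlike a console's fixed memory heaps, a D3D12 title has to poll the OS for a VRAM budget that can shrink under it at any time. A minimal sketch, assuming DXGI 1.4 or newer (adapter creation omitted, function name illustrative):

    // Sketch: polling the OS-managed VRAM budget via DXGI 1.4 (dxgi1_4.h).
    // On console the memory split is fixed; on PC the budget is a moving
    // target, which is part of why straight ports mismanage VRAM at launch.
    #include <dxgi1_4.h>
    #include <cstdio>

    void ReportVramBudget(IDXGIAdapter3* adapter)  // adapter acquired elsewhere
    {
        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        if (SUCCEEDED(adapter->QueryVideoMemoryInfo(
                0,                                // GPU node index (single GPU)
                DXGI_MEMORY_SEGMENT_GROUP_LOCAL,  // dedicated VRAM segment
                &info)))
        {
            // Budget = what the OS currently lets this process use; it can
            // shrink at runtime. Committing past it is what produces the
            // paging and stutter people blame on "not enough VRAM".
            std::printf("VRAM budget: %llu MiB, in use: %llu MiB\n",
                        info.Budget / (1024ull * 1024ull),
                        info.CurrentUsage / (1024ull * 1024ull));
        }
    }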
 
Hogwarts Legacy, The Callisto Protocol, The Last of Us Part I, the new Resident Evil (which Hardware Unboxed just showed an 8GB 3070 crashing in at 1080p), the new Plague Tale game. No doubt there are more... and many more on the way. Every one of those will use well over 12GB of VRAM at 1080p ultra and 1440p high.

I would not go as far as to call shenanigans on all the major reviewers today for not including games like Hogwarts, which is one of the most popular games going at the moment... but I do have to wonder why no one has covered any of this year's big eye-candy titles. I seem to remember a time when reviewers didn't ONLY look at 2+ year old games in a review of a brand-new GPU. Sure, perhaps they mention "hey, we used patch X.X"... but ignoring every game released in the last two years seems odd to me.
It's well known that games will reserve more VRAM than they need. My 11GB 2080 Ti runs ultra settings at 3440x1440 in Hogwarts Legacy with no issues. I haven't played the others, so I can't comment on them. But RE crashing at 1080p while using 8GB is obviously some issue with the port or driver.
 
8GB will be fine for ultra at 1440p and below until some time in 2027; that's when we can expect the next generation of engines and feature sets to become an actual thing.
Most of the huge VRAM usage we have been seeing lately at launch comes from the vast differences in how the consoles and the PC load assets into memory and how that memory is managed.
Microsoft's Agility SDK will help normalize this by taking that job away from the developers and emulating the console memory heaps on the PC.
But right now usage is mostly a bad-porting problem, plus issues with direct access (or the lack of it) that cause many titles to panic-load assets, resulting in memory leaks, over-allocation, and so on. It's a mess, mostly caused by the reliance on automated tool sets in the dev kits.

A few things on that...
First: no, at 1080p ultra and high, 8GB is lacking right now. Saying it's an emulation problem... or an optimization issue is wishful thinking.
Second... even if we grant that, yes, with optimization all these new games crashing at 1080p on 8GB cards could be fixed, MOST will not be. Developers are all indicating they are done spending time and money trying to optimize for old PC hardware.

8GB cards will be able to play games for years, sure... but not at high settings. Medium is already the only setting an 8GB 3070 won't crash on (or experience texture-loading stutter with) in multiple titles released in the last year.

All the "buy a 3000-series Nvidia card for better RT in future titles" advice people were giving a year ago... cards like the AMD 6800 are now performing much better than the 8GB Nvidia 3000-series cards with RT enabled. 8GB was a mistake for the 3000 series... and 12GB is almost certainly a mistake for the 4000 series.
 
Yes it is; I used to have one as well.
There is the 750 Ti OC edition, which only had 1GB of GDDR5; that's the one I have seen used for "look at what I made this do" videos. I honestly couldn't remember if that was the norm or not. I guess not.
https://www.gpuzoo.com/GPU-MSI/GeFo...e GTX 750 Ti,and requires 2 motherboard slots.
This is a better video than the one I initially saw, again doing it on 1GB: he turns it down to 720p but uses the Ultra 4K textures, then uses FSR to rescale them, and it doesn't look like complete ass. Funny shit.

 
Looking at the reviews, the 4070 Ti would have been an excellent $600 4070 in the current environment (just as the 3070 competed quite well with the 2080 Ti, the 4070 Ti competes with the 3090 well enough, usually a bit ahead, with some exceptions). Always one tier behind the good offer; this would have been an excellent 4060 card, if priced right.
 
Looking at the reviews, the 4070 Ti would have been an excellent $600 4070 in the current environment... always one tier behind the good offer.
As intended. Well, until you find yourself opening your wallet for the only "good" value, the 4090.
 
Looking at the reviews, the 4070 Ti would have been an excellent $600 4070 in the current environment (just as the 3070 competed with the 2080 Ti, it competed with the 3090 well enough). Always one tier behind the good offer; this would have been an excellent 4060 card, if priced right.
Well, from what we understand, the 4070 was supposed to be a 4060, with Nvidia just calling it the 4070 and marking up the price.

Gotta say, Nvidia might be one shady-ass company when it comes to products. But, just like Apple... they have a following, and people will blindly believe whatever they have to sell. Great business strategy, and it is working for Nvidia. Now, I'm not saying AMD is any better in regards to making money. It's just that Nvidia is better at manipulating its customers.
 
All the folks getting militant over this. Just lulz. If you want Nvidia so bad, just do it. It's your cash. But 12GB, just like the 10GB on my 3080 already isn't, will soon not be enough. Very soon.

What about the folks getting so mad that others are just buying an Nvidia 4070? 🤔

To them, I say:

oprah-winfrey-you-mad.gif
 
will soon not be enough. Very soon.
Maybe, but games will need to run well on the roughly 10 gigs of VRAM the Xbox and PS5 consoles have (and on the many GPUs in the field with 10 gigs or less). There is a shift going on right now from PS4 titles to PS5-only titles, which is the VRAM-usage jump we are seeing, but just as usage stagnated for a long time before this, it could start to stagnate again until a PS6 release.

It could depend on how that fast texture read and decompression plays out: does it only make loading times faster, or does it get good enough to do it live during gameplay?
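For reference, the API doing that fast read and GPU decompression on the PC side is Microsoft's DirectStorage. A hedged sketch of streaming a GDeflate-compressed blob straight into a GPU buffer (field names per DirectStorage 1.1 as best I recall; device/buffer creation and error handling omitted, and textures.pak is a made-up file):

    // Sketch: enqueue a compressed read that lands in a GPU buffer,
    // with GDeflate decompression happening on the GPU in flight.
    #include <dstorage.h>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    void StreamAsset(ID3D12Device* device, ID3D12Resource* destBuffer,
                     UINT32 compressedSize, UINT32 uncompressedSize)
    {
        ComPtr<IDStorageFactory> factory;
        DStorageGetFactory(IID_PPV_ARGS(&factory));

        DSTORAGE_QUEUE_DESC queueDesc{};
        queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
        queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
        queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
        queueDesc.Device     = device;
        ComPtr<IDStorageQueue> queue;
        factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

        ComPtr<IDStorageFile> file;
        factory->OpenFile(L"textures.pak", IID_PPV_ARGS(&file));  // hypothetical pack file

        DSTORAGE_REQUEST req{};
        req.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
        req.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
        req.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
        req.Source.File.Source          = file.Get();
        req.Source.File.Offset          = 0;
        req.Source.File.Size            = compressedSize;
        req.Destination.Buffer.Resource = destBuffer;
        req.Destination.Buffer.Offset   = 0;
        req.Destination.Buffer.Size     = uncompressedSize;
        req.UncompressedSize            = uncompressedSize;

        queue->EnqueueRequest(&req);
        queue->Submit();  // completion normally tracked with an ID3D12Fence
    }

Whether that only speeds up load screens or enables live streaming mid-game is exactly the open question.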

It is a risky proposition (you will have issues with console ports in the first month for sure, and there will be scenarios where it is a problem as well), but will a reasonable settings-performance combination always be an option at 10 gigs of VRAM until the PS6? I would say it is likely.
 
Maybe, but games will need to run well on the roughly 10 gigs of VRAM the Xbox and PS5 consoles have (and on the many GPUs in the field with 10 gigs or less). There is a shift going on right now from PS4 titles to PS5-only titles, which is the VRAM-usage jump we are seeing, but just as usage stagnated for a long time before this, it could start to stagnate again until a PS6 release.
Plenty of games are already doing that, with more on the way.

This video highlighted the beginning:
 
Plenty of games are already doing that, with more on the way.
It would be clearer if we had a 10-12GB vs 16GB comparison on a similar architecture, in mature, patched games. A lot of those games need to run on the Xbox or PS5; are they really a big issue to run on a 3080?

Cost2-p.jpg


You are obviously supposed to beat the 4070 Ti, 4080, etc. as you go down the stack (and in a more normal world that would include AMD's offerings), but this is about even with a $100-rebate promo 7900 XT on pure raster performance per dollar. Not "bad" for Nvidia, but not particularly good either, because it does not push anything interesting further down the stack; it just makes the 3070 Ti/3080 Ti and so on irrelevant, and those should not have existed anyway.
 
Going to check other websites, but this doesn't look promising if you are coming from a 3070.

relative-performance-2560-1440.png

It seems like if performance is not satisfactory on the 3070, the 4070 only takes it to still-unsatisfactory. If performance was satisfactory, then of course it will still be satisfactory on the 4070. But going from 60 fps to 78 isn't that appealing, especially at this price tag.


callofduty.png


Yikes. Can't even get 120 fps in MW2 without resorting to DLSS and having to put up with the horrific pixelated noise issues and ghosting red dots, which mess up your aiming. Not really a game where you want DLSS screwing up your image quality.


And for those obsessed with ray tracing performance: it went from unplayable to unplayable.

control-dlss-2560-1440.png

My main problem is that AMD doesn't run DCS very well; otherwise I would consider AMD.


Will other brands have DisplayPort 2 support? I don't think my monitor has HDMI 2.1.
 
Well, from what we understand, the 4070 was supposed to be a 4060, with Nvidia just calling it the 4070 and marking up the price.

Gotta say, Nvidia might be one shady-ass company when it comes to products. But, just like Apple... they have a following, and people will blindly believe whatever they have to sell. Great business strategy, and it is working for Nvidia. Now, I'm not saying AMD is any better in regards to making money. It's just that Nvidia is better at manipulating its customers.
Unlikely that's the case... it's more likely the other way around: the 4070 was supposed to be the 4070 Ti until they rebranded the 4080 12GB. They just power-limited the card to ~200W and called it the 4070.
 
It would be clearer if we had a 10-12GB vs 16GB comparison on a similar architecture, in mature, patched games. A lot of those games need to run on the Xbox or PS5; are they really a big issue to run on a 3080?

You are obviously supposed to beat the 4070 Ti, 4080, etc. as you go down the stack (and in a more normal world that would include AMD's offerings), but this is about even with a $100-rebate promo 7900 XT on pure raster performance per dollar. Not "bad" for Nvidia, but not particularly good either, because it does not push anything interesting further down the stack; it just makes the 3070 Ti/3080 Ti and so on irrelevant, and those should not have existed anyway.

I've seen the gameplay footage in many of these reviews. 11GB of usage, 12-ish GB of usage wasn't that uncommon. Already, and this card just came out. Just wait until next year.

Saying it "needs" to run on a console isn't as simple as that. The PC ports, especially on the Sony side, are now very well optimized to take advantage of high-end PC hardware. If it were that simple, then connecting a keyboard and mouse to a console would be enough, and it usually isn't. The PC ports of many of these games are visually impressive thanks to the upgraded textures, etc.
 
Another garbage-tier release by Nvidia. It's fascinating that all of their releases are good cards; the pricing and naming conventions just suck. This would've been great as the 4060 Ti at $450. Both companies seem hell-bent on maximizing pricing; I'm expecting the 7800 XT to slot in right between this and the 4070 Ti and be priced to match.
 
Ha, RandomGaming got a 4070! If you don't know him, he isn't known for top-tier hardware, especially new hardware.
Paired with a 12400 @ 1440p & 2160p in 18 games.

 
Looking like a good decision that I snatched up an RX 6800 recently when it was on sale... (even though it's, what, an over-two-year-old series now, lol).
A 16GB GPU should have me set :ROFLMAO:

for a little bit anyway
 
Hogwarts, TLOU, etc. may be shoddy ports, but they are the new AAA games hitting the market. Performance in those games is definitely relevant to new video card releases.
The 1.0.2.0 patch for TLOU severely cuts VRAM usage, and while the game UI still gives you a warning message when you allocate more of it by using the Ultra settings, the game seems to run fine even if you exceed actual VRAM by well over 100%.
Actual usage of both system and GPU memory has gone down a lot, and gameplay is a lot smoother.

And yeah, I know the consoles have 16GB of combined memory and you only need to allocate a minimum of 2GB for the game/OS, but that is the minimum required for functionality: system memory usage goes up from there, and VRAM goes down. Most console titles end up allocating around 5GB for the game and 10GB for the GPU, with 1GB for overlays, screenshots, voice chat, and other features. Memory allocation there falls in line with the PS4, which usually ended up using around 4.5GB of its memory for the game to run, with the remainder as VRAM. I have a hard time believing the PS5 won't add more features to games that require more system memory allocation, taking more away from GPU memory.

But those allocations also have to work for the consoles at 4K; usage at 1080p and 1440p should be significantly lower, unless the developers aren't actually bothering with lower-resolution textures and models and are instead just letting FSR rescale them on the fly. That is what I suspect is happening, since it is a lot cheaper: you don't need the art department to do nearly as much. Epic saw this problem coming, and their Unreal 5 tools mostly fix it by rescaling assets during decompression as part of the Texture Streaming pool, so one asset can produce multiple possible outputs based on render resolution (which changes VRAM usage).
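Spelled out, that budget math looks like this (my ballpark figures from above, not official platform specs):

    // Rough unified-memory split on a current-gen console, using the
    // ballpark figures above (illustrative, not official documentation).
    constexpr double kTotalGiB   = 16.0; // combined memory pool
    constexpr double kGameSysGiB = 5.0;  // typical CPU-side/game allocation
    constexpr double kOverlayGiB = 1.0;  // overlays, screenshots, voice chat
    constexpr double kVramGiB    = kTotalGiB - kGameSysGiB - kOverlayGiB;

    // 16 - 5 - 1 leaves ~10 GiB for the GPU; every system-side feature
    // that grows eats directly into that share.
    static_assert(kVramGiB == 10.0, "~10 GiB left for the GPU at these numbers");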

Nvidia should have put more VRAM on their cards to make life easier for developers and give them wiggle room. They are running things too close and relying on drivers to do the work the development studios aren't doing because of budget cuts, and then making us wait for patches to fix issues that should have been nailed down before launch. But PC gamers aren't a high priority: a buggy-as-shit PC launch is perfectly fine today, while Sony and Microsoft will fine the crap out of you if they have to start issuing refunds because the game runs like crap on the console.
 
Who knows; I've seen people with older cards trying to use RT and suffering sub-30 fps performance for those "amazing" visuals.
To be fair, that's also the fault of Nvidia for advertising RT as a feature, and I'd say a selling point over their competitor, so it's only reasonable they'd try to enable these things. Now, if they keep playing a game for the pretty visuals while it drags ass, then yeah, that is on them for not turning it off, the same way you'd turn down texture or AA settings.
 
To be fair, that's also the fault of Nvidia for advertising RT as a feature, and I'd say a selling point over their competitor, so it's only reasonable they'd try to enable these things. Now, if they keep playing a game for the pretty visuals while it drags ass, then yeah, that is on them for not turning it off, the same way you'd turn down texture or AA settings.
Ray tracing has the inherent problem that you still need to render the scene in full raster first; ray tracing is only really used to determine brightness, hue, shadows, and reflections, and even then not very well. It has the advantage of being technically cheaper, since you don't need to pay a team of artists to walk through an area, look at it from a bunch of angles, and manually adjust things as they go. Doing that by hand gives less consistent results, because coordinating people is hard, and if they miss things it can lead to huge FPS drops from all sorts of texture, mesh, and engine issues. Ray tracing avoids that by taking the human element out and doing the work on the fly. But since ray tracing isn't ubiquitous, studios still need to put in the manual work, so you are paying for the art team and then having a computer duplicate their work for ray tracing. The problem is that as more people get and use ray tracing, the art teams get smaller; they have to do more with less because of budget cuts, and things get missed, leading to weird graphical issues and performance hangups.

Honestly, at this stage anything beyond the bare minimum of RT effects detracts from games rather than adding to them. Unreal and Unity are leading the way with new engines that can do a lot more with a lot less, but then Nvidia had to throw a wrench into that too by figuring out a path-tracing problem that had remained unsolved since the early '90s. Thank you, AI??? They managed to develop a unified algorithm that does the job, where even ray tracing has something like 70 different possible algorithms to choose from depending on the specific situation, so that is going to be the next fight...

I am too old for this, there are too many new things, and it is scaring me. I am going to crawl into a hole and play some good old Mega Man 3, because I was at least good at that one.
 
I am too old for this, there are too many new things, and it is scaring me. I am going to crawl into a hole and play some good old Mega Man 3, because I was at least good at that one.

But first go run it through Nvidia RTX Remix to add ray tracing to it! 😁:p
 
But first go run it through Nvidia RTX Remix to add ray tracing to it! 😁:p
That might actually be a fun exercise in futility, far more so than my existing one of trying to figure out how to run multiple instances of our accounting software on one machine so you can copy and paste data between the windows.
To even make the software work I have to run it a few VMs/emulators deep, so I am experimenting with one I saw in an LTT video somebody posted here for retro gaming, to see if I can get it working in PCem instead of the existing setup. Fingers crossed, because that could make my life less difficult for the next two years, which is the earliest we expect the replacement to go live at this point... Thank you, COVID! I mean that sincerely; their development team made more progress during 6 months of lockdown than they had in the 3 years prior.
 
That might actually be a fun exercise in futility, far more so than my existing one of trying to figure out how to run multiple instances of our accounting software on one machine so you can copy and paste data between the windows.
To even make the software work I have to run it a few VMs/emulators deep, so I am experimenting with one I saw in an LTT video somebody posted here for retro gaming, to see if I can get it working in PCem instead of the existing setup. Fingers crossed, because that could make my life less difficult for the next two years, which is the earliest we expect the replacement to go live at this point... Thank you, COVID! I mean that sincerely; their development team made more progress during 6 months of lockdown than they had in the 3 years prior.

I think you'd have to go through and prescribe every material manually yourself, everywhere in the game, at some point, because Remix wasn't really made for 8-bit games at all, as far as I understand it.

Have fun with that lol ✋
 
That's pretty ridiculous. You're basically saying that if you go Nvidia, it's 4080/4090 or don't bother.
Isn't the biggest benefit of the 4070 getting access to DLSS 3?

It pains me to have conversations about new games adopting DLSS 3, like FH5, when the other party can't enable it because they're on a 3080 or 2080-series card. Moving to a 4070 gives them access to this technology without paying 1,900 USD after taxes and shipping, while still getting 3080-level performance.
 
Isn't the biggest benefit of the 4070 getting access to DLSS 3?

It pains me to have conversations about new games adopting DLSS 3, like FH5, when the other party can't enable it because they're on a 3080 or 2080-series card. Moving to a 4070 gives them access to this technology without paying 1,900 USD after taxes and shipping, while still getting 3080-level performance.
DLSS 3 is not a good selling point. It looks awful and it adds latency. It is inferior to DLSS 2, which doesn't generate frames.

DLSS 3 is a joke.
 