NVIDIA GeForce RTX 4070 Priced at $600

True, we are avid league bowlers and they just opened a Main Event here; guess what bowling costs? $80. Insane. My son said it was cheaper to get an hour of arcade gaming. The cost of my 3080 purchase works out to $1,100 / 365 days, or about $3 a day.
It's almost like everything has gone up in cost, not just video cards made by Nvidia, despite how some are acting... :eek:

I paid $779 plus tax for my EVGA 3080 FTW3 Ultra Hybrid (AIO). I've had it since Dec 2020; I'll probably upgrade next with the 5xxx series and just turn down settings slightly to compensate in the meantime. I have gamed at 4K since 2014.

Did you all see the tests of the new Last of Us patch? VRAM usage in that game has gone way down now, even at full ultra.
 
Didn't catch it... What was it? Maybe he plans to sell his card and pretend it wasn't mined on?

He said it was profitable to mine again and to snatch up those $300 3060 Tis while you can (that's probably why he deleted it lol - sorry buddy 😜)

I went to look it up after he said that and saw this



Edit: Listen, just let me get my FE 4070 first is all I'm asking ✋
 
It's almost like everything has gone up in cost, not just video cards made by Nvidia, despite how some are acting... :eek:
If everything went up as much as video cards, I'd be paying $200 for a gallon of milk. Some people here will tell me that the milk quality has gone up, so in reality I'm paying more for better milk. The only thing as insane as GPU inflation is housing inflation. The reason Nvidia is raising prices is to build confidence with shareholders, because nobody likes to see prices going down. Meanwhile, Nvidia is drumming up hype with AI, hoping that will offset declining GPU market sales.

To give you an idea of how things have changed, the Steam Hardware Survey now shows the RTX 3060 as the #1 GPU, with the RTX 2060 at #2 and the GTX 1060 at #3. The GTX 1060 just won't die and has actually gained share again. The reason for the surge in RTX 3060s on Steam is their pricing: you can now pick one up for $350 on Amazon brand new. With its 12GB of VRAM, it's a great GPU for modern gaming. That's pretty close to the pricing of the GTX 970 back when it was the #1 GPU on Steam for a while. You can also pick these up used on eBay for around $280. The RTX 4080 on Amazon is selling for $1,200, about MSRP; even the scalpers won't touch that GPU. AMD's 6700-series GPUs can also be found for the same price as an RTX 3060, both new and used.

At this point an RTX 4070 at $600 is just going to sit on the shelf next to the unsold 4080s. No matter how much inflation has occurred, consumers are not interested in anything above $350. The RTX 4070 should realistically be around $400, and should come with more than 8GB of VRAM in 2023.
 
consumers are not interested in anything above $350.
People literally line up outside for them.

https://www.newegg.com/d/Best-Sellers/GPUs-Video-Graphics-Cards/s/ID-48
Look at the list of top sellers: how many are under $350?

https://pcpartpicker.com/products/video-card/
How many of PCPartPicker's top picked cards are under $350? A minority of the top 50, with the 4070 Ti through 4090 ranking really high.

https://store.steampowered.com/hwsurvey/videocard/
This seems to indicate millions of people buying GPUs well beyond the $350 price tag, tens and tens of millions! Every 1% on that survey is over a million cards in the wild, almost every 3060-or-higher card on that list was sold over $350, and that's even assuming all the older cards on that list sold over that price were bought by the same people who bought the more recent expensive cards.
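
A rough back-of-envelope on that "every 1% is over a million cards" point; the ~120 million monthly active Steam users figure and the share percentages below are my own assumptions for illustration, not numbers from the survey page:

```python
# Back-of-envelope: cards implied by a Steam Hardware Survey share.
# The ~120M monthly active users figure and the percentages below are
# rough assumptions for illustration, not survey data.
steam_users = 120_000_000

survey_share = {
    "RTX 3060": 0.047,   # ~4.7% (illustrative)
    "RTX 3070": 0.027,   # ~2.7% (illustrative)
    "RTX 3080": 0.017,   # ~1.7% (illustrative)
}

for gpu, share in survey_share.items():
    print(f"{gpu}: ~{share * steam_users / 1e6:.1f} million cards")
```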

Most of those 3060s were sold above $350; even today, most of the best-selling 3060 models seem to be above that mark:
https://www.newegg.com/d/Best-Sellers/GPUs-Video-Graphics-Cards/s/ID-48?SrchInDesc=3060

I am confident a $600 4070 would sell much better than the $1,200 4080. 3070s are still around that price on Newegg right now, no 3070 Ti is that cheap, and 3080/3080 Tis are much more expensive if they are still sold at all, so that would be an actual improvement at that price point in Nvidia's lineup, no? Unlike the 4080.
 
People literally line up outside for them.

https://www.newegg.com/d/Best-Sellers/GPUs-Video-Graphics-Cards/s/ID-48
Look at the list of top sellers: how many are under $350?

https://pcpartpicker.com/products/video-card/
How many of PCPartPicker's top picked cards are under $350? A minority of the top 50, with the 4070 Ti through 4090 ranking really high.
Did you know people buy stuff outside of Newegg and PCPartPicker? Shocking, I know.
https://store.steampowered.com/hwsurvey/videocard/
This seems to indicate millions of people buying GPUs well beyond the $350 price tag, tens and tens of millions! Every 1% on that survey is over a million cards in the wild, almost every 3060-or-higher card on that list was sold over $350, and that's even assuming all the older cards on that list sold over that price were bought by the same people who bought the more recent expensive cards.
Going down the list.
NVIDIA GeForce RTX 3060
NVIDIA GeForce RTX 2060
NVIDIA GeForce GTX 1060
NVIDIA GeForce RTX 3070
NVIDIA GeForce RTX 3060 Ti
NVIDIA GeForce GTX 1650
NVIDIA GeForce GTX 1050 Ti
NVIDIA GeForce RTX 3060 Laptop
NVIDIA GeForce RTX 3080
NVIDIA GeForce GTX 1660
NVIDIA GeForce GTX 1660 SUPER

Those don't look like GPUs priced higher than $350, with the exception of the 3070. The 3080 is too far down the list to be a contender.
Most of those 3060s were sold above $350; even today, most of the best-selling 3060 models seem to be above that mark:
https://www.newegg.com/d/Best-Sellers/GPUs-Video-Graphics-Cards/s/ID-48?SrchInDesc=3060
Considering the recent surge on Steam's survey, I really doubt that. You may have to consider that they're used GPUs bought off eBay or other sources.
I am confident a $600 4070 would sell much better than the $1,200 4080. 3070s are still around that price on Newegg right now, no 3070 Ti is that cheap, and 3080/3080 Tis are much more expensive if they are still sold at all, so that would be an actual improvement at that price point in Nvidia's lineup, no? Unlike the 4080.
The 4070 should sell better, considering it's half the price of the 4080. Better than nothing is not impressive. Also, Hardware Unboxed just released a video on how 8GB will affect gaming today. Doesn't look good for the 4070.
 
https://videocardz.com/newz/nvidia-...ual-dlss-performance-without-frame-generation

[Image: NVIDIA-RTX4070-PERF-CLAIM-1.jpg]

[Image: NVIDIA-RTX4070-PERF-CLAIM-2.jpg]


Officially, the card is targeting a 1440p gaming experience at 100 FPS. What is important is that this number assumes ray tracing and DLSS 3 are enabled. NVIDIA is now going big on DLSS 3 in its marketing; there is hardly any mention of performance claims without DLSS or ray tracing.
 
Those don't look like GPUs priced higher than $350, with the exception of the 3070.
I just showed you a list of the most popular 3060 SKUs that go above that even today, let alone over the average week of their existence, and obviously the 3060 Ti as well (no model under $400 here: https://pcpartpicker.com/products/video-card/#sort=price&page=1). (Not sure how relevant it is whether they get resold on eBay or not; the initial customer was obviously interested in an over-$350 video card.)

In 2022, 2060 KOs were still going for around $350 with shipping: https://www.tomshardware.com/news/evga-rtx-2060-in-stock-msrp and that made "news".

We seem to be slipping into a bit of revisionist history where people did not have to get on a waitlist for a $700 3080 before the mining craze started. If you want to say a majority of customers go for $250-$350, sure; all of them? Obviously not.


Did you know people buy stuff outside of Newegg and PCPartPicker? Shocking, I know.
Yes, but they should be quite representative; Newegg is one of the biggest, I would assume:

https://us.amazon.com/Best-Sellers-Computer-Graphics-Cards/zgbs/pc/284822
On Amazon, how many models over $369 do we see in the top 20 SKUs? 11 or 12.


Also, Hardware Unboxed just released a video on how 8GB will affect gaming today. Doesn't look good for the 4070.
I think the latest rumours were that it would be 12 GB (the small 192-bit bus, à la the 3060, kind of forcing 6 or 12, I imagine). As a mental exercise, I would even predict that a $399.99 12 GB 4070 would sell out, though maybe without that significant a volume.
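
For context on why that bus width boxes the capacity in, here is the quick arithmetic; the chip densities are just the common GDDR6/GDDR6X options, not anything confirmed for the 4070:

```python
# Why a 192-bit bus tends to mean 6 GB or 12 GB: 192 / 32 = 6 memory
# channels, each paired with one GDDR6/GDDR6X chip of 1 GB or 2 GB
# (the common chip densities; nothing here is 4070-specific).
bus_width_bits = 192
channel_width_bits = 32
channels = bus_width_bits // channel_width_bits   # 6 channels

for chip_gb in (1, 2):
    print(f"{chip_gb} GB chips -> {channels * chip_gb} GB total VRAM")
```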
 
Also, Hardware Unboxed just released a video on how 8GB will affect gaming today. Doesn't look good for the 4070.

I just saw this. They answered the question I had about how Nvidia was fixing the stuttering. Seems like the way they do it is that the texture either isn't loaded at all, or a much lower-quality texture is loaded instead.
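
In other words, rather than stalling while the full-resolution texture streams in, the renderer serves whatever lower mip it can fit. A very simplified sketch of that idea; the function, budget, and numbers are all mine for illustration, not from any actual engine:

```python
# Simplified model of mip-level fallback under a VRAM budget.
# Names, budget, and sizes are illustrative, not from a real engine.

VRAM_BUDGET_MB = 8 * 1024  # e.g. an 8 GB card

def select_mip(texture_sizes_mb, resident_mb):
    """Return the best (lowest-index) mip that still fits the remaining budget.

    texture_sizes_mb: sizes of mip 0 (full res) .. mip N (lowest res), in MB.
    resident_mb: VRAM already used by everything else.
    """
    for mip, size in enumerate(texture_sizes_mb):
        if resident_mb + size <= VRAM_BUDGET_MB:
            return mip                      # highest quality that fits
    return len(texture_sizes_mb) - 1        # worst case: blurriest mip

# A 4K texture with a 64/16/4/1 MB mip chain, with 8150 MB already
# resident, gets forced down to mip 1 instead of causing a stall.
print(select_mip([64, 16, 4, 1], resident_mb=8150))   # -> 1
```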
 
Looks like a 3080 12 GB with DLSS 3. Pretty stagnant, but still better than their other 4000-series SKUs in terms of price/perf. It'll probably sell okay in comparison to the other cards if they actually stay near MSRP.
If most SKUs release near MSRP, it would be a market reaction to the new reality. The 3070 launched with an MSRP of about $580 in 2023 dollars, and I would imagine many models sold above the $600 mark.
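
A quick sanity check on that "$580 in 2023 dollars" figure; the ~16% cumulative US inflation from late 2020 to early 2023 is my own approximation, not an official number:

```python
# Rough inflation adjustment of the RTX 3070's $499 launch MSRP.
# The ~16% cumulative inflation (late 2020 -> early 2023) is an
# approximation, not an official CPI figure.
msrp_2020 = 499
cumulative_inflation = 0.16

print(f"~${msrp_2020 * (1 + cumulative_inflation):.0f} in 2023 dollars")  # ~$579
```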

Nvidia giving a 50% VRAM boost and +30% performance, with a price that almost follows regular inflation instead of GPU inflation, could look like a regular bad-to-meh deal instead of a terrible one.

We'll see if it actually happens. Only 15% lower than the Ti with 77% of the cores does not seem certain, nor does the rumored $600 tag, or that cards will stay near MSRP (and not AIBs going mostly for super-big, overbuilt cards at a much higher price than that, for what is a regular RX 6800-like 250-watt GPU).
 
I just saw this. They answered the question I had about how Nvidia was fixing the stuttering. Seems like the way they do it is that the texture either isn't loaded at all, or a much lower-quality texture is loaded instead.
Which begs the question: as a consumer, would you rather run ultra with occasional flipping back and forth between trash quality and ultra, or medium across the board with no switching?
 
If most SKUs release near MSRP, it would be a market reaction to the new reality. The 3070 launched with an MSRP of about $580 in 2023 dollars, and I would imagine many models sold above the $600 mark.

Nvidia giving a 50% VRAM boost and +30% performance, with a price that almost follows regular inflation instead of GPU inflation, could look like a regular bad-to-meh deal instead of a terrible one.

We'll see if it actually happens. Only 15% lower than the Ti with 77% of the cores does not seem certain, nor does the rumored $600 tag, or that cards will stay near MSRP (and not AIBs going mostly for super-big, overbuilt cards at a much higher price than that, for what is a regular RX 6800-like 250-watt GPU).
I'm thinking AIB cards will be $700+ for a good one near launch. I bet they'll dip like 5-6 months down the line. Hopefully I'm wrong. Doesn't affect me personally; I'd just like to see some signs of positive momentum in the market.
 
I'm thinking AIB cards will be $700+ for a good one near launch.
According to the link, it would be a 185-watt card in average gaming with a target max TDP of 200 watts. How much separation can you get on a card that "easy" to run? The big $700+ ones could be virtually silent too, I guess.
 
Which begs the question: as a consumer, would you rather run ultra with occasional flipping back and forth between trash quality and ultra, or medium across the board with no switching?
Well, the 6800 (non-XT) pretty much allowed for 60 FPS at ultra + RT for the most part at 1080p, across the board. So I don't think we need to accept less than that.
 
Nvidia is really banking on Frame Generation to sell the card... The straight raster uplift is not much compared to the jump from the 2070 Super to the 3070 Ti. FG is not available in all games, and trying to sell a GPU on AI frame-generation trickery seems like a really bad idea. Nvidia really doesn't have a leg to stand on to justify so large a price increase for less than a 20 FPS gain without FG. I cannot wait for AMD to fix FSR in terms of shimmering and other graphical issues.
 
Well, the 6800 (non-XT) pretty much allowed for 60 FPS at ultra + RT for the most part at 1080p, across the board. So I don't think we need to accept less than that.
I was honest-to-god shocked when I saw the results of that, and my first thought was "I thought RT absolutely killed AMD card frame rates." OK, it kind of did at 4K, but it does that on Nvidia cards too.
 
I was honest-to-god shocked when I saw the results of that, and my first thought was "I thought RT absolutely killed AMD card frame rates." OK, it kind of did at 4K, but it does that on Nvidia cards too.
Yeah idk what's going on with that. By all rights the 3070 should be better at RT. That 8gb, tho.....

It died quicker than I thought it was going to.
 
Which begs the question: as a consumer, would you rather run ultra with occasional flipping back and forth between trash quality and ultra, or medium across the board with no switching?
Tough to say, but I noticed in the recent Diablo IV beta during one of the cutscenes, the ground closest to the camera had texture pop-in with like 4 different levels of detail--when the animation started it was very blurry and low-res, and then over the next few seconds it refined itself 3 times.
 
Yeah idk what's going on with that. By all rights the 3070 should be better at RT. That 8gb, tho.....

It died quicker than I thought it was going to.
To be honest, I didn't know RT needed that much VRAM until recently. It takes a good chunk apparently, and can send 8GB cards directly into the crapper if you try to enable RT at ultra settings.
 
Nvidia is really banking on Frame Generation to sell the card... The straight raster uplift is not much compared to the jump from the 2070 Super to the 3070 Ti. FG is not available in all games, and trying to sell a GPU on AI frame-generation trickery seems like a really bad idea. Nvidia really doesn't have a leg to stand on to justify so large a price increase for less than a 20 FPS gain without FG. I cannot wait for AMD to fix FSR in terms of shimmering and other graphical issues.
Raster, based on the leaks, is on par with the 3080 with DLSS 2, but obviously better when using DLSS 3.

https://www.pcgamesn.com/nvidia/rtx-4070-leak-3080-performance-with-dlss-on

Not the worst, solid 1440p card.

Just behind the 7900xt in performance but should be decently cheaper.

Not bad given the current state of GPU pricing but still not “good”.
 
Raster, based on the leaks, is on par with the 3080 with DLSS 2, but obviously better when using DLSS 3.

https://www.pcgamesn.com/nvidia/rtx-4070-leak-3080-performance-with-dlss-on

Not the worst, solid 1440p card.

Just behind the 7900xt in performance but should be decently cheaper.

Not bad given the current state of GPU pricing but still not “good”.
Considering the 7900 XT is on par with a 4070 Ti, the 4070 can't be "right behind" it, since it will be at least 15-20% slower than a 4070 Ti... which is on par with a 7900 XT.
 
Considering the 7900 XT is on par with a 4070 Ti, the 4070 can't be "right behind" it, since it will be at least 15-20% slower than a 4070 Ti... which is on par with a 7900 XT.
Averaged out with no DLSS and no ray tracing, sure… But enable either of them and the 4070 Ti takes a huge lead at 1440p, which is how things are going to be run, as the overwhelming majority of users of these cards are just going to load the profile for their game from GeForce Experience.
 
Averaged out with no DLSS and no ray tracing, sure… But enable either of them and the 4070 Ti takes a huge lead at 1440p, which is how things are going to be run, as the overwhelming majority of users of these cards are just going to load the profile for their game from GeForce Experience.
I would only buy a 7900 XT for 4K; it's a light 4K card IMO, not a good 1440p card. At 1440p the best card is the 4070 Ti. They're both overpriced though, IMO.
 
Averaged out with no DLSS and no ray tracing, sure… But enable either of them and the 4070 Ti takes a huge lead at 1440p, which is how things are going to be run, as the overwhelming majority of users of these cards are just going to load the profile for their game from GeForce Experience.

I am pretty sure most do not play with DLSS and ray tracing turned on. I have a 6900 XT and I don't turn on FSR unless I want to play with ray tracing on at 1440p. Unless I notice a huge difference in quality, ray tracing remains off, with FSR turned off as well. Most people will only use DLSS or FSR if their frame rate is in the toilet, and that is usually due to ray tracing.
 
I am pretty sure most do not play with DLSS and ray tracing turned on. I have a 6900 XT and I don't turn on FSR unless I want to play with ray tracing on at 1440p. Unless I notice a huge difference in quality, ray tracing remains off, with FSR turned off as well. Most people will only use DLSS or FSR if their frame rate is in the toilet, and that is usually due to ray tracing.

Nah, most turn it on by default (upscaling, if available) to get the most frames/performance right out of the gate - then add ray tracing on top of that, or not.
 
Averaged out with no DLSS and no ray tracing, sure… But enable either of them and the 4070 Ti takes a huge lead at 1440p, which is how things are going to be run, as the overwhelming majority of users of these cards are just going to load the profile for their game from GeForce Experience.
Take away DLSS and ray tracing (which the 12GB buffer is going to limit anyway) and it will be as fast as a 3080. For $600 that's a bad look, ESPECIALLY since you can get a 6950 XT for $600.

At this point anyone who buys the 4070 is just getting fleeced by Nvidia and will have to upgrade in a year or two anyway. Not everyone wants to use DLSS 3, since it adds input latency and generally looks bad. Now, DLSS 2 at Quality is fantastic, but it doesn't magically make a GPU faster in raw performance, which a lot of people care about more.
 
I am pretty sure most do not play with DLSS and ray tracing turned on. I have a 6900 XT and I don't turn on FSR unless I want to play with ray tracing on at 1440p. Unless I notice a huge difference in quality, ray tracing remains off, with FSR turned off as well. Most people will only use DLSS or FSR if their frame rate is in the toilet, and that is usually due to ray tracing.
If you load the game's settings from the AMD or Nvidia applications and just tell it to give you the optimal settings for the game on your system, it will have either DLSS or FSR on 100% of the time if the game has it available. Very few PC gamers actually manually tune their game settings; the bulk just use Adrenalin or GfE, and those will default it to on.
 
If you load the game's settings from the AMD or Nvidia applications and just tell it to give you the optimal settings for the game on your system, it will have either DLSS or FSR on 100% of the time if the game has it available. Very few PC gamers actually manually tune their game settings; the bulk just use Adrenalin or GfE, and those will default it to on.
I've never installed GfE or used Adrenalin for game settings. Can't say I'm normal, though.
 
Take away DLSS and ray tracing (which the 12GB buffer is going to limit anyway) and it will be as fast as a 3080. For $600 that's a bad look, ESPECIALLY since you can get a 6950 XT.

At this point anyone who buys the 4070 is just getting fleeced by Nvidia and will have to upgrade in a year or two anyway. Not everyone wants to use DLSS 3, since it adds input latency and generally looks bad. Now, DLSS 2 at Quality is fantastic, but it doesn't magically make a GPU faster in raw performance, which a lot of people care about more.
Specifically, AMD and Nvidia want you to buy their old stuff. I assure you the AIBs are only taking the amount of new silicon, be it AMD or Nvidia, that they are required to at this point; they all have so much of last year's stock to move that they couldn't care less whether either team, Red or Green, had a new product to sell.
Until Nvidia can find a way to get the mass of "new, never mined on" cards off the market, they have a significant problem: unauthorized repair facilities took the mining cards in bulk, refurbished them, and are selling them off in a way that anybody with legit new cards has a hard time competing with, all over South America, Asia, and Europe.
If it weren't for the takeoff of AI right now, Nvidia would be looking at a financial disaster, and they know that won't last, which is why they are working with the CCP to crack down on the refurbished cards and get them off the streets.
 
I've never installed GfE or used Adrenalin for game settings. Can't say I'm normal, though.
Considering GfE actually had a performance hit the last time I read up on it, I never wanted to use it.
 
Just by being on this site, you are part of the 1%.
Because the other 99% don't care enough to read this much, they just want to do what they want with as little resistance as possible.
Actually, I think quite a few people do care. There are a lot of people who don't want to make an account to use it.
 
Take away DLSS and ray tracing (which the 12GB buffer is going to limit anyway) and it will be as fast as a 3080. For $600 that's a bad look, ESPECIALLY since you can get a 6950 XT.
If you have a 6950 XT-capable PSU and don't mind the 400 watts, I am not sure many (if any) Nvidia options will look particularly good versus a 6950 XT at $630, performance-wise (would the 7900 XT itself even look good here?). Nvidia's current lineup and the 7900 XT are probably more what they are looking at (those RDNA 2 deals, I imagine, are on their last legs, the giant extra RDNA 2 inventory fixing itself over time).

Outside of DLSS and RT there is hardware AV1 encoding, I guess, and maybe form factor on some models, if that matters.
 
If you load the game's settings from the AMD or Nvidia applications and just tell it to give you the optimal settings for the game on your system, it will have either DLSS or FSR on 100% of the time if the game has it available. Very few PC gamers actually manually tune their game settings; the bulk just use Adrenalin or GfE, and those will default it to on.

I never installed GfE and kept things defaulted off unless the application turns them on. Hogwarts Legacy was the first game I've run into so far that defaulted FSR to on. Otherwise, FSR and ray tracing all have to be turned on manually. Most gamers I know pretty much do the same thing: turn stuff on in the game menu, not with the graphics card software. I am sure there are some that do what you're saying, just not sure it's the bulk.
 
I used to install GFE for GameStream only

But now with GameStream gone I ended up keeping it installed because I got used to:

1) Nvidia's official Afterburner Auto OC alternative - figure better to use Nvidia SW with Nvidia chips for that

2) HDR screenshot capability

3) Whatever their bundled performance monitor is called, seeing as I'm using it for the stuff above as well

4) Driver notifications/updates, seeing as I'm using it for all of the above as well

I don't use it for game settings or anything. More often than not I spend more time toggling features on/off and benchmarking/fine-tuning settings in games than actually playing the damn games LMFAO
 
If you load the game's settings from the AMD or Nvidia applications and just tell it to give you the optimal settings for the game on your system, it will have either DLSS or FSR on 100% of the time if the game has it available. Very few PC gamers actually manually tune their game settings; the bulk just use Adrenalin or GfE, and those will default it to on.
People use AMD or NVIDIA applications to tune game settings? I've never done that. I've always used the settings section in the games to do any tuning.
 