4070 gets its lunch money taken by the 6800XT except in RT games, and costs more

vjhawk

Benchmarks and source article here from PCWorld: https://www.pcworld.com/article/178...-3080-vs-radeon-rx-6800-xt-which-gpu-buy.html

Long story short, in rasterized games the 4070 loses to the 6800XT, which can be found for under $550 on Newegg. Meanwhile, it's priced at $599, which used to be xx80-series pricing.

Is Nvidia just out of touch or too greedy for their own good?

The good news is that the 4070 does manage to beat the 6800XT when ray tracing is enabled, in those games that support it. But that's about all the good news.

Is anyone seriously going to buy this card? It feels like the lame duck out of the Nvidia 40xx series offerings.

The 4090 is the top end card, so all the people who want top of market performance will gladly pay nosebleed prices. The price isn't even relevant to them.

The 4080 is the next-best card: relatively bad value for the price, but at $1200 it's the next tier down for people who need to respect a budget but will begrudgingly pay over the odds for an upgrade.

Meanwhile, with the 4070, what is even the point of this card? It loses to cheaper cards in purely rasterized games, it doesn't offer top-end performance, and it honestly just feels like a card Nvidia sells because the silicon couldn't reach 4080 specs, i.e. factory-reject 4080s get 'cut down' and sold as 4070s. Thoughts?
 
Paid $315 for my 6800XT when people were selling off waiting for the 7000 series. I realize the value of that deal now. Nvidia is nuts for asking what they are for the 4070.

It also sucks to be in this hobby long term, at least from the cost perspective. My eyes go wide looking at the current Nvidia lineup and the prices they're asking.
 
The 4070 and the 6800XT are roughly equivalent in performance. Neither beats the other consistently enough to justify such a bullshit title.

[Attached charts: 1080p, 1440p, and 2160p benchmarks, plus TechPowerUp relative-performance charts at 1920x1080, 2560x1440, and 3840x2160]
 
While there's "the price" (a standard Nvidia problem these days), overall I think they did pretty well. Small board, should be easy to cool and fit into "smaller" places. I know people won't like me not favoring the old Radeon cards, but I think Nvidia did "better" than I was expecting. YMMV, of course.

What do you get? A more modern board with more features, way less power consumption (!), and a much cooler card.

What is the problem? It's still at least $100 USD more than it should be. And of course, even at that it wouldn't be a super deal (an AMD-life-ending deal).

I think if RDNA 3 and beyond step up the features and figure out their power efficiency, then AMD becomes an easier recommendation. Outside of the price difference (if you don't count your power bill), Nvidia actually does OK here.
 
The 4070 is looking like the fastest card in the sub-200W range if you value performance per watt. Other than that it's just decent. The price gets interesting around $499 and is a no-brainer at $449 in today's market, IMO. At $599, though, it's a hard pass for me. Decent card, but overpriced.
 
Honestly, I'll probably get one if the price tilts a bit further down.
 
Honestly, I'll probably get one if the price tilts a bit further down.
You can get a $100 Steam Gift Card if you buy it at Microcenter in the US now. AIB partners are getting $50 rebates from Nvidia but who knows if they will pass on the savings to consumers and if the Microcenter promotion will still be going on if that happens. In the UK, the price is actually below $600 USD (I think $570ish USD) if you look at the pre-sales tax price.
 
Benchmarks and source article here from PCWorld: https://www.pcworld.com/article/178...-3080-vs-radeon-rx-6800-xt-which-gpu-buy.html

Long story short, in rasterized games the 4070 loses to the 6800XT, which can be found for under $550 on Newegg. Meanwhile, it's priced at $599, which used to be xx80-series pricing.

Is Nvidia just out of touch or too greedy for their own good?

The good news is that the 4070 does manage to beat the 6800XT when ray tracing is enabled, in those games that support it. But that's about all the good news.

Is anyone seriously going to buy this card? It feels like the lame duck out of the Nvidia 40xx series offerings.

The 4090 is the top end card, so all the people who want top of market performance will gladly pay nosebleed prices. The price isn't even relevant to them.

The 4080 is the next-best card: relatively bad value for the price, but at $1200 it's the next tier down for people who need to respect a budget but will begrudgingly pay over the odds for an upgrade.

Meanwhile, with the 4070, what is even the point of this card? It loses to cheaper cards in purely rasterized games, it doesn't offer top-end performance, and it honestly just feels like a card Nvidia sells because the silicon couldn't reach 4080 specs, i.e. factory-reject 4080s get 'cut down' and sold as 4070s. Thoughts?
The thing is, it's actually worse than that.

You may as well just buy a 6950XT for $600 and compare that directly to the 4070 which is also $600.
https://www.newegg.com/asrock-radeo...=6950xt-_-14-930-088-_-Product&quicklink=true
$629 with a $30 instant rebate. If you look around, there are other models and retailers with 6950XTs at similar prices.

This is an even better demonstration of why nVidia's current pricing model is out of touch.

According to ZeroBarrier's charts, that's a 13fps advantage at 4K across 13 games, with lows also up by 11fps. That's a ~20% speed advantage for the same price. It gives tangibly better performance at both 4K and 2.5K.

Even if you bought a different model of 6950XT for $50 more (or if it's not on sale, or whatever), that 20% speed difference is arguably worth it. And I'd still say it's a fair apples-to-apples comparison, because there are 4070s that also cost $650 (heck, there are ones that cost $700), so paying $650 for a 6950XT and comparing it to the 4070 in general is not out of line.
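
If you want to sanity-check that math, here's a minimal sketch; the baseline fps values are my assumptions for illustration (they make a 13fps gap come out to roughly 20%), so substitute the actual chart averages:

```python
# Back-of-envelope check on the "13fps at 4K is ~20%" claim.
# The baseline fps values are assumptions for illustration only;
# substitute the real averages from ZeroBarrier's charts.

def pct_uplift(baseline_fps: float, delta_fps: float) -> float:
    """Percent speed advantage implied by an absolute fps gap."""
    return 100.0 * delta_fps / baseline_fps

print(f"average: +{pct_uplift(65.0, 13.0):.0f}%")  # ~20% on an assumed 65fps base
print(f"lows:    +{pct_uplift(55.0, 11.0):.0f}%")  # ~20% on an assumed 55fps base
```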
 
The thing is, it's actually worse than that.

You may as well just buy a 6950XT for $600 and compare that directly to the 4070 which is also $600.
https://www.newegg.com/asrock-radeo...=6950xt-_-14-930-088-_-Product&quicklink=true
$629 with a $30 instant rebate. If you look around, there are other models and retailers with 6950XTs at similar prices.

This is an even better demonstration of why nVidia's current pricing model is out of touch.

According to ZeroBarrier's charts, that's a 13fps advantage at 4K across 13 games, with lows also up by 11fps. That's a ~20% speed advantage for the same price. It gives tangibly better performance at both 4K and 2.5K.

Even if you bought a different model of 6950XT for $50 more (or if it's not on sale, or whatever), that 20% speed difference is arguably worth it. And I'd still say it's a fair apples-to-apples comparison, because there are 4070s that also cost $650 (heck, there are ones that cost $700), so paying $650 for a 6950XT and comparing it to the 4070 in general is not out of line.
4070 to 6950XT looks more like a 10% difference. The 4070 is also quieter, cooler, smaller, and much more power efficient. Then add in DLSS 3, Ray Tracing about equal to a 7900XT, and the usual Nvidia extras and I'd take the 4070 over the 6950XT any day of the week.
 
4070 to 6950XT looks more like a 10% difference. The 4070 is also quieter, cooler, and much more power efficient. Then add in DLSS 3, Ray Tracing about equal to a 7900XT, and the usual Nvidia extras and I'd take the 4070 over the 6950XT any day of the week.
Buy whatever GPU you want and rationalize it however you want. Your performance numbers and GPU comparisons though aren’t even close.

The charts are in the thread. Everything I stated is right there.
 
Buy whatever GPU you want and rationalize it however you want. Your performance numbers and GPU comparisons though aren’t even close.

The charts are in the thread. Everything I stated is right there.
These charts?
[Attached: two performance summary charts]


The 6950XT is only about 5-6% faster than the 6900XT, which makes the "not even close" 10% number look just as credible as your 20% figure (except for TechSpot's 4K numbers, which are quite a ways off of TPU's). And with the 4070 you're not buying old tech; the 3080Ti is about as fast as a 6950XT, but I wouldn't recommend someone buy one of those for $600 (if it were available) now either; maybe $400-500 would make buying the older cards worthwhile...
 
These charts?
No... :rolleyes: these ones.

[Attached: two benchmark charts]
The 6950XT is only about 5-6% faster than the 6900XT, which makes the "not even close" 10% number look just as credible as your 20% figure (except for TechSpot's 4K numbers, which are quite a ways off of TPU's).
First off, I stated 4K and 2.5K, and then you quote a 1080p chart. Second, you quote charts that don't even have the card on the graph, when I stated that the chart I was referring to was literally in the thread. Then on top of that you just pulled performance numbers for the 6950XT out of your ass.

When the numbers don't agree with you, you suddenly disagree with them. But you're more than capable of rationalizing and ignoring an incredibly well-known source that actually has the relevant data on its chart. You know, from having actually tested the relevant card.
And with the 4070 you're not buying old tech; the 3080Ti is about as fast as a 6950XT, but I wouldn't recommend someone buy one of those for $600 (if it were available) now either; maybe $400-500 would make buying the older cards worthwhile...
"Old tech" is a description that doesn't mean anything. The bottom line is performance. Would you buy a 4060 today because it's "new tech" in order to replace a 3090Ti? If you would, you're an idiot. Any person with sense would keep the more performative card. And it's no different cross brand.

Go ahead. Keep rationalizing. Buy whatever you want. Lie to yourself if it's convenient. But since I made that post the 6950XT is even cheaper than the 4070 and it's much faster.
 
These charts?
[Attached: two performance summary charts]

The 6950XT is only about 5-6% faster than the 6900XT, which makes the "not even close" 10% number look just as credible as your 20% figure (except for TechSpot's 4K numbers, which are quite a ways off of TPU's). And with the 4070 you're not buying old tech; the 3080Ti is about as fast as a 6950XT, but I wouldn't recommend someone buy one of those for $600 (if it were available) now either; maybe $400-500 would make buying the older cards worthwhile...
Are you gaming at 1440p? RT is nice, but you take an fps hit, right? So it's not useful at 4K for that card? DLSS 3, sure, that can be an incentive to pick the Nvidia card. Anyway, carry on... :)
 
No... these ones. :rolleyes:



First off, I stated 4K and 2.5K, and then you quote a 1080p chart. Second, you quote charts that don't even have the card on the graph, when I stated that the chart is literally in the thread.

Go ahead. Keep rationalizing. Buy whatever you want. Lie to yourself if it's convenient. But since I made that post the 6950XT is even cheaper than the 4070 and it's much faster.
I included the 1080p AND 1440p charts because someone in the market for a 4K card likely isn't going to want a 4070 or a last-gen card. I didn't quote the TechSpot charts since I didn't feel like calculating the percentages :ROFLMAO:. So: 10-15% faster at pure raster, slower at RT, doesn't support DLSS or FSR3 + FG, and old, all while using 63% more power. Joy.

[Attached: power consumption chart]


From TechSpot's conclusion:
Going back to buying recommendations, the RTX 4070 is a solid offering but there are a few alternatives you may want to consider. The Radeon 6800 XT can be had for around $570 and it offers more VRAM, but you'll have to weigh that up against the lack of DLSS support and inferior ray tracing performance - we'd probably still go with the RTX 4070. The Radeon 6950 XT could make for a stronger case with on average ~15% more performance for $650, but the same pros and cons apply here.
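
Purely as a toy read on that conclusion, here's the perf-per-dollar math; the 4070 is indexed at 100, the +15% and the prices come straight from the quote, and the 6800 XT's roughly-equal raster figure is an assumption carried over from earlier posts:

```python
# Toy performance-per-dollar reading of the TechSpot conclusion above.
# Raster performance is indexed to the 4070 = 100; the 6800 XT's
# "roughly equal" figure is an assumption from earlier in the thread.

cards = {
    "RTX 4070":   (100.0, 599),
    "RX 6800 XT": (100.0, 570),  # assumed ~equal raster
    "RX 6950 XT": (115.0, 650),  # ~15% faster per the quote
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 100:.1f} perf points per $100")
```

By that crude measure the three cards land within about 6% of each other, which is why the "pros and cons" in the quote end up deciding it rather than raw value.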
 
Am I the only one here who does not give a damn about power consumption?
You’re not. There isn’t anything more to say here.

The only people talking about buying 4070s are nVidia fanboys, people in countries where 6950XTs aren't cheap, and people in countries where electricity is more expensive than in CA (which is very few).

And even in those cases, the performance-per-watt break-even isn't that simple, because people aren't gaming 24/7 on their hardware. Most of the time it's browsing and idle wattage. If you're a streamer or professional gamer and play 8-10 hours a day at 100% GPU utilization, it might end up being $50 after an entire year.

That isn’t a big enough deal unless you’re planning to have a few hundred to mine on, do data sets on, or a render farm. However even in those cases you could simply put a power limit on the AMD card, still come out ahead per watt and performance wise and save money on the card.

Everyone is buying AMD or not buying at all. The stacks of 4070s sitting in every store prove it.

If you want to pay more for less, that's on you. But it's a pretty obvious shill who comes in, tries to say a 4070 gives better performance than a 6950XT, and then, after being proved wrong, brings up performance per watt.
 
Many people looking for the fastest at any price won't care about price or power consumption. For me it's the performance target, then the price, then other things like power requirements, features, physical size, etc.

The 4070 is pretty much an iteration of the 12GB 3080, with efficiency gains and access to frame gen. The $599 price sucks, so it's a slow seller right now. As I stated earlier in the thread, I like the card but not the ask. Personally, I wouldn't go over $450 for a 12GB card in 2023.
 
Many people looking for the fastest at any price won't care about price or power consumption. For me it's the performance target, then the price, then other things like power requirements, features, physical size, etc.

The 4070 is pretty much an iteration of the 12GB 3080, with efficiency gains and access to frame gen. The $599 price sucks, so it's a slow seller right now. As I stated earlier in the thread, I like the card but not the ask. Personally, I wouldn't go over $450 for a 12GB card in 2023.
Everything ultimately comes down to dollars per performance, because if the performance target is absurdly priced, it's not reasonably attainable anyway. (Whatever it is, the joke is the 5090 will be $6000. Will people still buy it if it's 10% of the average annual income? Clearly most people cannot and will not.)

If the 4070 were $400, I would recommend it to basically everyone. But that isn't what it costs, so its performance must be measured against every other card that costs $600.

This is more or less obvious to everyone in the market right now. Hence why the 4070 isn’t selling. But shilling for this dumb card when it’s such a terrible value makes zero sense.

Criticizing nVidia, even if you're a fan, is a better place to be than blindly stating "they're the way to go".
 
Am I the only one here who does not give a damn about power consumption?
Well, there are people who buy lifted trucks and massive SUVs to drive themselves to the grocery store, so no. ¯\_(ツ)_/¯

But lower power consumption on a GPU = cooler and quieter system with less heat being dumped into my immediate vicinity.
 
Am I the only one here who does not give a damn about power consumption?
When talking about both ends of the spectrum, like a 3090Ti/6950XT versus something potentially much easier to fit in a case and power with my current PSU, like a 4070, it would start to be a factor for me: not so much saving kWh and money, but wondering whether my PSU is enough, whether my case is big enough, and what the noise would be.

This does not apply to the 6800XT, but the 3090/3090Ti/6900XT/6950XT spike consumption:
[Attached: power-spikes chart]


Would make me think.
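
If you want to run the same gut check on your own unit, here's a rough sketch; every wattage in it is a placeholder assumption, so read the real spike figures off the chart above and your CPU's actual peak draw:

```python
# Crude PSU headroom check against GPU transient spikes.
# All wattages below are placeholder assumptions for illustration.

def headroom_watts(psu: float, cpu_peak: float, gpu_spike: float,
                   rest_of_system: float = 75.0) -> float:
    """Watts to spare if CPU peak, a GPU spike, and the rest of the
    system (board, fans, drives) all hit at the same instant."""
    return psu - (cpu_peak + gpu_spike + rest_of_system)

for gpu, spike in [("4070 (assumed ~250W spikes)", 250.0),
                   ("6950XT (assumed ~450W spikes)", 450.0)]:
    print(f"{gpu}: {headroom_watts(750.0, 250.0, spike):+.0f} W on a 750W unit")
```

With those made-up numbers the spikier card lands slightly underwater on a 750W unit, which is exactly the "would make me think" scenario.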
 
If you want to pay more for less, that's on you. But it's a pretty obvious shill who comes in, tries to say a 4070 gives better performance than a 6950XT, and then, after being proved wrong, brings up performance per watt.
Who said the 4070 gives better performance than the 6950XT? Guess there are no obvious shills to match your case. I even said I wouldn't recommend a faster last-gen NV card at the same price...

None of this changes the fact that the 6950XT is an old card based on a radically different architecture than AMD's current cards, so it will quickly be subject to diminishing returns and loss of support from AMD. You're buying a bit of extra performance and left with a card that's going to be worse in every other way. To say otherwise, you'd likely have to be an obvious AMD shill...
 
Who said the 4070 gives better performance than the 6950XT? Guess there are no obvious shills to match your case.

None of this changes the fact that the 6950XT is an old card based on a radically different architecture than AMD's current cards, so it will quickly be subject to diminishing returns and loss of support from AMD.
You're making a lot of suppositions here without any form of evidence. AMD is still supporting Polaris and Vega. RDNA 1/2 are getting plenty of attention.

Going back to my other discussion, I brought up a theoretical 4060 vs a 3090Ti. Would you be saying the same things about nVidia? If not, that says a lot about your position.
You're buying a bit of extra performance in one case and a card that's going to be worse in every single other. To say otherwise, you'd likely have to be an obvious AMD shill...
15% additional performance is not "a bit". Again, the difference between 60fps and 80fps. Noticeable gains. It's the difference between 1% lows never dipping below 60fps and not.

Performance per watt has been addressed by multiple people.

As for DLSS vs FSR, for all games supporting the latest version of FSR there is no visual difference between the two. Support for AMD/FSR is only going to grow as parts like Strix Halo in laptops remove the need for a discrete graphics card, along with next-gen handheld products like a theoretical Steam Deck 2, etc. FSR is also hardware agnostic, meaning it will continue to work well on "old tech" AMD cards, and it has the advantage of working on nVidia cards as well. It's tech that only has to be implemented once to support both. Frankly, it's better for dev time.

And finally, everything I've said has to do with price to performance. My stance is clear. I even stated above that if the 4070 were $400 I would recommend it to everyone. And I would. nVidia deserves no money because their product stack makes no sense. And it's clear anyone who isn't an nVidia shill agrees with this position. Again, how do we know? Because we have 4070s stacking up in inventory around the globe, not being sold. The 4070 isn't a "bad" card; it's an incredibly poorly priced one.

So you can try and say that it's an issue with "me" or you can actually pay attention to the much wider sentiment about nVidia's price to performance and what cards are actually selling. 'Cause unless it's a 4090, it ain't nVidia right now. And even that card has more or less already saturated the market. Everyone that wants one (or perhaps better said, can afford one) already has it.
 
You're making a lot of suppositions here without any form of evidence. AMD is still supporting Polaris and Vega. RDNA 1/2 are getting plenty of attention.

Going back to my other discussion, I brought up a theoretical 4060 vs a 3090Ti. Would you be saying the same things about nVidia? If not, that says a lot about your position.
Polaris/Vega are not really actively supported outside of a few bug fixes here and there, which is about where we're at with RDNA2. Just look at a few months ago, when AMD turned all their attention away from the previous cards because the 7000 series was so busted at release. Hell, it wasn't that long ago that AMD advertised primitive shader support as coming to Vega drivers for months before saying it was too hard and flat-out abandoning it. But at least now with RDNA3 (or Ada) you're getting the best support that either company is going to offer.

Yes, I already said, at least twice, that I wouldn't recommend an older but faster NV card at the same price...

15% additional performance is not "a bit". Again, the difference between 60fps and 80fps. Noticeable gains. It's the difference between 1% lows never dipping below 60fps and not.

Performance per watt has been addressed by multiple people.

As for DLSS vs FSR, for all games supporting the latest version of FSR there is no visual difference between the two. Support for AMD/FSR is only going to grow as parts like Strix Halo in laptops remove the need for a discrete graphics card, along with next-gen handheld products like a theoretical Steam Deck 2, etc. FSR is also hardware agnostic, meaning it will continue to work well on "old tech" AMD cards, and it has the advantage of working on nVidia cards as well. It's tech that only has to be implemented once to support both. Frankly, it's better for dev time.
I agree that 15% is a good increase, but the instances where that's the difference between hitting 60fps or not are going to be at 4K the vast majority of the time, and I don't think anyone buying a 4070 is really looking at 4K. At 1080p/1440p, where these buyers are looking, the margins are slimmer, and I wouldn't be surprised to see the 4070 get much closer as time goes on (maybe not, but we'll see).

There have been multiple tests of FSR vs DLSS, and DLSS comes out on top far, far more often. FSR is hardware agnostic, but FSR3 + FG appears to need RDNA3 and its rejiggered optical flow hardware. It also doesn't help that you can upgrade older DLSS games to the new versions simply by dragging a DLL file into the folder (and apparently GFX does this automatically for some games as well), whereas with FSR you're stuck on old and busted versions unless the dev goes back to update it. So those old FSR games will always look bad while the old DLSS games continue to improve; FSR actually makes games worse over time since you're stuck with an old version.
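
That DLL swap is trivial to script, for what it's worth. A minimal sketch, assuming the standard nvngx_dlss.dll filename and that you've already downloaded a newer copy yourself; both paths are made up:

```python
# Sketch of the DLSS DLL swap described above: back up each game's
# bundled nvngx_dlss.dll, then drop a newer copy in its place.
# Both paths below are hypothetical; point them at your own installs.

import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, new_dll: str) -> None:
    for old in Path(game_dir).rglob("nvngx_dlss.dll"):
        backup = old.with_name("nvngx_dlss.dll.bak")
        if not backup.exists():
            shutil.copy2(old, backup)   # keep the shipped version around
        shutil.copy2(new_dll, old)      # overwrite with the newer DLL
        print(f"updated: {old}")

swap_dlss_dll(r"C:\Games\SomeGame", r"C:\Downloads\nvngx_dlss.dll")
```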

And finally, everything I've said has to do with price to performance. My stance is clear. I even stated above that if the 4070 were $400 I would recommend it to everyone. And I would. nVidia deserves no money because their product stack makes no sense. And it's clear anyone who isn't an nVidia shill agrees with this position. Again, how do we know? Because we have 4070s stacking up in inventory around the globe, not being sold. The 4070 isn't a "bad" card; it's an incredibly poorly priced one.

So you can try and say that it's an issue with "me" or you can actually pay attention to the much wider sentiment about nVidia's price to performance and what cards are actually selling. 'Cause unless it's a 4090, it ain't nVidia right now. And even that card has more or less already saturated the market. Everyone that wants one (or perhaps better said, can afford one) already has it.
OK, all you care about is price to performance, while many others who plan on keeping their new card for years will want the whole package. RDNA3 is also not great at price-performance, even if it's better than Ada in that metric. I'd like to see some actual data on the 40 series not selling, as it seems they're getting into the hands of gamers at a much higher volume than RDNA3.


All I'm saying is that the 4070's raster/RT/features/power usage/etc. all look pretty darn good. Sure, the 6950XT has better raster, but it's worse in every other way. At current pricing I would definitely pick up a 4070 over an older card from either brand. But I even said before that if you're in the market for one of these $500-600 GPUs, you might as well wait and see if AMD drops a 7700XT, as it should compare quite favorably (6900/6950XT performance at $500ish with modern features/upgrades sounds damn good to me).
 
15% additional performance is not "a bit". Again, the difference between 60fps and 80fps.
No, it isn't. It's 10 percent more performance, aka 60 vs 66 fps.

Your math is wrong too: the numbers you're giving work out to 33% more performance, not 15. And it isn't even 15, it's 10.
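
Just to pin the arithmetic down, since you two are talking past each other:

```python
# The percentage math both sides are arguing about, spelled out.

def pct_faster(base_fps: float, new_fps: float) -> float:
    return 100.0 * (new_fps - base_fps) / base_fps

print(f"{pct_faster(60, 80):.1f}%")  # 33.3 -> 60 vs 80 fps is a 33% gap
print(f"{60 * 1.15:.1f} fps")        # 69.0 -> a 15% uplift on 60fps
print(f"{60 * 1.10:.1f} fps")        # 66.0 -> a 10% uplift on 60fps
```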
 
No, it isn't. It's 10 percent more performance, aka 60 vs 66 fps.

Your math is wrong too: the numbers you're giving work out to 33% more performance, not 15. And it isn't even 15, it's 10.
Ehh, overall it's like 10-15% faster at 1080p/1440p and 20% at 4K going by TechSpot. TPU's numbers put the 6950XT 5-10% higher than the 6900XT, so overall I'd say it works out to 10-15% faster on average than the 4070.

I wouldn't really consider the 4070 a 4K card, though, and it compares more favorably at 1080p/1440p, where you'd expect the buyers of this card to be.
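
For reference, here's how I'm chaining those numbers; the inputs are placeholder readings off the charts, not exact values:

```python
# Chaining relative-performance figures: TPU's summary charts omit the
# 6950XT, so scale the 6900XT figure by the measured 6900XT->6950XT gap.
# Both inputs are placeholder readings from the charts, not exact values.

def rel_6950xt(rel_6900xt_vs_4070: float, uplift_6950_over_6900: float) -> float:
    return rel_6900xt_vs_4070 * (1.0 + uplift_6950_over_6900)

# e.g. if the 6900XT reads ~105% of a 4070 and the 6950XT is 5-10% faster:
for uplift in (0.05, 0.10):
    print(f"6950XT ~ {rel_6950xt(1.05, uplift) * 100:.0f}% of a 4070")
```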
 
Polaris/Vega are not really actively supported outside of a few bug fixes here and there, which is about where we're at with RDNA2. Just look at a few months ago, when AMD turned all their attention away from the previous cards because the 7000 series was so busted at release. Hell, it wasn't that long ago that AMD advertised primitive shader support as coming to Vega drivers for months before saying it was too hard and flat-out abandoning it. But at least now with RDNA3 (or Ada) you're getting the best support that either company is going to offer.
People use the cards they have. The RX580 has lived a long life as has the 1660Ti. None of these cards have failed in performance relative to what they were capable of in the first place.

All of these cards are supported.
Yes, I already said, at least twice, that I wouldn't recommend an older but faster NV card at the same price...
Then I'd say you're foolish. Just talk to a 1080Ti owner who bought at launch and is still using the card today.
I agree that 15% is a good increase, but the instances where that's the difference between hitting 60fps or not are going to be at 4K the vast majority of the time, and I don't think anyone buying a 4070 is really looking at 4K. At 1080p/1440p, where these buyers are looking, the margins are slimmer, and I wouldn't be surprised to see the 4070 get much closer as time goes on (maybe not, but we'll see).
A $600 card should give you the ability to game at 4K. The RX6950XT was AMD's top card last gen, and people on AMD used it for that purpose, as they did with the 3090 and 3090Ti.

If nVidia hadn't gimped the 4070 and 4070Ti with 12GB of RAM, they would very likely be 4K-capable cards. In fact, a good chunk of the reason the 4070 and Ti sit where they do is specifically that nVidia is being cheap.
There have been multiple tests of FSR vs DLSS, and DLSS comes out on top far, far more often. FSR is hardware agnostic, but FSR3 + FG appears to need RDNA3 and its rejiggered optical flow hardware. It also doesn't help that you can upgrade older DLSS games to the new versions simply by dragging a DLL file into the folder (and apparently GFX does this automatically for some games as well), whereas with FSR you're stuck on old and busted versions unless the dev goes back to update it. So those old FSR games will always look bad while the old DLSS games continue to improve; FSR actually makes games worse over time since you're stuck with an old version.
Maybe yes, maybe no. It's dependent on the dev and what they want to implement.

However, moving forward to UE5, FSR and DLSS will look the same, as it's Epic doing all the work to support things in-engine. And again, 16GB of RAM vs 12GB is going to make a big difference, as geometry, shaders, and textures are only going up.
OK, all you care about is price to performance, while many others who plan on keeping their new card for years will want the whole package. RDNA3 is also not great at price-performance, even if it's better than Ada in that metric. I'd like to see some actual data on the 40 series not selling, as it seems they're getting into the hands of gamers at a much higher volume than RDNA3.
Subjective.
All I'm saying is that the 4070's raster/RT/features/power usage/etc. all look pretty darn good. Sure, the 6950XT has better raster, but it's worse in every other way.
You've said this twice, and I disagree. You should check out the thread where they put 16GB of RAM on a 3070Ti. It's "sad" that in a single generation nVidia has made cards irrelevant due to greed.

It's relevant here because a 6800XT runs RE4 faster than an 8GB 3070Ti, even with RT on! That should never be the case, and it came down to VRAM. I'd say that issue is going to keep rearing its head this generation. The VRAM also makes an image-quality difference, because it's the difference between Medium and High, or High and Ultra, and in some games between having all your textures present and constant LOD pop-in, with the only fix being to lower things like texture quality (that's what the Witcher 3 patch did).

I believe that same pattern is going to continue with the 4070 and Ti stuck at 12GB. That final 4GB, whether you argue it's down to optimization or not, is going to make very real performance and image-quality differences.

Even with "balanced" performance settings, I'd say that AMD wins more often that not here, when using the RT features that make the biggest IQ improvements and turning off improvements that are less noticeable like RT reflections.
At current pricing I would definitely pick up a 4070 over an older card from either brand. But I even said before that if you're in the market for one of these $500-600 GPUs, you might as well wait and see if AMD drops a 7700XT, as it should compare quite favorably (6900/6950XT performance at $500ish with modern features/upgrades sounds damn good to me).
You're welcome to your opinion here. I obviously disagree.

As for sales numbers, all of that is coming out through quarterly reports and basically all the insider information showing cards sitting on shelves. GN even talked to people in Taipei, as an example, and reported that cards aren't selling, as have employees at places like MicroCenter and Best Buy.

Even here on the HardForum, where people buy more computer stuff than most, the sentiment is mostly against nVidia. I'd simply say you have to "read the room".
 
If you're just gaming, AMD's existing RX 6xxx series seems to win on value, even if it's less power efficient.

Feature-wise, today, I'm not sure anyone is giving Nvidia a run for their money, though I see Intel ahead of AMD. So Nvidia holdouts are probably holding out for something that will simply never exist (talking non-Nvidia solutions); you're moving to the 40 series (likely), you just don't know it yet. Just be careful: Nvidia may have "poor value" cost-wise, but things like the 4060 may set a new low bar for lack of value. Be warned. That is, there are bad Nvidia deals, and really, really, really bad ones (the 4060 may be "cheaper", but it's less overall value).

Lastly, and it depends, holders of really old Nvidia cards (who need Nvidia features) might be looking for "better value" in the used 30 series, but those are overpriced (buyer beware). Right now, IMHO, it's not a great time to be an Nvidia user. I'd love to see used 30-series prices drop greatly; then there's a clearer path to slightly updating and waiting (for pricing sanity). Perhaps it's the ultra-high prices of the 40 series that are keeping used 30-series pricing inflated today(?).

Seems if you're team green, green == money. Just the way it is today.
 
Here's Steve at HWUB saying pretty much exactly what I've been saying. The 6950XT is fine but not really compelling as it's old tech.




If you go to MicroCenter and talk to the employees, they'll tell you that the 4000 series is selling about like the older cards did (cards didn't sell out before unless they were very recently released or good for mining) and that the 7900 series is seeing a lot of returns. One of the guys in the department said they've seen a ton of 7900 returns due to coil whine and/or temps, with many of those buyers moving to 4080s/90s. I personally know multiple people who had to wait for stock to snag 4080 FEs from BB as recently as a week ago.


https://www.techspot.com/community/...valves-steam-survey-returns-to-normal.280277/

The most recent April results put the GTX 1650 back on top thanks to its 2.15% increase, while the GTX 1060 returned to second place. Elsewhere, the RTX 3060 and RTX 2060 saw losses of -6.01% and -3.6%, respectively. But one thing that hasn't changed is the absence of the AMD Radeon RX 7900 XT and XTX from the main GPU table. Looking at the month's biggest gains, the GTX 1650 is top here, too. The best-performing Lovelace product is the RTX 4060 laptop GPU (a new entry), though all RTX 4000-series cards on the list made gains.
 
Sold the 3080Ti, keeping the 6900XT. I see 12GB as limiting, or likely to be down the road. I don't care if it's bad ports or not; I'd rather have something that doesn't have to be babied, or have to expect developers to somehow magically make perfect ports.

Also doesn’t matter if RT is better, the 3080TI is not fast enough for anything significant anyways.

As for the 12GB 4070, the price is too high; $449 would be OK for now.
 
My 2080Ti just died today, so now I'm forced to spend the cash. I must admit, now that I'm without a modern GPU, the 6950XT at $630 is a banger deal. But at the same time the 4070 is newer, and I'm already having black-screen flashes on the old RX570 I have installed now. For another $200 the 4070Ti is pretty nice, but the 7900XT is like 8% faster at the same price. And if one is already going to drop $800, then $1000 for the 7900XTX isn't looking so bad for the best AMD has. Still, if you're going to drop $1k, another $200 for a 4080 should be doable. Yet if you're going to go that far, another $300 to get a 4090 doesn't sound so bad...
 
Am I the only one here who does not give a damn about power consumption?
Well, I do, but not in the use-case sense; it's that I'd likely require a new PSU over my 750W Seasonic. The price of a new PSU when my current one works perfectly is a bit hard to swallow these days. Are people mining on PSUs these days? /s
 
My 2080Ti just died today, so now I'm forced to spend the cash. I must admit, now that I'm without a modern GPU, the 6950XT at $630 is a banger deal. But at the same time the 4070 is newer, and I'm already having black-screen flashes on the old RX570 I have installed now. For another $200 the 4070Ti is pretty nice, but the 7900XT is like 8% faster at the same price. And if one is already going to drop $800, then $1000 for the 7900XTX isn't looking so bad for the best AMD has. Still, if you're going to drop $1k, another $200 for a 4080 should be doable. Yet if you're going to go that far, another $300 to get a 4090 doesn't sound so bad...
LoL! Yes, modern problems :D
But why the black screens with the 570? I'm more concerned with that. The 2080Ti dies and the RX570 is flashing black? Get her fixed, dude.
 
Am I the only one here who does not give a damn about power consumption?
I do if it means having to spend another $200+ on a good PSU, and then having to go through the hassle of replacing my old one, cable management, etc.

For people running 600W-750W power supplies, the 6950XT is basically a no-go, and that $200+ on top of the GPU all of a sudden puts the 6950XT at 4070Ti prices, with less performance. At that point, just get a 4070Ti and save yourself the headache of swapping out a power supply.
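
Worked out with the numbers floating around this thread (the $200 PSU figure and the ~$799 4070Ti street price are my assumptions):

```python
# All-in cost once a PSU swap enters the picture.
# Prices are assumptions pulled from earlier in this thread.

builds = {
    "RTX 4070 (keep existing PSU)":   599 + 0,
    "RX 6950XT (needs PSU upgrade)":  630 + 200,  # assumed $200 PSU
    "RTX 4070Ti (keep existing PSU)": 799 + 0,    # assumed street price
}
for name, total in builds.items():
    print(f"{name}: ${total} all-in")
```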
 
I care because of the heat output in my room. It sucks having a high wattage card in the summer...
Yup. Our "office" room is usually a good 10 degrees warmer than the hallway right outside the door. 2 humans, 2 pcs, a little file server, 3 3d printers, and assorted networking gear in the closet.....the watts add up.
 
4070 to 6950XT looks more like a 10% difference. The 4070 is also quieter, cooler, smaller, and much more power efficient. Then add in DLSS 3, Ray Tracing about equal to a 7900XT, and the usual Nvidia extras and I'd take the 4070 over the 6950XT any day of the week.
For GAMING ONLY, the 6950 XT is better, though not as efficient. The 4070 is slightly 'inferior' in gaming but is a doable card for productivity while being very low-power and efficient, plus it offers more features, like AV1 encoding and decoding. So unless all you do is play games, the 6950 XT is a hard sell, or at least it's difficult to convince me to get one, and I want to go AMD.
The productivity options AMD provides are really wanting. The 4070 is a 'crippled' card, and shame on Nvidia, but the AMD cards, especially the 6000 series, don't provide enough in that feature-set area to be very enticing. That point should be out there, but few people look at it. Why?
 