AMD Plays the VRAM Card Against NVIDIA

erek · [H]F Junkie · Joined Dec 19, 2005 · Messages: 10,894
Idk what this is all about, but it seems to be causing quite a stir elsewhere

“AMD does have a point here, but as the company has yet to launch anything below the Radeon RX 7900 XT in the 7000-series, AMD is mostly comparing its 6000-series cards with NVIDIA's 3000-series cards, most of which are getting hard to purchase and potentially less interesting for those looking to upgrade their system. That said, AMD also compares its two 7000-series cards to the NVIDIA RTX 4070 Ti and the RTX 4080, claiming up to a 27 percent lead over NVIDIA in performance. Based on TPU's own tests of some of these games, albeit most likely using different test scenarios, the figures provided by AMD don't seem to reflect real-world performance. It's also surprising to see AMD claim its RX 7900 XTX beats NVIDIA's RTX 4080 in ray tracing performance in Resident Evil 4 by 23 percent, where our own tests show NVIDIA in front by a small margin. Make what you want of this, but one thing is fairly certain: future games will require more VRAM, but most likely the need for a powerful GPU isn't going to go away.”


Source: https://www.techpowerup.com/307110/amd-plays-the-vram-card-against-nvidia
 
Good on AMD to finally take advantage of Nvidia's reluctance to include more VRAM. Nvidia releasing the RTX 4070 does give reviewers a chance to test AMD's newer drivers, which do seem to have given their GPUs better performance. That said, if Intel can include 16GB of VRAM on their under-$400 GPU....
 
How about lowering your prices and fixing the idle power consumption on your latest cards before you talk shit? Amazing to me that they can release buggy shit year after year, overcharge for it, and talk shit. I like the 6800 XT a lot, but it still has issues I didn't have with my 3060 Ti.
 
I wish their Windows drivers were better. Would instantly buy one of their high memory cards.

Ironically, the open source Linux driver seems to be very good now, but that doesn't help my Windows gaming machine with older games. And there is no ffmpeg (video encoding) acceleration under Linux with AMD GPUs.
 
How about lowering your prices and fixing the idle power consumption on your latest cards before you talk shit? Amazing to me that they can release buggy shit year after year, overcharge for it, and talk shit. I like the 6800 XT a lot, but it still has issues I didn't have with my 3060 Ti.
I have a 6950 XT (I got it for $700); it idles around 29 watts and it's not been buggy at all, definitely no more than any other card I've owned, and my Steam library is pretty damn big.
 
Not to mention them neutering the memory bus, which gives the 4070 Ti, which was supposed to be the 4080, half the bandwidth of its big brother, and it rolls downhill from there.

[memory bandwidth comparison charts]


Look at that 4060 Ti... OUCH!!! I'm getting 384.6 GB/s on a 980 Ti from 2015! *edit: and they're branding it RTX like you'd really be able to run ray tracing with that... prob got 4K plastered all over the box too
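If anyone wants to sanity-check those bandwidth numbers, peak memory bandwidth is just the effective data rate per pin times the bus width. A minimal sketch in Python (the data rates and bus widths below are rough illustrative assumptions for typical stock configs, not figures pulled from the charts above):

```python
# Peak memory bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8
def mem_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# Illustrative configurations (assumed, approximate stock specs):
print(mem_bandwidth_gbs(7.0, 384))   # ~336 GB/s, 980 Ti-class card (384-bit GDDR5)
print(mem_bandwidth_gbs(21.0, 192))  # ~504 GB/s, 4070 Ti-class card (192-bit GDDR6X)
print(mem_bandwidth_gbs(22.4, 256))  # ~717 GB/s, 4080-class card (256-bit GDDR6X)
```

Which is why a narrower bus stings so much unless a faster data rate or a bigger cache makes up for it.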

How much do you want to bet this is gonna be one of those series that sticks around for a loooonng time? Because they'll do a "Super" refresh, or whatever they're gonna call it this time, with slightly higher clocks and the upgraded memory specs we should have gotten out of the gate.
 
I have a 6950 XT (I got it for $700); it idles around 29 watts and it's not been buggy at all, definitely no more than any other card I've owned, and my Steam library is pretty damn big.
The 7900 XT and XTX have bad idle power usage when using more than one monitor (like 100 watts plus). My 6800 XT still flickers occasionally on my 4K monitor only. It's a great card, but it still has issues.
 
It's $609 with $20 promo code VGAEXCAA496. Nvidia fucked up.
We'll see if it sells. People will still buy the 4070 in larger numbers than the 6950 XT. No one buys AMD. Even with an objectively superior software stack, the performance difference here is significant.
 


I posted this video in the 4070 thread... but it seems like it fits here as well.
That Nvidia is going to try and push a 4060 Ti with 8GB of VRAM is beyond laughable imo. Even at 1080p it's just not usable anymore unless you're playing 4-year-old games.
 
The 7900 XT and XTX have bad idle power usage when using more than one monitor (like 100 watts plus).
Pretty sure that was fixed a while ago. Nvidia cards have had a similar issue to that as well.
My 6800 XT still flickers occasionally on my 4K monitor only. It's a great card, but it still has issues.
My 6950 never flickers. Check your cables.
 
Pretty sure that was fixed a while ago. Nvidia cards have had a similar issue to that as well.

My 6950 never flickers. Check your cables.
I've replaced my cables twice now and it still flickers. The idle power usage has not been fixed.
 
I've replaced my cables twice now and it still flickers. The idle power usage has not been fixed.
Multi-monitor power usage is high on Nvidia cards as well. I should know; I run a spare monitor next to my OLED.

When it comes to flickering, my 3090 flickered for a long time. Figured it was an Nvidia problem. Nope, you need high-quality cables to run 4K 120Hz with HDR. You buy cheap… well, you know the rest.
 
Multi-monitor power usage is high on Nvidia cards as well. I should know; I run a spare monitor next to my OLED.

When it comes to flickering, my 3090 flickered for a long time. Figured it was an Nvidia problem. Nope, you need high-quality cables to run 4K 120Hz with HDR. You buy cheap… well, you know the rest.
I bought expensive DisplayPort 2.0 cables; I'll try another kind if you have a recommendation.
 
I bought expensive DisplayPort 2.0 cables; I'll try another kind if you have a recommendation.
Before even going that far, have you checked to see if the monitor or GPU has a BIOS update? That has happened in the past with numerous video cards. Sometimes it was a third-party BIOS issue, sometimes it was an AMD/Nvidia issue. Brown screen of death, EVGA GTX 460s, etc.
 
I have a 3070, but I play with an RX 6700 10Gb in Fortnite in 1440p or 4k and recorded with Re Live.

 
Before even going that far, have you checked to see if the monitor or GPU has a BIOS update? That has happened in the past with numerous video cards. Sometimes it was a third-party BIOS issue, sometimes it was an AMD/Nvidia issue. Brown screen of death, EVGA GTX 460s, etc.
Yeah, I'm on the latest firmware for my monitor, if that's what you mean. Latest video drivers, the card is 5-6 months old, latest VBIOS.
 
Before even going that far, have you checked to see if the monitor or GPU has a BIOS update? That has happened in the past with numerous video cards. Sometimes it was a third-party BIOS issue, sometimes it was an AMD/Nvidia issue. Brown screen of death, EVGA GTX 460s, etc.
There are just so many things that can go wrong in the pipeline from game software to what you see on the screen that people don't bother to isolate half of it and instead go "company X sux!". Shoot, I thought my 6800 XT was giving me issues, almost bought into the usual "it's the drivers"... nope. It was the motherboard giving me instability. The card works just fine with a new AM5 setup.
 
Pretty crazy that everyone's take in this thread is 'NV is screwing everyone over on VRAM' and not 'AMD is outright lying about performance in their graphs'.
 
Pretty crazy that everyone's take in this thread is 'NV is screwing everyone over on VRAM' and not 'AMD is outright lying about performance in their graphs'.
No one is saying that AMD didn't lie. Difference is we know the 7900 series failed and AMD did lie. But at least they aren't running out of VRAM in games.

Also, that multi-monitor benchmark you posted is old and AMD fixed the issue.
 
It's better but still much worse than any other GPU ever...
[TPU multi-monitor power consumption chart]


Then you have stuff like video playback power consumption, much worse (higher) power spikes, and 60Hz-locked results:
[TPU video playback power consumption chart]
 
It's better but still much worse than any other GPU ever...
[TPU multi-monitor power consumption chart]

Then you have stuff like video playback power consumption, much worse (higher) power spikes, and 60Hz-locked results:
[TPU video playback power consumption chart]

So you went and found the updated graphs. lol
I don't think anyone said AMD had a perfect product.
But if car manufacturer A has a recall.... that doesn't mean car manufacturer B gets to put out shit because, hey, the other guy had a recall, you know.

AMD's latest gen is their first-gen chiplet GPU... yep, it has some issues. The multi-monitor power draw now, however, is not an issue. It's 6 watts more than the Nvidia card it was released to compete with... and yes, Nvidia has gotten it down further with the 4090. No one is disputing that Nvidia takes that win... but it's not the 100 watts it was with the launch driver. Again, video playback power use isn't far off the 3090 Ti the 7900s were built to compete with; you can't compare the 7900 to the 4070, they are a completely different class of card. We haven't seen AMD's 4070 competitor yet. I believe it's non-chiplet... so it will probably draw power closer to the 6700, which draws ONE watt more than the 4070, which is margin of error.

I'll take a bit more power draw watching videos (again, not that much more when you compare apples to apples) if it means I'm not going to be running out of VRAM in games.
 
The funniest part is you saw no one complain about Nvidia power usage in multi-monitor last gen. It's only a big deal now since it's a new gen!

Look at the 3090 and 3090 Ti!! lol People will find ANYTHING to make a company look bad now.
 
The funniest part is you saw no one complain about Nvidia power usage in multi-monitor last gen. It's only a big deal now since it's a new gen!

Look at the 3090 and 3090 Ti!! lol People will find ANYTHING to make a company look bad now.
It's because they were basically the same LAST gen.
 
So you went and found the updated graphs. lol
I don't think anyone said AMD had a perfect product.
But if car manufacturer A has a recall.... that doesn't mean car manufacturer B gets to put out shit because, hey, the other guy had a recall, you know.

AMD's latest gen is their first-gen chiplet GPU... yep, it has some issues. The multi-monitor power draw now, however, is not an issue. It's 6 watts more than the Nvidia card it was released to compete with... and yes, Nvidia has gotten it down further with the 4090. No one is disputing that Nvidia takes that win... but it's not the 100 watts it was with the launch driver. Again, video playback power use isn't far off the 3090 Ti the 7900s were built to compete with; you can't compare the 7900 to the 4070, they are a completely different class of card. We haven't seen AMD's 4070 competitor yet. I believe it's non-chiplet... so it will probably draw power closer to the 6700, which draws ONE watt more than the 4070, which is margin of error.

I'll take a bit more power draw watching videos (again, not that much more when you compare apples to apples) if it means I'm not going to be running out of VRAM in games.
Yes, I went and pulled the current graphs. Like I said, I was previously looking at the 6950 XT charts and the 4070 charts at the same time. Is one not supposed to correct themselves?

I didn't say anything about AMD or NV being perfect or not perfect. The 7900 is meant to compete with the 40xx, where it's over double the usage. Saying that it's meant to compete with the 30xx series is ridiculous.
 
Literally complaining over a 17-22W difference. Of all the things to make a fuss about, that has to be the silliest.

Quick, better put in a percentage to make it sound huge.
Anything to make Nvidia look better! Even if it's 17W! LOL
 
Literally complaining over a 17-22W difference. Of all the things to make a fuss about, that has to be the silliest.

Quick, better put in a percentage to make it sound huge.
In this case it does matter, since it affects idle. It matters for a machine that's on 24/7.
 
In this case it does matter, since it affects idle. It matters for a machine that's on 24/7.
Might as well say 5W makes a difference. This isn't an extra 60-100W. It's 20.

Maybe if you live in one of those energy-deprived areas with 60-cent/kWh rates it could add up.

It's still by far the weakest thing to nitpick about unless you're one of those people who get excited about chasing down every percent of efficiency.
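For a rough sense of scale, here's a quick sketch of what a constant extra draw actually costs over a year (the 20 W figure and the electricity rates are just illustrative assumptions):

```python
# Annual electricity cost of a constant extra power draw on an always-on machine.
def annual_cost_usd(extra_watts: float, rate_per_kwh: float) -> float:
    kwh_per_year = extra_watts / 1000 * 24 * 365  # 175.2 kWh/year for a 20 W draw
    return kwh_per_year * rate_per_kwh

print(round(annual_cost_usd(20, 0.15), 2))  # ~26.28 at an assumed ~$0.15/kWh rate
print(round(annual_cost_usd(20, 0.60), 2))  # ~105.12 at the 60-cent/kWh rate mentioned above
```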
 
The 7900 XT and XTX have bad idle power usage when using more than one monitor (like 100 watts plus). My 6800 XT still flickers occasionally on my 4K monitor only. It's a great card, but it still has issues.
I have this issue with my RX 6800, but only on the monitor on the HDMI port; it does a random flicker. Reading from the past, it was an issue on their 5000 series as well that they claimed to have fixed but apparently never did. Since I run 4 monitors, I have no plans to get a 7000-series card, not to mention my RX 6800 handles what I need it to anyway.
 
Pretty crazy that everyone's take in this thread is 'NV is screwing everyone over on VRAM' and not 'AMD is outright lying about performance in their graphs'.
Holy shit, I thought I was the only one to notice and roll my eyes at AMD's chart. What the fuck were they smoking when they made it?
[AMD's performance comparison chart]

[TPU 4K benchmark charts: Borderlands 3, Cyberpunk 2077, F1 2022, Far Cry 6, Forza Horizon 5, God of War, Resident Evil Village, Watch Dogs: Legion]


And those are just the non-RT benches they are lying about. The RT benches they put in their chart as wins are just a disgusting straight-out lie! Even when the 7900 XT keeps up, it's margin of error between the two, and yet they have games on the chart claiming they are up to 23% ahead of the 4080 in RT.

[TPU Far Cry 6 ray tracing 4K benchmark chart]


How is this not bothering anyone else? Had it been any of their competitors, this forum would be out for blood. :coffee:
 
Far Cry 6 is a garbage metric for ray tracing anything; it is such a shit and janky implementation of the tech that it should be the poster child for how not to do something.
But yes, those slides are garbage, obviously and intentionally misleading.
And conducted using some sort of best-for-us vs. worst-for-them testing.
And it should be noted that in Canada the price difference between the 4080 and the 7900 XTX is about $200 CAD, which is substantially less than $200 USD, as neither of those cards is being sold up here at MSRP; rebates ranging from $60 to $230 exist for the 4080s, but the 7900 XTX is at or above MSRP.
 
And those are just the non-RT benches they are lying about. The RT benches they put in their chart as wins are just a disgusting straight-out lie! Even when the 7900 XT keeps up, it's margin of error between the two, and yet they have games on the chart claiming they are up to 23% ahead of the 4080 in RT.
Yeah, the non-RT ones seem whack. However, that FC6 RT chart is actually better than the AMD chart? The 7900 XTX is 17% higher than the 4080 (104/89... am I doing that right?). The chart says 12%.

*Edit* Some of the non-RT ones are fine, too. Sometimes also better than the AMD chart.
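For what it's worth, here's the percentage math being done there, as a quick sketch (the 104 and 89 FPS figures are just the numbers quoted in the post above, used for illustration):

```python
# Relative performance from two average frame rates (illustrative values from the post above).
xtx_fps, rtx4080_fps = 104, 89

lead = (xtx_fps / rtx4080_fps - 1) * 100      # how much faster the XTX is: ~16.9%
deficit = (1 - rtx4080_fps / xtx_fps) * 100   # how much slower the 4080 is: ~14.4%

print(f"XTX leads by {lead:.1f}%, 4080 trails by {deficit:.1f}%")
```

So ~17% is right when the 4080 is the baseline; note the number shrinks to ~14% if you flip the baseline, which is why it always matters which card a percentage is measured against.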

[TPU performance matchup chart]
 