RDNA2 vs Ampere Gaming Rasterization Performance Compilation

noko

This is AMD-presented performance. I compiled the data into a single chart for comparison, one for 1440p and one for 4K. Since these were tested on a Zen 3 platform with Smart Access Memory enabled, results could vary a lot on different configurations.
Source: https://www.amd.com/en/gaming/graphics-gaming-benchmarks
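
If anyone wants to build the same kind of chart from their own numbers, here is a minimal Python/matplotlib sketch of the general approach; the game list and FPS values in it are placeholders for illustration, not AMD's actual figures:

[CODE]
# Minimal sketch of compiling per-game FPS results into one grouped bar chart
# per resolution. The FPS values below are made-up placeholders, NOT AMD's figures.
import matplotlib.pyplot as plt
import numpy as np

games = ["Battlefield V", "Borderlands 3", "Doom Eternal", "Gears 5"]
fps = {
    "RX 6800 XT": [150, 90, 180, 110],   # placeholder numbers
    "RTX 3080":   [145, 88, 175, 112],   # placeholder numbers
}

x = np.arange(len(games))
width = 0.35
fig, ax = plt.subplots(figsize=(8, 4))
for i, (card, values) in enumerate(fps.items()):
    ax.bar(x + i * width, values, width, label=card)

ax.set_xticks(x + width / 2)
ax.set_xticklabels(games, rotation=20, ha="right")
ax.set_ylabel("Average FPS (2560x1440)")
ax.set_title("Per-game comparison (illustrative layout only)")
ax.legend()
plt.tight_layout()
plt.savefig("2560x1440_comparison.png")
[/CODE]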

Looking at 1440p, which is not an OC condition, AMD is looking really good. The first chart is at 1440p, where even the Radeon RX 6800 beats the 3090 in two games! The 6800 XT wins in four games and the 6900 XT in seven!

2560x1440corrected.png


At 4K, where Ampere can more effectively use the doubled FP32 units in its SMs, it performs better, but AMD is still kicking ass, with the 6900 XT winning in five games, and at a much lower price point.

3840x2160Corrected.png



If these RDNA2 cards OC well, something the Ampere cards do not do, it will just be more syrup over the ice cream.

While AMD was accurate, and maybe even conservative, with Zen 3, we still need way more game tests, ray tracing tests for those more interested, driver state, any odd behavior and so on. Anyway, from the above perspective AMD is looking hot at this point. Comments on what you expect are welcome; I am eagerly awaiting actual independent reviews.

System Configuration

CPU: AMD Ryzen 9 5900X
System Memory: 16GB DDR4-3200MHz
Motherboard: X570 Reference Platform
System BIOS: RQ21082B - AMD Smart Access Memory Enabled
OS: Win10 Pro x64 19041.508
Radeon Driver Version: 20.45-201013n
GeForce Driver Version: 456.71

Game Settings

BF V: Ultra
Borderlands 3: Badass
COD Modern Warfare: Ultra, Filmic SMAA T2x, 16X AF
The Division 2: Ultra, TAA SS High, 16X AF
Doom Eternal: Ultra Nightmare
Forza Horizon 4: Ultra
Gears 5: Ultra
Resident Evil 3: Ultra, FXAA+TAA, 16X AF
Shadow of the Tomb Raider: Highest, no AA
Wolfenstein Youngblood: Mein Leben
 
While these look promising,

I don't believe them for a second.

IF AMD had this level of performance in the chamber, we would have heard about it, and AMD would have used RT in their comparison benches.
 
While these look promising,

I don't believe them for a second.

IF AMD had this level of performance in the chamber, we would have heard about it, and AMD would have used RT in their comparison benches.
I KNOW. Like AMD is not even a real company, as they sell rebranded 2nd-stock garbage leftovers. Just one guy in a sweatshop stenciling AMD.
 
Let’s wait for reviews. Raytracing ≠ rasterization so performance on one doesn’t dictate the other. And just because they didn’t boast about performance doesn’t mean it won’t be there (see every single ryzen generation).
 
While these look promising,

I don't believe them for a second.

IF AMD had this level of performance in the chamber, we would have heard about it, and AMD would have used RT in their comparison benches.
I am unsure of the nuance between everyone reading it on the AMD website and seeing AMD's presentation about it, versus people "hearing about it."

Why would you not believe AMD's rasterization benchmarks just because they would have used RT in their comparison benches if those numbers were good? If AMD's numbers look better without RT but not with RT, maybe they would hide the RT ones.

Now, here is where the speculation goes: I think many would have felt two months ago that if AMD was just a bit slower in RT and faster without it, they would have been really open about it and shown it, so the fact that they hide those numbers would mean a lot.

But that was when it seemed like a pipe dream that they would have the faster 4K cards. Now that it looks like a fact, keeping the "but slower in RT" narrative (even if it is not by much) from getting around could be a strategy. I feel we will have to wait, and we cannot conclude much about how much slower they will be in RT (or not slower at all but just not looking as good).
 
I KNOW. Like AMD is not even a real company, as they sell rebranded 2nd-stock garbage leftovers. Just one guy in a sweatshop stenciling AMD.

Not that, it's just that AMD shows itself in the best light. If the RX6K series beats Nvidia in RT games, they would be showing that in the dozens of benchmarks they've made public.
 
I am unsure of the nuance between everyone reading it on the AMD website and seeing AMD's presentation about it, versus people "hearing about it."

Why would you not believe AMD's rasterization benchmarks just because they would have used RT in their comparison benches if those numbers were good? If AMD's numbers look better without RT but not with RT, maybe they would hide the RT ones.

Now, here is where the speculation goes: I think many would have felt two months ago that if AMD was just a bit slower in RT and faster without it, they would have been really open about it and shown it, so the fact that they hide those numbers would mean a lot.

But that was when it seemed like a pipe dream that they would have the faster 4K cards. Now that it looks like a fact, keeping the "but slower in RT" narrative (even if it is not by much) from getting around could be a strategy. I feel we will have to wait, and we cannot conclude much about how much slower they will be in RT (or not slower at all but just not looking as good).


I read the title wrong; I thought these were "leaked" RT numbers, not raster-only.

Raster only makes MUCH more sense, and I believe them.
 
Not that, it's just that AMD shows itself in the best light. If the RX6K series beats Nvidia in RT games, they would be showing that in the dozens of benchmarks they've made public.
AMD doesn't have to beat. It just has to COMPETE. I couldn't give a crap about a few FPS in ray tracing. It's like the Intel guys going on about "we're better in games" by a meaningless few FPS. So what? Which, btw, doesn't happen anymore.
 
AMD doesn't have to beat. It just has to COMPETE. I couldn't give a crap about a few FPS in ray tracing. It's like the Intel guys going on about "we're better in games" by a meaningless few FPS. So what? Which, btw, doesn't happen anymore.
You seem to be talking about two different things: what it has to do for you to buy a card vs. what it has to do for AMD's marketing team to decide to show it in their results.

I am not sure that AMD merely competing would be enough for them not to keep the lower score secret (i.e. I would not conclude that they are much slower just from the secrecy).
 
You seem to be talking about two different things: what it has to do for you to buy a card vs. what it has to do for AMD's marketing team to decide to show it in their results.

I am not sure that AMD merely competing would be enough for them not to keep the lower score secret (i.e. I would not conclude that they are much slower just from the secrecy).
I don't know if English is a foreign language to you, but I really can't decipher exactly what you are trying to convey.
 
I don't know if English is a foreign language to you, but I really can't decipher exactly what you are trying to convey.
That's a good intuition (about the second language).

I will try to keep it simple. Someone is saying:
If the RX6K series beats Nvidia in RT games, they would be showing that in the dozens of benchmarks they've made public.
To which you are answering:
AMD doesn't have to beat. It just has to COMPETE.

Implying that AMD is not competing with Nvidia on RT performance, otherwise they would be making it public, because they do not need to beat Nvidia for RT performance to still be a good sales pitch.

I disagree. You could be right, but maybe they are competing and close enough. Or, like I said, maybe you are talking about something other than the message you quoted, i.e. your personal preference, and just saying that for you, merely competing in RT is more than enough.
 
That's a good intuition (about the second language).

I will try to keep it simple. Someone is saying:
If the RX6K series beats Nvidia in RT games, they would be showing that in the dozens of benchmarks they've made public.
To which you are answering:
AMD doesn't have to beat. It just has to COMPETE.

Implying that AMD is not competing with Nvidia on RT performance, otherwise they would be making it public, because they do not need to beat Nvidia for RT performance to still be a good sales pitch.

I disagree. You could be right, but maybe they are competing and close enough. Or, like I said, maybe you are talking about something other than the message you quoted, i.e. your personal preference, and just saying that for you, merely competing in RT is more than enough.
When I say AMD just has to compete I mean EXACTLY that. I am NOT implying that AMD is not competing in the 6000 gen. AMD is most likely not pushing RT numbers because they are losing there. I *don't* think it is by much, just guessing from the raw horsepower numbers we've seen. Sony and MS are both betting on AMD RT performance in their next-gen games. I seriously doubt they would accept AMD handing them garbage hardware for their consoles and let their next-gen games fall flat on their faces.

In the grand scheme of things, the small performance difference between AMD and NV probably won't matter. Nor should it, really. Everyone is so hungry for high-performance GPUs and CPUs. You've struck gold just by getting something decent nowadays.
 
Now that the initial reviews are out, how does RDNA2 pan out against Nvidia Ampere? I am using Hardware Unboxed since they have a wider compiled data set of games plus some initial SAM (Smart Ass Memory :D) testing. The average graphs below are from their 3950X system, stock 6800 XT settings, no SAM, no OC, no Rage Mode, which they will update to the 5950X CPU in the future. The SAM tests were done with the 5950X but were limited. Good video to watch.

1080p averages, the 6800 XT is faster than the 3090:

1080pAvg.jpg
1440p, the 6800 XT sits between the 3090 and 3080:

1440pAvg.jpg
4K, the 6800 XT falls behind the 3080:

4Kavg.jpg
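
As a quick aside on how these multi-game averages are usually put together, here is a small Python sketch comparing a plain arithmetic mean with a geometric mean over a handful of per-game results. The numbers are made-up placeholders, not Hardware Unboxed's data (their real set covers 18 titles per resolution):

[CODE]
# Sketch of how an "18 game average" is typically built from per-game results.
# All FPS values here are hypothetical placeholders, not HWU's measurements.
from statistics import mean, geometric_mean

results_1440p = {  # card -> per-game average FPS
    "RX 6800 XT": [152, 96, 188, 121, 143],
    "RTX 3080":   [149, 94, 192, 118, 140],
    "RTX 3090":   [158, 99, 201, 124, 147],
}

for card, fps in results_1440p.items():
    # The arithmetic mean is what most "average FPS" bars show; the geometric
    # mean keeps one very high-FPS title from dominating the result.
    print(f"{card}: mean={mean(fps):.1f} fps, geomean={geometric_mean(fps):.1f} fps")
[/CODE]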
SAM testing in Valhalla was remarkably effective. This was with the 5950X; the 6800 XT was an incredible 40% faster than the 3080 at 1440p! Please note other sites have done their own SAM testing and in some games there is virtually zero performance gain; this game shows a lot:

SAM_Valhalla.jpg
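
For clarity on how a "40% faster" figure falls out of two average-FPS numbers, a tiny sketch; the inputs are placeholders, not the actual Valhalla results:

[CODE]
# How "card A is X% faster than card B" is derived from two averages.
# Placeholder inputs for illustration only.
def percent_faster(fps_a: float, fps_b: float) -> float:
    """Return how much faster A is than B, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

print(f"{percent_faster(112.0, 80.0):.0f}% faster")  # 112 vs 80 fps -> 40% faster
[/CODE]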
Many just assume Nvidia will be faster in RT. I would not use older non-DXR 1.1 games, which were also designed around RTX, as a good comparison. Dirt 5, an AMD-sponsored title using RT, looks like it is well optimized for RDNA2's RT hardware. Anyway, in this case RDNA2 has surprising results, but I would caution that it is too early to really tell the full RT gaming capability of either Ampere or RDNA2. The 6800 XT was 49% faster than the 3080 in this title. Will Nvidia get RT performance up in this title with game updates? Maybe:

1440pRTdIRT5.jpg


 
So Hardware Unboxed shows the XT a little ahead over their 18-game averages for everything under 4K, while TechPowerUp shows the XT a few percent behind at all resolutions. For my 3440x1440 100Hz UW, it will be game by game which one is better.

First one I can find with an AIO will be my upgrade from my 1080ti Hybrid. Similar pricing, similar performance. Just need one to buy!
 
So Hardware Unboxed shows the XT a little ahead over their 18-game averages for everything under 4K, while TechPowerUp shows the XT a few percent behind at all resolutions. For my 3440x1440 100Hz UW, it will be game by game which one is better.

First one I can find with an AIO will be my upgrade from my 1080ti Hybrid. Similar pricing, similar performance. Just need one to buy!
The 6800 XT and 3080 seem to perform about the same overall in rasterization, with some games favoring one or the other.
 
So Hardware Unboxed shows the XT a little ahead over their 18-game averages for everything under 4K, while TechPowerUp shows the XT a few percent behind at all resolutions. For my 3440x1440 100Hz UW, it will be game by game which one is better.

First one I can find with an AIO will be my upgrade from my 1080ti Hybrid. Similar pricing, similar performance. Just need one to buy!
Gamers Nexus tested a much more limited range of games, but they were able to OC for around a 5% gain; I hope to see more OC results. Anyway, the 16GB vs. 10GB is the selling point for me, though with today's games that may matter little overall. I am sure we will see more OC tests, including water cooling and liquid nitrogen, in the coming weeks. Also, I am not sure anyone has tested broadly with something like a 5950X with SAM enabled across all the games; that too I think will be covered more broadly over time.
 
Let’s wait for reviews. Raytracing ≠ rasterization so performance on one doesn’t dictate the other. And just because they didn’t boast about performance doesn’t mean it won’t be there (see every single ryzen generation).
Ray tracing performance won't matter for a few generations. I don't find games enjoyable at sub-100fps.
 
Gamers Nexus tested a much more limited range of games, but they were able to OC for around a 5% gain; I hope to see more OC results. Anyway, the 16GB vs. 10GB is the selling point for me, though with today's games that may matter little overall. I am sure we will see more OC tests, including water cooling and liquid nitrogen, in the coming weeks. Also, I am not sure anyone has tested broadly with something like a 5950X with SAM enabled across all the games; that too I think will be covered more broadly over time.
Yeah, the SAM effect makes me a little nervous. I like AMD's new CPUs, but my 8086K at 5.1GHz is no slouch and I have no real need to upgrade there. If Nvidia implements their own SAM as rumored and gains back 5% or whatever, then it gets even more interesting.
 
The TL;DR is the 3080 and 6800 XT are virtually identical. The 6800 XT wins a bit more at 1080p and in price, while the 3080 has the RT edge if you want to play games like Control or WD Legion.

SAM and Rage Mode are kinda moot points at 2% to 5%, and Rage Mode fan noise will be an issue for some. OC has diminishing returns; a better average will trump a raw MHz peak.
 
Ray tracing performance won't matter for a few generations. I don't find games enjoyable at sub-100fps.
I can certainly see why you prefer 100Hz games - since I upgraded to a 95Hz display, I really do notice when games run at higher framerates. That said, as long as DXR capabilities are good enough at 1080p, I'm happy with that, no need to play at QHD.
 