XFX Speedster Zero WB Radeon RX 6900 XT

Zarathustra[H]

So,

I read about this video card over at TheFPSReview.

I hadn't planned on going AMD this generation, but through some miracle of fortune this one was in stock at AIB MSRP of $1,799 over at Newegg, so after a few seconds of hesitation I bit.

I had really wanted to go with a 3090 or 3080 Ti this time around, but I wasn't willing to pay three grand for the 3090, and 3080 Ti availability was spotty at best.

It comes with the specialty binned XTX version of the 6900 XT, a totally reworked VRM/power stage optimized for water cooling, and three 8-pin power connectors, so hopefully it will make for a good overclocker. Buzz about the card suggests 3 GHz is possible. I figured if this thing really can hit 3 GHz, it might even beat a 3090, at least if you don't factor in ray tracing.
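
For a rough sanity check on that hunch, here's a back-of-envelope sketch. The 2250 MHz reference boost figure and the linear-scaling assumption are mine, not anything official:

```python
# Back-of-envelope: how much a 3 GHz core clock could buy over a
# reference 6900 XT, assuming roughly linear scaling with core clock
# (optimistic; memory bandwidth and power limits usually cap real gains).

REFERENCE_BOOST_MHZ = 2250  # approximate reference 6900 XT boost clock
TARGET_MHZ = 3000           # the rumored ceiling for this XTX bin on water

scaling = TARGET_MHZ / REFERENCE_BOOST_MHZ
print(f"Ideal linear uplift: {scaling:.2f}x (~{(scaling - 1) * 100:.0f}%)")
# ~1.33x on paper; expect well under that in practice.
```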

So I bit.

And for some odd reason it is still in stock. Maybe there just aren't enough custom loop builders among the miner crowd to make this fly off the shelves like the air-cooled models.

It arrived today in a much larger box than expected, and I figured I'd share some details.

PXL_20211006_215411504.jpg


This thing is surprisingly heavy. Much more so than I remember my Pascal Titan X being with its block installed...


(Though I haven't done a side-by-side, as the old card is staying installed until I'm done cleaning the new one.)

PXL_20211006_220106551~2.jpg



The backplate is metal and about 1/8" thick throughout. It's like the damn thing is armored. I'm guessing that's where most of the excess weight comes from.

PXL_20211006_220516535~2.jpg


PXL_20211006_220658372.jpg


Here's hoping the motherboard PCIe slot can handle the weight...

I cannot overstate how surprisingly heavy this thing is.

Normally I like that; it makes a card feel sturdy and well built. But this may be a little bit much, even for me...



Sadly the RGB LED cable is not detachable from the block, so you are stuck with it whether or not you want it...

PXL_20211006_220946004~2.jpg


(Unless... You know... <snip> <snip>. But that probably impacts warranty and resale value. I guess I'll have to try to hide it somehow.)


I'm so disconnected from the PC case Christmas tree lighting world that I don't even know what kind of connector this is:

PXL_20211006_221230595~2.jpg



I don't do that Christmas tree lighting BS in my builds... At least it's towards the bottom of the card. I can probably tuck it out of the way to hide it.



The video card comes with some sort of weird plastic EK hex wrench. I'm guessing it must be for the G1/4 port caps.

PXL_20211006_221706104~2.jpg




Also not quite sure what all these screws are for. In case you take the block/backplate off and lose some? I'm a fan of extra screws. That shit always gets lost.


Going to have to read the manual...


PXL_20211006_222552374.jpg



Kind of curious why they didn't make it single slot. It looks as if it would have fit...


And it's not like there are any ports in the second slot area or anything. Maybe they feel they need the screws in both slots to keep this heavy thing in place?

PXL_20211006_222019462~2.jpg


Anyway, this thing promises to be a real bad boy. Figured you guys might enjoy some pics.

I'm going to integrate it into my existing build here.
 
Last edited:
Nice card. I just bought their MERC 319 6900 XT that they had in stock tonight. They must have gotten some stock in, and no one has bought them out yet. They were out of basically everything else when I checked before I bought it. Well, at least I finally got my upgrade.
 
The huge port area is interesting; I guess it's just there to put their logo and such on, since the regular EK blocks just say Radeon there.
 
These cards have great performance per watt, and a water block really keeps them at manageable temps for some amazing throughput. Undervolted and overclocked they can really get the job done at high resolution, and I just don't care about RT at this point in time. My 6900 runs my Pimax 5K Plus at 90 Hz with an average frame time of about 6 ms, so bye-bye stutters and dropped frames, hope to never see you again lol.
 
That connector is the 5V ARGB LED connector. Most modern motherboards have a header for it. I'm not into Christmas tree/Rainbow Brite LED nonsense either, but occasionally I will set it up to reflect the temperature of the card or something like that (or just turn it off via software).

Nice card...enjoy!
 
Nice find. Once things are in place and sorted out, I suggest checking out MPT to see what the card can do. Under water you should see some nice gains.

Just saw your other thread. I'd link to MorePowerTool but not sure if that's kosher.
 
I have the MERC 319 AMD Radeon RX 6900 XT Ultra and the thing is a beast; not only is it very powerful, it is built with a lot of metal. So much that I got a brace out of concern about too much pressure being put on the PCIe slot. Your card is half the thickness of mine, judging by those pics.

IMG_2555.jpg
 
Alright. I got the thing installed. Took me way longer than anticipated for reasons I will detail in my build thread shortly.

I also updated my BIOS to enable Resizable BAR.
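
For anyone who wants to verify the setting actually took, here's a rough sketch of how I'd check from my Linux install by parsing lspci output (on Windows, Radeon Software reports it as Smart Access Memory). The regex is tuned to the usual lspci line format, so treat it as a starting point:

```python
# Rough check (Linux side) that Resizable BAR took effect: with ReBAR
# on, the GPU's main memory BAR should span the full 16 GB of VRAM
# rather than the usual 256 MB window.

import re
import subprocess

lspci = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

# lspci separates devices with blank lines; find the AMD VGA controller.
for block in lspci.split("\n\n"):
    if "VGA compatible controller" in block and "AMD" in block:
        for m in re.finditer(r"Region \d+: Memory at \S+ \(.*?\) \[size=(\S+)\]", block):
            print("BAR size:", m.group(1))
# Expect a 16G region when Resizable BAR is active.
```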

I'm not much of a fan of canned benchmarks, but it seems 3DMark's Time Spy is popular, so I did an out-of-the-box run to gather baseline data for comparison as I start overclocking. This is what it looked like.

timespy_baseline.png


This is all default settings, Resizable BAR enabled in BIOS, and water cooling set to keep GPU temps below 40C.
Nothing to write home about, but the interesting numbers will come after tweaking.

The CPU numbers seem a little low, but I have read that 3DMark doesn't like SMT on the Threadripper for some reason, so that might be it.

Also, I've heard some praise for AMD's drivers lately. I'm not sure I agree. They seem stable and all from my limited use thus far, but Jesus Christ is there a crap ton of bloat. I don't want overlays, web browsers, and streaming shit as part of my GPU drivers... I wish they had a more basic driver option. I'd definitely install that instead. I really don't like bloat.

They also seem to have fewer rendering settings to choose from than Nvidia's drivers do, but maybe there is an advanced mode or something I haven't found yet.
 
Last edited:
Also, I've heard some praise for AMD's drivers lately. I'm not sure I agree. They seem stable and all from my limited use thus far, but Jesus Christ is there a crap ton of bloat. I don't want overlays, web browsers, and streaming shit as part of my GPU drivers... I wish they had a more basic driver option. I'd definitely install that instead. I really don't like bloat.
They do, though. It's in the installer itself, on one of the first pages. You can select what type of install you want from three options: minimal (no control panel at all), basic (no streaming and some other tabs, but sadly no OC panel either), and full-featured.
(reworded, since I don't remember the exact labels)
 
They do, though. It's in the installer itself, on one of the first pages. You can select what type of install you want from three options: minimal (no control panel at all), basic (no streaming and some other tabs, but sadly no OC panel either), and full-featured.
(reworded, since I don't remember the exact labels)

Hmm. I missed that option during install.

I think, given those options, I'd probably have to stick with the full install either way.

Would be nice if they had an install that had drivers, game profiles and overclocking features only.
 
Not a big deal to me. I only have one DP screen. The others are my side monitors, and those are DVI/VGA only so I have to use adapters either way :p
Those adapters are hit or miss for me. I'm pushing three 32" monitors and am not looking to add to the headache. Everything is DP now, and I have no idea what 2x DP and 1x HDMI would do. That said, I'm seriously considering pulling the trigger on the ASRock 6900 XT or this Speedster Zero, as my system has a custom loop. Keep us updated!
 
Those adapters are hit or miss for me. I'm pushing three 32" monitors and am not looking to add to the headache. Everything is DP now, and I have no idea what 2x DP and 1x HDMI would do. That said, I'm seriously considering pulling the trigger on the ASRock 6900 XT or this Speedster Zero, as my system has a custom loop. Keep us updated!

I'm thinking it's an "all DP except one in case you want to hook up a TV" design philosophy.

But I hear what you are saying. The Pascal Titan I just replaced had 3 DP, 1 HDMI and one DVI for a total of 5 ports. This one only has three.

Maybe they were planning on making it single slot, but expanded it to dual slot at the last minute? Otherwise it seems like a weird place to skimp.
 
AMD's naming scheme has been really odd this time around. There's: 1) the 6900 XT (with an XT chip); 2) the 6900 XTX (which sometimes seems binned and sometimes not?); 3) the 6900 XTXH; and 4) the supposed 6900 XTXH with 18 Gbps memory.

At least some of those designations work with XFX's name. Not happy about XFX's roughly $500 markup on the already overpriced (based on MSRP) 6900 series. But this does look like the card to go for if you want water cooling.

EDIT to add: I would bet that making it dual slot is easier from a fitment perspective and especially from the perspective of managing the card's weight. You're already concerned about it bending your PCIe slot, and that's with two case slots and screws taking some of the weight off the board. Think how much more it could bend or sag with just one case slot and screw.
 
Last edited:
AMD's naming scheme has been really odd this time around. There's: 1) the 6900 XT (with an XT chip); 2) the 6900 XTX (which sometimes seems binned and sometimes not?); 3) the 6900 XTXH; and 4) the supposed 6900 XTXH with 18 Gbps memory.

At least some of those designations work with XFX's name. Not happy about XFX's roughly $500 markup on the already overpriced (based on MSRP) 6900 series. But this does look like the card to go for if you want water cooling.

EDIT to add: I would bet that making it dual slot is easier from a fitment perspective and especially from the perspective of managing the card's weight. You're already concerned about it bending your PCIe slot, and that's with two case slots and screws taking some of the weight off the board. Think how much more it could bend or sag with just one case slot and screw.

Yeah, that was my original guess: that they felt they needed it for the extra support so the slot wouldn't be damaged.

Luckily the inlet and outlet tubing also provide some support.
 
Alright. I got the thing installed. Took me way longer than anticipated for reasons I will detail in my build thread shortly.

I also updated my BIOS to enable Resizable BAR.

I'm not much of a fan of canned benchmarks, but it seems 3DMark's Time Spy is popular, so I did an out-of-the-box run to gather baseline data for comparison as I start overclocking. This is what it looked like.

View attachment 402128

This is all default settings, Resizable BAR enabled in BIOS, and water cooling set to keep GPU temps below 40C.
Nothing to write home about, but the interesting numbers will come after tweaking.

The CPU numbers seem a little low, but I have read that 3DMark doesn't like SMT on the Threadripper for some reason, so that might be it.

Also, I've heard some praise for AMD's drivers lately. I'm not sure I agree. They seem stable and all from my limited use thus far, but Jesus Christ is there a crap ton of bloat. I don't want overlays, web browsers, and streaming shit as part of my GPU drivers... I wish they had a more basic driver option. I'd definitely install that instead. I really don't like bloat.

They also seem to have fewer rendering settings to choose from than Nvidia's drivers do, but maybe there is an advanced mode or something I haven't found yet.
You seem to be off on the GPU score by a good amount. This is my score for reference, running the card at stock and default settings in 3DMark.
 
You seem to be off on the GPU score by a good amount. This is my score for reference, running the card at stock and default settings in 3DMark.

Hmm. The 3DMark interface was telling me I was above average for my combination (3960X with 6900 XT).

I wonder if this is the SMT issue rearing its ugly head.

I may have to test that to make sure.
 
Hmm. The 3DMark interface was telling me I was above average for my combination (3960X with 6900 XT).

I wonder if this is the SMT issue rearing its ugly head.

I may have to test that to make sure.
You should be scoring at least the same if not higher than me with that XFX water-cooled card.
 
You should be scoring at least the same if not higher than me with that XFX water-cooled card.
Your 5900X is Zen 3 vs. the 3960X's Zen 2. Anything that is only lightly threaded is going to be a sure win for the 5900X, and it will also fairly comfortably maintain a decent, but not large, lead in gaming-specific tests.

*That said, the scores probably shouldn't be as different as what is shown in the screenshots here.
 
You seem to be off on the GPU score by a good amount. This is my score for reference, running the card at stock and default settings in 3DMark.
Hmm. The 3DMark interface was telling me I was above average for my combination (3960X with 6900 XT).

I wonder if this is the SMT issue rearing its ugly head.

I may have to test that to make sure.
You should be scoring at least the same if not higher than me with that XFX water-cooled card.
Your 5900X is Zen 3 vs. the 3960X's Zen 2. Anything that is only lightly threaded is going to be a sure win for the 5900X, and it will also fairly comfortably maintain a decent, but not large, lead in gaming-specific tests.

*That said, the scores probably shouldn't be as different as what is shown in the screenshots here.

I have a theory as to what may have caused this.

When I started 3DMark today, it prompted me to do SEVERAL updates in a row.

This never popped up last night, and I didn't even think I would need updates, since I installed from Steam and assumed Steam would install the latest version.

I'm testing that theory now, but something is already very different. The fans are spinning much faster this run than they did last night...

Edit: Nope, that wasn't it. Same results as last night.

Going to try to disable SMT and see if that does anything.
 
I'm thinking its an "all DP except one in case you want to hook up a TV" design philosophy.

But I hear what you are saying. The Pascal Titan I just replaced had 3 DP, 1 HDMI and one DVI for a total of 5 ports. This one only has three.

Maybe they were planning on making it single slot, but in the last minute expanded it to dual slot? Othwerwise it seems like a weird place to skimp.
Out of curiosity did you have issues with the adapters?

This thing is still in stock, and I cannot find any clear info on whether running two monitors on DP and one on HDMI is going to cause me any issues with Eyefinity. Running 7680x1440 really grows on ya... lol.

In normal times I would roll the bones, and if it didn't work I'd just grab an NV card.
 
I have a theory as to what may have caused this.

When I started 3DMark today, it prompted me to do SEVERAL updates in a row.

This never popped up last night, and I didn't even think I would need updates, since I installed from Steam and assumed Steam would install the latest version.

I'm testing that theory now, but something is already very different. The fans are spinning much faster this run than they did last night...

Edit: Nope, that wasn't it. Same results as last night.

Going to try to disable SMT and see if that does anything.

Well that sure made a difference!

Here it is, all stock settings under water, with SMT disabled.

timespy_no_SMT.png


A quick review of the overclock thread suggests this is beating some folks' overclocked results...

I guess the SMT issue is REAL.
 
I have a theory as to what may have caused this.

When I started 3DMark today, it prompted me to do SEVERAL updates in a row.

This never popped up last night, and I didn't even think I would need updates, since I installed from Steam and assumed Steam would install the latest version.

I'm testing that theory now, but something is already very different. The fans are spinning much faster this run than they did last night...

Edit: Nope, that wasn't it. Same results as last night.

Going to try to disable SMT and see if that does anything.
Disabling SMT doesn't always benefit games.

**haha, looks like 3DMark is a situation where it does benefit!
 
Out of curiosity did you have issues with the adapters?

This thing is still in stock, and I cannot find any clear info on whether running two monitors on DP and one on HDMI is going to cause me any issues with Eyefinity. Running 7680x1440 really grows on ya... lol.

In normal times I would roll the bones, and if it didn't work I'd just grab an NV card.

So, I don't use Eyefinity.

I have an old PLP setup; the screens on the side are just side screens, not for gaming. And they are pretty old, running at 1600x1200 over DVI.

But with the random old adapters I fished out of my junk cable drawer, both (one HDMI to DVI, and one DP to DVI) work perfectly.
 
Disabling SMT doesn't always benefit games.

**haha, looks like 3DMark is a situation where it does benefit!

Yeah, so that's the thing.

Now that I have figured out that it makes an ENORMOUS difference in Time Spy, should I keep it off all the time, just in case it benefits games as well? I'll have to think about that. My Windows is dedicated to gaming; I use Linux for my day-to-day.

I vaguely remember reading that my motherboard supports being used as a boot selector, and can apply different profiles to different boots. Maybe I can just switch from GRUB to the BIOS boot menu, and set a profile that disables SMT for Windows only...

Decisions.
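
Side note: on the Linux side, SMT can be flipped at runtime through sysfs, no BIOS profile needed, so the profile dance would only be for Windows. A minimal sketch using the standard kernel control file (run as root):

```python
# Toggle SMT at runtime on Linux via the kernel's sysfs control
# (needs root). Valid values include "on" and "off"; "forceoff"
# keeps it off and locked until reboot. Windows has no equivalent
# runtime knob, so the BIOS-profile idea still applies there.

SMT_CONTROL = "/sys/devices/system/cpu/smt/control"

def smt_status() -> str:
    with open(SMT_CONTROL) as f:
        return f.read().strip()

def set_smt(state: str) -> None:
    if state not in ("on", "off", "forceoff"):
        raise ValueError(state)
    with open(SMT_CONTROL, "w") as f:
        f.write(state)

print("SMT is currently:", smt_status())
# set_smt("off")  # uncomment (as root) to park the sibling threads
```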
 
Also,

The GPU is running warmer than expected.

My old Pascal Titan X would stay at 37-38C at the settings I am running now. This beast goes up and hovers around 50C to 51C.

This is partially to be expected; the Pascal Titan was a 250W TDP card, while this thing is at least a 300W design, maybe more. (The XFX website specs do not say how far above reference TDP it is...)

That said, I wonder if they did a shitty job applying paste at the factory. I *sigh* may have to take it out, pull the water block off, and examine it...
 
Yeah, so that's the thing.

Now that I have figured out that it makes an ENORMOUS difference in Time Spy, should I keep it off all the time, just in case it benefits games as well? I'll have to think about that. My Windows is dedicated to gaming; I use Linux for my day-to-day.

I vaguely remember reading that my motherboard supports being used as a boot selector, and can apply different profiles to different boots. Maybe I can just switch from GRUB to the BIOS boot menu, and set a profile that disables SMT for Windows only...

Decisions.
Gamers Nexus did some Zen 2 tests with SMT disabled. They weren't Threadrippers, but it wasn't predictable which games would benefit.


I think you will either need to test game by game, or just be happy choosing one or the other.
 
So, I did another run while looking at the metrics in the Radeon software.

Based on the Radeon software's measured wattage during a Time Spy run, the out-of-the-box power limit seems to be set to 335W. That's the max power draw I saw throughout the benchmark run, and it was fairly consistent, rarely dipping below 330W.

I moved the power slider all the way to the right to add my 15% and re-ran the benchmark. As would be mathematically expected, this resulted in a max power draw of 385W, though it was less consistent, bouncing up and down more. That suggests the card was constantly hitting the 335W limit before, but only hits the 385W limit some of the time.
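
For what it's worth, the slider math checks out. A trivial sketch (the 335W base is just what the Radeon overlay reported, not an official spec):

```python
# Sanity check on the slider math: percentages over the observed 335W base.
BASE_LIMIT_W = 335          # per the Radeon overlay, not an official spec
for pct in (0, 5, 10, 15):  # the stock slider tops out at +15% on this card
    print(f"+{pct:>2}% -> {BASE_LIMIT_W * (1 + pct / 100):.0f}W")
# +15% -> 385W, matching the max board power seen during the run.
```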

Increasing the power (but touching nothing else) yielded slightly higher numbers:

timespy_no_SMT_max_power.png


It also resulted in a 3C increase in core temps to a max of 54C.

I do want to continue tweaking, but I'm not 100% sure what my next best bet is. Maybe go for a higher core clock, while also trying to reduce voltage a little? And then getting MorePowerTool to override the power limit maybe? I'll have to think about it.
 
So, I did another run while looking at the metrics in the Radeon software.

Based on the Radeon software's measured wattage during a Time Spy run, the out-of-the-box power limit seems to be set to 335W. That's the max power draw I saw throughout the benchmark run, and it was fairly consistent, rarely dipping below 330W.

I moved the power slider all the way to the right to add my 15% and re-ran the benchmark. As would be mathematically expected, this resulted in a max power draw of 385W, though it was less consistent, bouncing up and down more. That suggests the card was constantly hitting the 335W limit before, but only hits the 385W limit some of the time.

Increasing the power (but touching nothing else) yielded slightly higher numbers:

View attachment 402386

It also resulted in a 3C increase in core temps to a max of 54C.

I do want to continue tweaking, but I'm not 100% sure what my next best bet is. Maybe go for a higher core clock, while also trying to reduce voltage a little? And then getting MorePowerTool to override the power limit maybe? I'll have to think about it.
That is a really awesome score!
 
So, I did another run while looking at the metrics in the Radeon software.

Based on the Radeon software's measured wattage during a Time Spy run, the out-of-the-box power limit seems to be set to 335W. That's the max power draw I saw throughout the benchmark run, and it was fairly consistent, rarely dipping below 330W.

I moved the power slider all the way to the right to add my 15% and re-ran the benchmark. As would be mathematically expected, this resulted in a max power draw of 385W, though it was less consistent, bouncing up and down more. That suggests the card was constantly hitting the 335W limit before, but only hits the 385W limit some of the time.

Increasing the power (but touching nothing else) yielded slightly higher numbers:

View attachment 402386

It also resulted in a 3C increase in core temps to a max of 54C.

I do want to continue tweaking, but I'm not 100% sure what my next best bet is. Maybe go for a higher core clock, while also trying to reduce voltage a little? And then getting MorePowerTool to override the power limit maybe? I'll have to think about it.

Increase memory speed. You can most likely max it out at 2150 and have no issues.

Also, your Pascal likely didn't report junction temps (I'm assuming your 50-60C temps are junction?). For reference, my 6900 XT on water hits 55-60C junction temps at stock clocks.
 
Increase memory speed. You can most likely max it out at 2150 and have no issues.

Also, your Pascal likely didn't report junction temps (I'm assuming your 50-60C temps are junction?). For reference, my 6900 XT on water hits 55-60C junction temps at stock clocks.

I guess they have been listening to people's complaints, because the slider on this one goes all the way up to 3000.

I doubt I'll be able to hit that, but I'm going to see how far I can go.
 
Increase memory speed. You can most likely max it out at 2150 and have no issues.

Also, your Pascal likely didn't report junction temps (I'm assuming your 50-60C temps are junction?). For reference, my 6900 XT on water hits 55-60C junction temps at stock clocks.
I don't think Nvidia started reporting junction temps till the RTX series.

Random aside, related but not related: what kind of scores do you get on your reference card overclocked (under water)?
 
I guess they have been listening to people's complaints, because the slider on this one goes all the way up to 3000.

I doubt I'll be able to hit that, but I'm going to see how far I can go.

Well, the slider may go to 3000, but that may not be practically significant.

2100 was excellent, but 2200 had some mild artifacting and a dramatically lower score. 2175 had no artifacting, but the score was still much reduced. Testing 2150 now.
 
Did some more playing around with settings, but admittedly I don't really know what I'm doing. Here is the best I came up with:

Min Freq: 2450
Max Freq: 2650
Voltage: 1130 (anything lower would crash the driver or 3DMark or both)
RAM: 2125
Power Limit: +15% (which results in 385W)

Unlike other boards, the RAM slider on this one goes up to 3000. Doesn't help much, though. At 2200 I get stuttering, some mild artifacts, and a SIGNIFICANTLY reduced score. At 2175 the stutter and artifacts are gone, but the score is still very bad. 2150 scores very slightly below 2100. The sweet spot for me seems to be 2125 on the RAM.

Not sure if the score losses are because the RAM misbehaves when pushed too hard, or because it takes power away from the core due to the power limit.
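
If I keep doing these sweeps I'll probably script them. A rough harness of what that would look like (set_memory_clock and run_timespy are hypothetical placeholders, not real APIs; they'd have to be wired up to whatever tooling exists, or just left as manual steps):

```python
# Sketch of the memory-clock sweep done by hand above. Both helpers are
# hypothetical placeholders; neither Radeon Software nor 3DMark exposes
# an official scripting API for this.

def set_memory_clock(mhz: int) -> None:
    raise NotImplementedError("hypothetical: apply the clock via your OC tool")

def run_timespy() -> int:
    raise NotImplementedError("hypothetical: run Time Spy, return GPU score")

results: dict[int, int] = {}
for mem_mhz in (2100, 2125, 2150, 2175, 2200):
    set_memory_clock(mem_mhz)
    results[mem_mhz] = run_timespy()

best = max(results, key=results.get)
print(f"Best memory clock: {best} MHz (GPU score {results[best]})")
# From my manual runs, 2125 was the sweet spot; 2150 scored slightly
# below 2100, and 2175+ tanked the score (with artifacts at 2200).
```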

Here is my best score thus far:

clock_2650_Volt_1130_Mem_2125.png


I'm obviously happy with this, but it would be fun to push the graphics score into the 23,000s.

That's all I have time for tonight. I'm considering playing around with MorePowerTool, but since I'm already at 385W, I'm not sure how much higher it is wise to go. I still have plenty of thermal headroom, though. Core temp was 52C in my latest run, with junction temp at around 67C.

Again, I'd appreciate any suggestions.
 
I don't think Nvidia started reporting junction temps till the RTX series.

Random aside, related but not related: what kind of scores do you get on your reference card overclocked (under water)?
6900xt timespy.png


That's 2475 min, 2575 max, 2150 memory.

Just ran Fire Strike and got 39.9k overall, 59.8k graphics.

Looks like the XFX above performs strongly!
 
Last edited:
View attachment 402454

That's 2475 min, 2575 max, 2150 memory.

Just ran Fire Strike and got 39.9k overall, 59.8k graphics.

Looks like the XFX above performs strongly!

You might want to downclock your memory a bit. I'll show a screenshot when I get home, but I'm pretty sure I'm close to that on the reference cooler, except my clocks are lower.
 