XFX Speedster Zero WB Radeon RX 6900 XT

You might want to downclock your memory a bit. I'll show a screenshot when I get home, but I'm pretty sure I'm close to that on the reference cooler. Except my clocks are lower.

I can play with it a bit later; from what I remember, the score decreases with lower memory clocks.
 
Alright, I think I've reached the limit for this one, at least without going to extreme measures.

Once I loaded up MorePowerTool, I found that the stock power limit was 332 W (odd choice). Since I had already tested +15%, that meant I had already tested up to 382 W, so I just set 382 W as the new base power limit (giving me the option to go +15% over that in the drivers).
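To spell out the arithmetic behind those numbers (just a quick sketch of the percentages, nothing MorePowerTool-specific):

Code:
stock_limit = 332            # W, the stock power limit MPT reported
print(stock_limit * 1.15)    # 381.8 -> the old +15% ceiling, ~382 W
new_base = 382               # W, the value set as the new base limit
print(new_base * 1.15)       # 439.3 -> the new +15% ceiling, ~439 W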

Best results I've been able to get thus far are at:

MinFreq: 2575 MHz
MaxFreq: 2725 MHz
Voltage: 1140 mV
VRAM: 2126 MHz
Power Limit: +15% (439 W)
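(Side note for anyone who'd rather mirror these knobs on Linux than through MorePowerTool and the Radeon drivers: the amdgpu driver exposes roughly equivalent controls through sysfs. The sketch below is just that, a sketch of that interface rather than anything I actually ran; it assumes card0 is the 6900 XT, needs overdrive enabled via amdgpu.ppfeaturemask, exact ranges vary by kernel and board, Navi 21 takes a GFX voltage offset instead of an absolute 1140 mV, and the memory clock is reported differently than in the Radeon UI, so treat every number as a placeholder.)

Code:
# Rough Linux sketch (run as root): amdgpu overdrive knobs via sysfs.
import glob

DEV = "/sys/class/drm/card0/device"   # assumption: card0 is the 6900 XT

def od(cmd):
    # Each overdrive command is a single write to pp_od_clk_voltage.
    with open(f"{DEV}/pp_od_clk_voltage", "w") as f:
        f.write(cmd + "\n")

od("s 0 2575")   # minimum GPU clock, MHz
od("s 1 2725")   # maximum GPU clock, MHz
od("m 1 1063")   # max memory clock, MHz (Linux reports the real mclk, roughly half the Radeon UI value)
od("vo -25")     # GFX voltage offset in mV (placeholder; Navi 21 has no absolute-voltage knob here)
od("c")          # commit

# The power limit is a separate hwmon file, set in microwatts (382 W here).
with open(glob.glob(f"{DEV}/hwmon/hwmon*/power1_cap")[0], "w") as f:
    f.write(str(382_000_000))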


clock_2725_Volt_1140_Mem_2126.png

link


Not too shabby if I may say so myself.

I kept an eye on the power draw and it actually did draw all the way up to 437 W at one point! No wonder this room is getting warm.

I tried going up to a MaxFreq of 2750 MHz, but that resulted in crashes. I stepped the voltage all the way up to 1200 mV, but that didn't help, and I didn't really want to go above that.

So I think I've had my fill of canned benchmarks for a while. Now I need to find myself a game to play :p
 
That appears to be the all-time high for that score, congrats!

There have definitely been higher results under extreme cooling setups, but those were with different CPUs.

This is the highest on record for this combination (3960X with a 6900 XT), which is pretty neat.

For overall graphics score, it's probably somewhere around the top 100, which still isn't too shabby considering that the top 100 includes 3090s and the LN2 extreme crowd.
 
Since I missed out on today's AMD card drop, is it worth the premium to buy one of these pre-waterblocked? Doing so would be my second-priciest GPU purchase, but the last time was three separate GPUs and three waterblocks.
 

Well, I'm still working on optimizing gaming performance, but it looks like it will be promising. For what it is worth, my stock TimeSpy run with this board beat the best overclocked 6900XT results over in the 6800/6900 Overclock Thread, and it only got better when I overclocked it.
 
You have one very fast card, at times limited by CPU configuration for certain loads. Maybe we will get lucky and TR Zen 3 will be very good and a worthy upgrade at a reasonable price. Now if they put V-Cache on one or more dies for very fast gaming, that would be even better.
 
I just bought one a couple hours ago and will hold the original poster in this thread personally responsible for the card's performance.

Going to go through with some long deferred maintenance and deep cleaning when I break down the loop to add this thing in.
 

I'll keep my fingers crossed for you, but my experience is based on an n=1 type of analysis. No guarantees :p

Silicon lottery applies as always...

That said, I do think a big part of it is because of the more highly binned Navi 21 XTXH chip and the totally reworked VRM/Power stage, so I wouldn't be surprised if these cards are all quite a bit more overclockable than the reference ones.
 
Alright, so I finally got around to taking the cooler off this thing. Anyone ready for some nudez?

I simply disconnected my QDCs and did this with fluid still in the block. A little unconventional, but I wasn't planning on opening the block itself, so I figured it was fine. It did cause a few difficulties in getting a flat surface to work on, but I made it work.

Also, I don't usually use antistatic mats and wristbands, but I've learned more about ESD over time, and my old argument of "well, in my 30 years of doing this I've never killed anything" doesn't really mean anything. ESD is not all or nothing. You can zap something by touching it just enough to weaken it, and have it fail as a result years later. This is also my most expensive component to date, so I figured better safe than sorry.

PXL_20211024_025215609.jpg



PXL_20211024_025147383.jpg


So, first we have to take the massive backplate off. Seven screws is all it takes, one of them covered by one of those "Warranty void if removed" stickers.

1635113227890.png


Nice try XFX, we all know that is illegal by now.

Side note, no idea why the PCIe contacts are so scratched. I can't think of anything I might have done that would have done that. I'm guessing it must have happened at the plant?

Either way, they work just fine, so I am not concerned.

Seven screws and some gentle prying with my fingers later, and the backplate is off:

PXL_20211024_025626812.jpg

PXL_20211024_025633333.jpg


Some pretty decent putty style thermal pads covering all the hot components on the back. I'd call this an A+ on XFX's part. Doesn't really get much better than this from what I have seen.

Now we've revealed the ~16 (I think?) remaining screws that need to be removed to get the block off. So I wouldn't accidentally populate the screw holes used by the backplate during reassembly, I highlighted them in the next pic.


PXL_20211024_030031383~2.jpg



Now, for the moment we've all been waiting for. The full frontal nudez.

PXL_20211024_030914757.jpg


My two comments are as follows:

1.) GPU looks well pasted. Maybe a little excess, but not too bad. I probably wasted my time in taking this off, but at least now I know. I've seen some horror stories so I had to make sure, especially since it was running a bit warmer than I was used to with my old Pascal Titan X. Maybe that's just a 6900xt thing, or especially THIS 6900xt. It does pull a lot more power than my old Pascal Titan X did.

2.) Those VRMs DO NOT look like they were touching the thermal pads in the grooves on the block. There were no real indentations in the thermal putty/pads except for like two of them in the corner.

PXL_20211024_030920505.jpg


PXL_20211024_030925112.jpg



PXL_20211024_030937345.jpg
PXL_20211024_030942293.jpg



Alright, so I wiped down the GPU area of the block and the GPU itself with isopropyl alcohol, and reapplied Thermal Grizzly Kryonaut using the "rub it on using a nitrile glove" method. Thermal Grizzly Kryonaut is pretty thick when cold, so I heated the tube with a hair dryer before starting to make it easier.

I also rubbed some Thermal Grizzly on top of the VRMs to see if I could make them contact the thermal pads. A little unorthodox, I know, but I figured it was better than nothing. (I didn't have any good thermal pads to replace them with.)


The result?

Despite the paste job looking pretty good as it was, the GPU does run a lot cooler now.

At first I didn't think it was. On my first run-through the temps were as high as they were previously, but I guess the Thermal Grizzly Kryonaut just needed to warm up and spread out a bit, because in the second run and beyond the temps settled down nicely.

Previously, at max overclock in TimeSpy, I was hitting 56C core / 71C junction; now that is down to 46C core / 60C junction.

At stock settings it was hitting ~51-52C core (can't remember junction), so this is now cooler than it was even at stock.

I have had one weird side effect, though. I've had to back off about 50-75 MHz on the main clocks to remain stable. I'm not quite sure why. Temps are obviously better, and I don't think I damaged anything, because if you damage these things they are either dead or they aren't; you don't end up with a slightly reduced max overclock.

My best guess is that AMD's automagic boost logic is trying to reduce the voltage due to the lower temps, but reducing it a bit too much, resulting in instability.

I am going to have to tinker with the clocks and voltages to see if I can get back up to where I was.
 
Nice job. As a note, I rub it on the GPU to ensure full coverage and then add a small pea-sized dab in the middle to push out any air pockets (my theory). This method gave me very consistent temperatures from one repaste to the next. Seems like there are a hundred or more ways people do this.
 
Your 5900X is Zen 3 vs. the 3960X's Zen 2. Anything that is only lightly threaded is going to be a sure win for the 5900X, and it will also fairly comfortably maintain a decent, but not large, lead in gaming-specific tests.

*That said, the scores probably shouldn't differ as much as the screenshots here show.
I had a 2700X with my 6900 XT and got a >20,000 graphics score, then upgraded to a 5900X and maybe got 300 more graphics points. The CPU is not the problem.
 
You might want to downclock your memory a bit. I'll show a screenshot when I get home, but I'm pretty sure I'm close to that on the reference cooler. Except my clocks are lower.

Yeah, on mine, performance peaked at 2026 MHz. At 2030 MHz performance was lower. There is some board-to-board variation, but don't just assume "higher is better", especially with VRAM. Sometimes it isn't, even when it seems stable; GDDR6 error detection quietly retries failed transfers, so pushing the memory clock too far tends to cost performance rather than crash.
 
I forgot to update this thread.

Despite the better temps, I was only able to squeeze a tiny bit more out of the GPU:

23411.PNG


Link if you are into that sort of thing.

I might be able to step up the core clock by single digits, but I don't think I'll be able to squeeze much more out of it than this.

At least not without extreme cooling.

(I have been considering getting an aquarium chiller and running slightly subambient, but above dew point. Not sure if worth it.)
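(For reference, dew point is easy to estimate from room temperature and relative humidity with the Magnus approximation; quick sketch below, and the 24 C / 50% RH inputs are just example numbers, not measurements of my room.)

Code:
import math

def dew_point_c(temp_c, rel_humidity_pct):
    # Magnus approximation; accurate to a fraction of a degree at room conditions.
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

print(round(dew_point_c(24, 50), 1))  # ~12.9 C -- keep the coolant above this and nothing should condense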
 
So,

I read about this video card over at TheFPSReview.

I hadn't planned on going AMD this generation, but through some miracle of fortune this one was in stock at AIB MSRP of $1,799 over at Newegg, so after a few seconds of hesitation I bit.

I had really wanted to go with a 3090 or 3080 Ti this time around, but I wasn't willing to pay three grand for the 3090, and 3080 Ti availability was difficult to say the least.

It comes with the specially binned XTXH version of the 6900 XT's Navi 21, a totally reworked VRM/power stage optimized for water cooling, and three 8-pin power connectors, so hopefully it will make for a good overclocker. Buzz about the card suggests 3 GHz is possible. I figured if this thing can hit 3 GHz as has been suggested, then at least if you don't factor in ray tracing it might even beat a 3090.

So I bit.

And for some odd reason it is still in stock. Maybe there just aren't enough custom loop builders among the miner crowd to make this fly off the shelves like air cooled models.

It arrived today in a much larger box than expected, and I figured I'd share some details.

View attachment 401014

This thing is surprisingly heavy. Much more so than I remember my Pascal Titan X with block installed being....


(Though I haven't done a side-by-side, as it's going to stay installed until I'm done cleaning the new one.)

View attachment 401016


The backplate is metal and like 1/8" thick throughout. It's like the damn thing is armored. I'm guessing that's where most of the excessive weight comes from.

View attachment 401027

View attachment 401028

Here's hoping the motherboard PCIe slot can handle the weight...

I cannot overstate how surprisingly heavy this thing is.

Normally I like that, it makes it feel sturdy and of quality construction, but this may be a little bit much, even for me...



Sadly the RGB LED cable is not detachable from the block, so you are stuck with it whether or not you want it...

View attachment 401021

(Unless... You know... <snip> <snip>. But that probably impacts warranty and resale value. I guess I'll have to try to hide it somehow.)


I'm so disconnected from the PC case Christmas tree lighting world that I don't even know what kind of connector this is:

View attachment 401022


I don't do that Christmas tree lighting BS in my builds... At least it's towards the bottom of the card. I can probably tuck it out of the way to hide it.



The video card comes with some sort of weird plastic EK hex wrench. I'm guessing it must be for the G1/4 port caps.

View attachment 401023



Also not quite sure what all these screws are for. In case you take the block/backplate off and lose some? I'm a fan of extra screws. That shit always gets lost.


Going to have to read the manual...


View attachment 401024


Kind of curious why they didn't make it single slot. It looks as if it would have fit...


And it's not like there are any ports in the second slot area or anything. Maybe they feel they need the screws in both slots to keep this heavy thing in place?

View attachment 401025

Anyway, this thing promises to be a real bad boy. Figured you guys might enjoy some pics.

I'm going to integrate it into my existing build here.
Hi everyone,

I got my XFX 6900XT Zero WB today.
Is it normal that the GPU bracket is weirdly bent?
 

Attachments

  • IMAG0834.jpg
    IMAG0834.jpg
    299.7 KB · Views: 0
  • IMAG0835.jpg
    IMAG0835.jpg
    305.8 KB · Views: 0

It's tough to tell from your picture, but that doesn't look quite right to me.

In normal times I would probably have returned it, but with GPU supply being difficult, I might not.

This seems relatively low risk, as that is only the stamped PCIe bracket. I wonder if the screw hole on the bottom of the bracket was drilled in the wrong position, resulting in this.

I'd check if it fits in the slot, and if it works properly. If it does, then given the current GPU situation, I'd probably live with it.

If it doesn't fit, then I'd try loosening the screws that hold the bracket to the rest of the board, and see if they can be re-tightened with the bracket in a straighter position.

Of course, if the GPU doesn't fire up, then there is some actual damage, in which case you have no choice but to return it.
 
Thank you for your answer! My problem is that my PC will take another 3 or 4 weeks until it's ready. Only my external water cooling setup is waiting to crack 3 GHz :)

I will write to the shop to exchange the card.

Br
 

Attachments

  • DSC_0778.jpg
    DSC_0778.jpg
    210.2 KB · Views: 0

Just keep your expectations in check!

You are unlikely to hit 3 GHz without extreme subambient cooling, and even then it is likely a stretch.
 
The bracket is screwed on crooked; just loosen the screw and straighten it. Won't hurt the warranty or anything. And yeah, don't count on 3 GHz...
 
Turn SMT off for your Threadripper and then clock it as high as you can. That will improve your overall CPU score.

Also, keep your min/max clocks within 100 MHz of each other and crank power, VRAM, and mV to max. Yours should be 1275 mV, yeah? A Zero should be able to crank out 2950 MHz. How good is your cooling? What temp is your coolant? Rads?

What RAM do you have?
 
Hmm.

One thing I've noticed with this card is that every time I install new drivers it becomes unstable again, and I have to find new optimal settings.

Maybe this is an AMD thing?

With my Titan X I never had to change anything when I installed new drivers. My old max stable settings always just worked.

Has anyone else had this issue?
 
I've had some subtle variances from driver changes on my vanilla card, but I'm not using MPT or running at max clocks, as I need this thing to last a good long time for the chip excuses (shortage) to end.
 