Looking for feedback: 6900xt vs 3080

mrjayviper

Some info:
  • the 6900xt is the aorus waterforce edition (seems to be one of the highest-clocked cards in this series). Looking at the clocks and comparing them against the ASRock Formula OC and PowerColor Red Devil Ultimate, this Gigabyte board is most likely an XTXH
  • the 3080 is from Zotac and it seems to be one of the regular models (nothing fancy)
  • both are on sale at the same price, which is the best price nationally for their respective series
  • the 6800xt and 3070ti (not on sale) are the same price as the 2 cards above
  • The 3080ti is even more expensive than the 6900XT
  • I have no intention of mining on my GPU
I've read various reviews and I'm leaning towards the 6900xt, but these reviews used older drivers, which can make a huge difference.
Thanks for the advice.
 
What games do you play?
I don't play FPS games (or recent AAA games), only because my 980ti can't play them at ultra/max settings.

I do play Total War: Warhammer 2, which can look OK/bearable at medium settings, but at ultra settings it's quite slow. I'm guessing Warhammer 3 will be even slower.
 
What games do you want to play, besides the 2 above? What CPU do you have?

Looking at Total War: Warhammer in some recent reviews, it looks to be OK on either GPU.
 
Are you really comparing a card that's faster than a 3090 to a 3080?
Doesn't matter what game you play, the 6900XTXH will be faster.
That's 100 percent not true. Especially considering I played on these cards - 6800, 6900xt, rtx 3080, rtx 3070, rtx 3070ti, rtx 3060ti.
 
The bigger question at play here is what resolution you play at. If you are on 1080p, then either card will play all games with all settings cranked at that resolution. The 6900xt will lag behind in performance when using raytracing in most instances, because it is a first-gen raytracing card.

The higher the resolution you want to push, the more you will see the gap between the cards, and the 3080 will gain a slight edge when using raytracing features.

What CPU are you using?

My best suggestion here: if you have to purchase now, then I would go with whichever one you can get for a lower price that suits your needs. But my vote goes to the 6900xt due to its price-to-performance and performance per watt...

I am weighing this same purchase. From what I can tell though I am probably going to sit out this cycle as my 2080ti still holds its own. If I had to make a purchase right now though it would be the 6900xt. For now though I am going to see what the next round brings.

Edit: Changed the format of the post so it's clearer. P.S. Check out this Comparison Video
 
That's 100 percent not true. Especially considering I played on these cards - 6800, 6900xt, rtx 3080, rtx 3070, rtx 3070ti, rtx 3060ti.

6900XT =/= 6900XTXH

Raytracing is lackluster this gen. Maybe in a couple of years. MAYBE.
Remember PhysX?
 
If the games you play/want to play support raytracing or DLSS, the 3080 is likely the better option.

If the games you play/want to play don't use RT or DLSS, go AMD.
 
That's 100 percent not true. Especially considering I played on these cards - 6800, 6900xt, rtx 3080, rtx 3070, rtx 3070ti, rtx 3060ti.
...and I'm sure you documented your findings with exactitude. :yawn:
 
...and I'm sure you documented your findings with exactitude. :yawn:

Do you need some receipts or something? (Also, in a few games I have enough exactitude from my own testing.)

It'll heavily depend on the games a person plays. So much so that I pick the card for the games I currently play and plan on playing. I unfortunately really like Fallout 76. I also like games that tend to have heavy raytracing. Sometimes my rx 6900xt is a dream, sometimes it's a potato, and other times it's an rtx 2080ti/rtx 3070 but without dlss.


If I tended to play games like Assassin's Creed, I would without a doubt buy an rx 6xxx card.

I currently have an rx 6900xt and am trying to get an rtx 3080. So, it is, what it is.

edit

One other thing: if he has a specific game that doesn't have any benchmark reviews and I own it, I'd be glad to test it for him on my 6900xt.
 
This is a difficult one. The RTX 3080 only has 10GB of VRAM, and it pains me to say this, but there are starting to be edge cases where that isn't enough at 4K. The 6900 XT has plenty of memory, but it's slower memory, and the card seems to struggle at 4K at some points because of this. Also, ray tracing is lackluster on AMD's offerings this generation. And as much as people want to believe that AMD Super Resolution is the same as DLSS, it isn't. DLSS in some cases appears to improve image quality, while Super Resolution never improves image quality.

It's all down to what games you play. If your games use DLSS and/or ray tracing, the RTX 3080 will be the clear winner. If not, then the 6900 XT will "probably" be the better choice.
 
The 6900XT gets my vote over the 3080, mostly because of the RAM.

I've been playing Far Cry 6 with raytracing at 4K with HDR on the 6900XT -> ABSOLUTELY beautiful! RAM usage is almost up to 16gb. On my 3080Ti, not all the high-res textures get loaded, so when you go up close to some objects they are a low-res blurry mess! VRAM is basically pegged at 12gb. For true 4K games with assets taking advantage of the pixel density -> much RAM is needed. Edge case? Maybe now, but that quality will be pushed more and more as time goes on.
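
To put rough numbers on why native-4K texture work chews through VRAM, here's a quick back-of-the-envelope sketch. The texture sizes, formats, and counts are illustrative assumptions on my part, not data from Far Cry 6:

Code:
# Rough VRAM math for high-res textures (illustrative assumptions, not game data).
def texture_mib(width, height, bytes_per_texel, with_mips=True):
    base = width * height * bytes_per_texel
    # A full mip chain adds roughly 1/3 on top of the base level.
    total = base * 4 / 3 if with_mips else base
    return total / (1024 * 1024)

# BC7-compressed color data is 1 byte per texel; uncompressed RGBA8 is 4.
per_4k_bc7 = texture_mib(4096, 4096, 1)    # ~21 MiB each
per_4k_rgba8 = texture_mib(4096, 4096, 4)  # ~85 MiB each

budget_gib = 10  # e.g. a 3080's VRAM
print(f"4K BC7 texture:   {per_4k_bc7:.0f} MiB")
print(f"4K RGBA8 texture: {per_4k_rgba8:.0f} MiB")
print(f"{budget_gib} GiB fits roughly {budget_gib * 1024 / per_4k_bc7:.0f} BC7 textures, "
      f"before render targets, geometry, BVHs, etc.")

So a few hundred unique 4K textures resident at once, plus everything else a frame needs, and a 10gb card is already feeling it.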

As for ray tracing, there are only a few games I've found where it makes a good difference and is desirable to have on. 99.999% or thereabouts of PC games have no ray tracing; only the newest games have it, and only some of those benefit. I would say Nvidia DLSS is more important or useful than raytracing, but it does complement raytracing nicely.

As for AMD raytracing, if the game uses DXR 1.1 and is designed around AMD hardware (to use the Infinity Cache effectively, i.e. inline ray queries with multiple shader/compute operations folded into a single shader), it does well, much better than what we saw in Turing.

[Chart: Forza Horizon 5, 2160p Extreme preset GPU benchmark]
https://www.kitguru.net/components/...on-5-pc-performance-benchmark-30-gpus-tested/
 
I have the Zotac 3080 Holo (got it at retail for 1100) and love it. I went to NVIDIA from AMD due to the insane pricing and lack of AMD cards on their webstore. At least Best Buy has the 30XX FE versions.
I can set any title to ultra and game away. My resolution is 1440p at 144 Hz. NVIDIA has flawless FreeSync compatibility with my LG.
RT is nice but not implemented in all games, DLSS is gaining ground, plus NVIDIA will let you use AMD's open-source stuff and FidelityFX.
 
I've had both a 3080 and a 6900XT. I have a 32" 1440p 144Hz Freesync monitor. I don't play online competitive games. I played through AC: Valhalla on a mixture of both and I couldn't really tell the difference. Or more accurately, neither suffered from any noticeable problems that took away from the gaming experience.
 
3080 all the way. I gave up my 3070 when I picked up a 6900XT from the forum here, and it was a bad choice. The card is great at high-end games, but play anything a little older, like revisiting titles you played in the past, and it turns itself inside out trying to downclock to save power. Some old games I like to play are completely unplayable: the card will drop down to like 80MHz and you get 9fps in a game from 2011 that runs fine on Intel integrated graphics. Also, the AMD drivers are just plain weird; if I let my monitor go to sleep and then wake it up, all the HDMI audio devices reinstall themselves and cause all sorts of havoc. All of these things worked perfectly fine on the 3070, and in some games its performance was leagues above, simply because AMD seems incapable of determining a game is actually running. Don't even try to come at me with the DDU crap, I reinstalled Windows from scratch when I switched cards.

TLDR - unless you play benchmarks, take the 3080
 
3080 all the way. I gave up my 3070 when I picked up a 6900XT from the forum here, and it was a bad choice. The card is great at high-end games, but play anything a little older, like revisiting titles you played in the past, and it turns itself inside out trying to downclock to save power. Some old games I like to play are completely unplayable: the card will drop down to like 80MHz and you get 9fps in a game from 2011 that runs fine on Intel integrated graphics. Also, the AMD drivers are just plain weird; if I let my monitor go to sleep and then wake it up, all the HDMI audio devices reinstall themselves and cause all sorts of havoc. All of these things worked perfectly fine on the 3070, and in some games its performance was leagues above, simply because AMD seems incapable of determining a game is actually running. Don't even try to come at me with the DDU crap, I reinstalled Windows from scratch when I switched cards.

TLDR - unless you play benchmarks, take the 3080

This is what happened with many of my older games. Fallout 76 gets about 60 fps on my rx 6900xt; my previous rtx 3070ti got 90 fps in the same spot. Sure, the rx 6900xt uses half the amount of power, but who cares... in this case.

If it's triple-A, it's usually OK-ish.

I mean, hell, at 1080p my mobile gtx 1660 ti beats my 6900xt... in Fallout 76. That's straight up a driver CPU/GPU utilization issue.
 
This is what happened with many of my older games. Fallout 76 gets about 60 fps on my rx 6900xt; my previous rtx 3070ti got 90 fps in the same spot. Sure, the rx 6900xt uses half the amount of power, but who cares... in this case.

If it's triple-A, it's usually OK-ish.

I mean, hell, at 1080p my mobile gtx 1660 ti beats my 6900xt... in Fallout 76. That's straight up a driver CPU/GPU utilization issue.

I actually mostly quit playing Fallout 76 after I switched to this card. The performance at 1440p was abysmal. Another old game I like to play is Armored Warfare, but it's on CryEngine and the frame rate is a slideshow on this card, while on the 3070 it was 140+fps. Hell, my RX 560 gets better frame rates than my 6900XT in that game. But with stuff like Cyberpunk 2077 there's no contest; the 6900XT is so much faster when its clock speed actually comes up.
 
3080 all the way. I gave up my 3070 when I picked up a 6900XT from the forum here and it was a bad choice. The card is great at high end games, but play anything a little older like revisiting titles you played in the past and it turns itself inside out trying to downclock to save power. Some old games I like to play are completely unplayable at all, the card will drop down to like 80mhz and you get 9fps in a game from 2011 that runs fine on intel integrated. Also, the AMD drivers are just plain weird, like if I turn my monitor go to sleep and then wake it up all the HDMI audio devices reinstall themselves and cause all sorts of havok. All of these things worked perfectly fine on the 3070, and in some games its performance was leagues above simply because AMD seems incapable of determining a game is actually running. Don't even try to come at me with the DDU crap, I reinstalled windows from scratch when I switched cards.

TLDR - unless you play benchmarks take the 3080
Sounds like a really easy fix using Afterburner or even AMD software to adjust the clock in lower power states.
 
Sounds like a really easy fix using Afterburner or even AMD software to adjust the clock in lower power states.

It doesn't entirely work that way. Even if the clocks stay up, in some games it helps and in others it's still a potato. I went as far as running 4K VSR (to up the GPU load) and it still didn't help in the few titles I tried it in. I also tried min-limiting the GPU, and in Fallout 76 it made no difference. I have heard of it working in other titles. Ultimately, it's hit and miss from what I can tell.
 
It doesn't entirely work that way. Even if the clocks stay up, in some games it helps and in others it's still a potato. I went as far as running 4K VSR (to up the GPU load) and it still didn't help in the few titles I tried it in. I also tried min-limiting the GPU, and in Fallout 76 it made no difference. I have heard of it working in other titles. Ultimately, it's hit and miss from what I can tell.
I have not seen this. Other options are to either turn on vertical sync and force the frame rate, or use Radeon Chill, which is a better option. If you have a 144Hz monitor, use Chill to keep it between 140fps and 144fps, as an example.

If you would, list the games you have problems with, so we can try to duplicate it and check for solutions.
 
Sounds like a really easy fix using Afterburner or even AMD software to adjust the clock in lower power states.

This is a well-known problem; it's been around since Vega launched. These are just the latest and greatest cards to be affected. If you set the clock speed manually, the usage just goes down and then the vcore and memory clocks won't match up, and you end up with an even more stuttery mess. Let's take a new game for example: Amazon's New World. Despite how much of a trash game it is, it's still a good example. My 6900XT gets around 45-55fps in this game maxed out at 1440p, and it's the exact same if I upscale it to 5K, and it microstutters and lags constantly. My 3070, on the other hand, had a nice clean 120+fps no problem and ran smooth as butter.

Again, this is just games that don't force the card to run. Stuff like Warframe, Dragon Age, Mass Effect, MechWarrior 5, etc. that I like to play all become like this. But heavy games like Cyberpunk or Battlefield run extremely well. I've spent hours and hours tweaking in the Radeon software, trying different combinations of things to get it to run, but in the end I just had to give up on some games. The downclocking is so aggressive that it even causes problems with video playback sometimes; YouTube will start to skip frames when the card drops to its 20MHz idle speed. AMD just went way, way too far trying to match Nvidia's power efficiency.

AMD's own forums are filled with the same complaint - card stuck at low clock speed in XYZ game. Everything from the original Vega 56/64 to the newest 6000 series cards has had the same issues. I'm betting the reason we don't see more complaints is that a lot of people still run 1080p/60Hz with vsync on and don't really notice it that much.
 
I'll grab a few older games; it'll take a bit though.

a) I'm lazy and don't want to download and install stutterfest
b) my internet is slow :confused:

Like I said though, it highly, and I mean highly, just depends on the game. Some older games are fine; others, like the above poster says, sttutttteeerrrffeeesssttt.
 
I'll grab a few older games; it'll take a bit though.

a) I'm lazy and don't want to download and install stutterfest
b) my internet is slow :confused:

Like I said though, it highly, and I mean highly, just depends on the game. Some older games are fine; others, like the above poster says, sttutttteeerrrffeeesssttt.

Yeah, some are OK, others just don't wanna run. CryEngine games are the worst as far as I know, and this arch has had a rough time with that engine for a long time. Like I said, top-tier AAA games run so smooth, it's great for those. I play a lot of Cyberpunk and it runs like 30-40% faster than my 3070 did (RTX off). Absolutely great.

However, knowing what I know now, if I had this and a 3080 in front of me, I would take the 3080 with no hesitation, just for its ability to cover more titles.
 
This is a well-known problem; it's been around since Vega launched. These are just the latest and greatest cards to be affected. If you set the clock speed manually, the usage just goes down and then the vcore and memory clocks won't match up, and you end up with an even more stuttery mess. Let's take a new game for example: Amazon's New World. Despite how much of a trash game it is, it's still a good example. My 6900XT gets around 45-55fps in this game maxed out at 1440p, and it's the exact same if I upscale it to 5K, and it microstutters and lags constantly. My 3070, on the other hand, had a nice clean 120+fps no problem and ran smooth as butter.

Again, this is just games that don't force the card to run. Stuff like Warframe, Dragon Age, Mass Effect, MechWarrior 5, etc. that I like to play all become like this. But heavy games like Cyberpunk or Battlefield run extremely well. I've spent hours and hours tweaking in the Radeon software, trying different combinations of things to get it to run, but in the end I just had to give up on some games. The downclocking is so aggressive that it even causes problems with video playback sometimes; YouTube will start to skip frames when the card drops to its 20MHz idle speed. AMD just went way, way too far trying to match Nvidia's power efficiency.

AMD's own forums are filled with the same complaint - card stuck at low clock speed in XYZ game. Everything from the original Vega 56/64 to the newest 6000 series cards has had the same issues. I'm betting the reason we don't see more complaints is that a lot of people still run 1080p/60Hz with vsync on and don't really notice it that much.
First I've heard of it; it seems to be mostly a 5000 series issue with some models of the 5600/5700, looking at the AMD forums. In any case, they are not filled with the same complaint. As for games that don't play nice with AMD, there are those for Nvidia cards as well. None of the reviewers seem to have come across this with the 6000 series either, unless they are lying. Sounds like a setup issue, which is what many of the few complaints I found turned out to be. Then again, I rarely play older games. I did play some Return to Castle Wolfenstein last night, a rather old game; it ran like a champ on the 6900XT. I was surprised it even ran, but there were no issues, though N-Patch didn't work. I would expect that game to be limited by CPU latency with the GPU virtually idle. You can see GPUs in modern games at low resolutions sitting at 50% to 60% GPU speed because they are CPU limited, and that has nothing to do with the GPU having an issue. The 6900XT is extremely powerful, and other limitations from the CPU, latency, PCIe, etc. can come into play with older games.
 
First I've heard of it; it seems to be mostly a 5000 series issue with some models of the 5600/5700, looking at the AMD forums. In any case, they are not filled with the same complaint. As for games that don't play nice with AMD, there are those for Nvidia cards as well. None of the reviewers seem to have come across this with the 6000 series either, unless they are lying. Sounds like a setup issue, which is what many of the few complaints I found turned out to be. Then again, I rarely play older games. I did play some Return to Castle Wolfenstein last night, a rather old game; it ran like a champ on the 6900XT. I was surprised it even ran, but there were no issues, though N-Patch didn't work. I would expect that game to be limited by CPU latency with the GPU virtually idle. You can see GPUs in modern games at low resolutions sitting at 50% to 60% GPU speed because they are CPU limited, and that has nothing to do with the GPU having an issue. The 6900XT is extremely powerful, and other limitations from the CPU, latency, PCIe, etc. can come into play with older games.

You are actually seeing it; it shows up to different degrees depending on the game (I'm talking about CPU-limited gaming). As for CPU limitations, my 10700T + 1660 Ti is about as fast as my 5900X + 6900xt at 1080p. Yes, my power-limited 10700T at about 3.8GHz.

*Note: this is in a specific spot in my base, which represents a good percentage of the game.* People who play Fallout 76 know exactly what I'm talking about. (FPS swings in Fallout 76 are pretty wild, especially depending on lighting/shadows.)

At 1440p my 6900xt destroys the 1660 Ti, for obvious reasons.

I have a specific spot in my base that I stand on, so it's fairly repeatable.

Yes, Fallout 76 is very single-thread heavy and poorly optimized. That doesn't change that it's a game I play and enjoy... unfortunately. :confused:

It also doesn't change the fact that my 6900xt gets far fewer fps than my 3070ti did (60 vs. 90, i.e. the 3070ti was 50% faster). Nor does it change the fact that both the 6900xt and 3080 are CPU limited.

6900xt at about 1200MHz (full screen, with/without cool, fps limiter, etc.)
1440p (everything maxed); even min-core-limiting it, I get the same fps

[Screenshot: Fallout 76 overlay on the 6900xt, core around 1200MHz]

rtx 3070ti
core stays at 1970-2010MHz (I don't have it anymore, so I can't show the exact same picture)
[Screenshot: Fallout 76 overlay on the rtx 3070ti]

My base

*Yes, the 6900xt's upper fps swings are much higher, but it makes no difference when your mins are garbage.*
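
If anyone wants to go beyond eyeballing an overlay at a spot like that, here's a rough sketch of how I'd turn a frame-time capture into average and 1% low fps. It assumes a PresentMon-style CSV with an MsBetweenPresents column, and the file name is just a placeholder:

Code:
# Minimal sketch: average and 1% low fps from a PresentMon-style frame-time CSV.
# Assumes an "MsBetweenPresents" column; the file name is a made-up placeholder.
import csv

frame_ms = []
with open("fo76_base_spot.csv", newline="") as f:
    for row in csv.DictReader(f):
        frame_ms.append(float(row["MsBetweenPresents"]))

frame_ms.sort()  # ascending: the slowest (largest) frame times end up at the back
avg_fps = 1000 / (sum(frame_ms) / len(frame_ms))

# "1% low" here = average fps over the slowest 1% of frames.
worst = frame_ms[int(len(frame_ms) * 0.99):]
low_1pct_fps = 1000 / (sum(worst) / len(worst))

print(f"avg: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")

Captured over the same 30-60 seconds standing in the same spot on each card, numbers like that make the min-fps comparison a lot harder to argue with.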
 
This is a well-known problem; it's been around since Vega launched. These are just the latest and greatest cards to be affected. If you set the clock speed manually, the usage just goes down and then the vcore and memory clocks won't match up, and you end up with an even more stuttery mess. Let's take a new game for example: Amazon's New World. Despite how much of a trash game it is, it's still a good example. My 6900XT gets around 45-55fps in this game maxed out at 1440p, and it's the exact same if I upscale it to 5K, and it microstutters and lags constantly. My 3070, on the other hand, had a nice clean 120+fps no problem and ran smooth as butter.

Again, this is just games that don't force the card to run. Stuff like Warframe, Dragon Age, Mass Effect, MechWarrior 5, etc. that I like to play all become like this. But heavy games like Cyberpunk or Battlefield run extremely well. I've spent hours and hours tweaking in the Radeon software, trying different combinations of things to get it to run, but in the end I just had to give up on some games. The downclocking is so aggressive that it even causes problems with video playback sometimes; YouTube will start to skip frames when the card drops to its 20MHz idle speed. AMD just went way, way too far trying to match Nvidia's power efficiency.

AMD's own forums are filled with the same complaint - card stuck at low clock speed in XYZ game. Everything from the original Vega 56/64 to the newest 6000 series cards has had the same issues. I'm betting the reason we don't see more complaints is that a lot of people still run 1080p/60Hz with vsync on and don't really notice it that much.
I had a Vega 64 and never experienced anything like that. I played a fairly wide range of games and, of course, the usual video playback. I ran it on a 4K display and never had a performance issue with the card. It would consistently perform as expected (between a 1080 and a 1080 Ti).
 
I had a Vega 64 and never experienced anything like that. I played a fairly wide range of games and, of course, the usual video playback. I ran it on a 4K display and never had a performance issue with the card. It would consistently perform as expected (between a 1080 and a 1080 Ti).
AMD 290, Vega 64 LC, Vega FE (2x), 5700 XT -> 6900XT. I've also never seen this. I had issues with 1080 Ti (2x) and 1070 (2x) cards, all minor and all due to configuration, Windows, etc. Granted, there will be games that run better on one or the other, and just pointing out a few games that don't run well on one card or the other doesn't point to a huge problem with either; it points more to the game itself: not updated, conflicts with newer operating systems and subsystems such as DX, and at times drivers. Anyway, I buy new cards mostly to run new games; if I spent a significant amount of time on older games I would not have to upgrade, maybe ever.

If the ability to play old games is what's needed, then neither the 3080 nor the 6900XT makes much sense, unless one has an old favorite or two they cannot go without alongside what the new cards can do with new games. In that case, just verify which card plays that game well enough; it could be that the 6900XT plays the older game not quite as well but still fast enough, and plays the newer titles you are going to play much better than the 3080. The user/buyer has to decide.
 
You are actually seeing it; it shows up to different degrees depending on the game (I'm talking about CPU-limited gaming). As for CPU limitations, my 10700T + 1660 Ti is about as fast as my 5900X + 6900xt at 1080p. Yes, my power-limited 10700T at about 3.8GHz.

*Note: this is in a specific spot in my base, which represents a good percentage of the game.* People who play Fallout 76 know exactly what I'm talking about. (FPS swings in Fallout 76 are pretty wild, especially depending on lighting/shadows.)

At 1440p my 6900xt destroys the 1660 Ti, for obvious reasons.

I have a specific spot in my base that I stand on, so it's fairly repeatable.

Yes, Fallout 76 is very single-thread heavy and poorly optimized. That doesn't change that it's a game I play and enjoy... unfortunately. :confused:

It also doesn't change the fact that my 6900xt gets far fewer fps than my 3070ti did (60 vs. 90, i.e. the 3070ti was 50% faster). Nor does it change the fact that both the 6900xt and 3080 are CPU limited.

6900xt at about 1200MHz (full screen, with/without cool, fps limiter, etc.)
1440p (everything maxed); even min-core-limiting it, I get the same fps

[Screenshot: Fallout 76 overlay on the 6900xt, core around 1200MHz]

rtx 3070ti
core stays at 1970-2010MHz (I don't have it anymore, so I can't show the exact same picture)
[Screenshot: Fallout 76 overlay on the rtx 3070ti]

My base

*Yes, the 6900xt's upper fps swings are much higher, but it makes no difference when your mins are garbage.*
There are plenty of reviews of the 6900XT, the 3080, and the other current-generation cards, and I do not see mins being garbage. Of course, nitpicking a game or two is always, or at least most likely, possible on any card. Now, when you run out of VRAM you will hit some pretty severe hitching and minimums, and the 3080 is much more likely to hit that wall before the 6900XT.
 
So here's the deal.

If you want raw performance in traditional raster graphics, the 6900xt is king.

If you want to run titles where you will want to enable any RT whatsoever, or use upscaling like DLSS, you should get the 3080.

I happen to have a highly binned, pretty-close-to-golden-sample 6900xt on an XFX board with a custom VRM design for max performance under water, in an overkill water loop.

Things like Time Spy in raster mode come out with fantastic numbers:

[Screenshot: Time Spy result, 23411]


link

In most traditional raster based games it will beat anything Nvidia brings to the table, even the 3090 at native resolution and without RT.

At 4K Ultra in Cyberpunk 2077, native resolution, no RT, in the busy outdoor environments with max crowd sizes, I am getting 48-50fps, which is insanely good for this title.

If I turn on Ultra RT on top of that, the framerate drops to 12-14fps. In other words, unplayable.

I also tried 4K Ultra with Medium RT, and the framerate was 22-24fps. Also not very playable.

These are framerates that can't even be saved with upscaling. I mean, if DLSS were available it might be a good option, but DLSS is Nvidia-only. FSR is technically not available in Cyberpunk (though there are ways to do it anyway), and the most you are going to be able to get without highly noticeable quality impacts is about 33%.
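
To put that scaling figure in context, here's a quick sketch of what the commonly cited per-axis scale factors work out to at a 4K output. The preset-to-factor mapping is approximate and based on the publicly documented FSR/DLSS presets as I understand them, so treat it as ballpark:

Code:
# Back-of-the-envelope: internal render resolution at a 3840x2160 output
# for typical upscaler presets (per-axis scale factors are approximate).
presets = {
    "FSR Ultra Quality (~77%)": 0.77,
    "FSR/DLSS Quality (~67%)": 0.67,
    "Balanced (~58-59%)": 0.59,
    "Performance (50%)": 0.50,
}

out_w, out_h = 3840, 2160
for name, scale in presets.items():
    w, h = int(out_w * scale), int(out_h * scale)
    pixel_ratio = (w * h) / (out_w * out_h)
    print(f"{name:26s} -> {w}x{h} ({pixel_ratio:.0%} of the 4K pixel count)")

Even the quality-oriented presets only render around half the pixels of native 4K, which helps a lot, but as noted above, not enough to rescue a 12-14fps starting point at acceptable quality.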

So the brute rendering force is there in the 6900xt, but since it lacks some of Nvidia's proprietary solutions, it can't keep up if you turn on RT or want scaling.

As luck would have it, I didn't think there was a HUGE difference between RT on and off in Cyberpunk, so I am happy playing it with RT off. I do kind of wish I could have gotten my hands on a 3090 or 3080ti instead, though.

But as they say, if wishes were horses, beggars would ride.

Use that information, consider the titles you want to play, and make your decision.
 
So I haven't had any clocking issues with my AMD cards (Two different 6800XTs) on either older or newer titles. From my perspective I don't think you can go wrong here.
 
3080 all the way. I gave up my 3070 when I picked up a 6900XT from the forum here, and it was a bad choice. The card is great at high-end games, but play anything a little older, like revisiting titles you played in the past, and it turns itself inside out trying to downclock to save power. Some old games I like to play are completely unplayable: the card will drop down to like 80MHz and you get 9fps in a game from 2011 that runs fine on Intel integrated graphics. Also, the AMD drivers are just plain weird; if I let my monitor go to sleep and then wake it up, all the HDMI audio devices reinstall themselves and cause all sorts of havoc. All of these things worked perfectly fine on the 3070, and in some games its performance was leagues above, simply because AMD seems incapable of determining a game is actually running. Don't even try to come at me with the DDU crap, I reinstalled Windows from scratch when I switched cards.

TLDR - unless you play benchmarks, take the 3080

I haven't seen this in any title. I just played through 71 hours of the original Borderlands (and its expansions) at a flat-out maxed 120fps (the max my monitor can handle) without any problems. And it was released in 2009.

The GPU was clocking down based on need, but never did so beyond the level where it was maxing out performance.

I didn't have a problem at stock settings, but if you do, you can actually set a minimum clock manually in the AMD Adrenalin driver's overclocking page.

This may have been a historical driver bug or something. I don't know what to tell you. I've had my 6900xt for about 6 weeks now and used it in a variety of titles, new and old, and have never seen this problem.
 
I haven't seen this in any title. I just played through 71 hours of the original Borderlands (and its expansions) at a flat-out maxed 120fps (the max my monitor can handle) without any problems. And it was released in 2009.

The GPU was clocking down based on need, but never did so beyond the level where it was maxing out performance.

I didn't have a problem at stock settings, but if you do, you can actually set a minimum clock manually in the AMD Adrenalin driver's overclocking page.

This may have been a historical driver bug or something. I don't know what to tell you. I've had my 6900xt for about 6 weeks now and used it in a variety of titles, new and old, and have never seen this problem.

Alright, I give up. AMD is god, and god cannot do any wrong. There are definitely NOT hundreds of threads on the AMD forum about the same exact problem. It's all in my head.
 
Alright, I give up. AMD is god, and god cannot do any wrong. There are definitely NOT hundreds of threads on the AMD forum about the same exact problem. It's all in my head.
I have a 3090, 3080, 3070, 2080ti, and 6800XT, and had a second 6800XT and a 5700XT before, and haven't encountered this. I suspect there's more involved. Can you give a game example for us to test? I game more on the 6800 than anything else.
 
Alright, I give up. AMD is god, and god cannot do any wrong. There are definitely NOT hundreds of threads on the AMD forum about the same exact problem. It's all in my head.

Never said it was in your head. Just saying that I have not experienced it with my 6900xt. As with everything, YMMV, but my guess is that this is an issue that has been resolved at this point, or I would be seeing it too.
 
I have a 3090, 3080, 3070, 2080ti, and 6800XT, and had a second 6800XT and a 5700XT before, and haven't encountered this. I suspect there's more involved. Can you give a game example for us to test? I game more on the 6800 than anything else.

He gave an example, as did I.

However, I just tested this myself:

Crysis 2 Maximum Edition shows good GPU usage... till you look at the frequency: 500-700MHz. *Note: I haven't played around with settings; this is what I got at defaults.*

[Screenshot: Crysis 2 overlay showing the 6900xt core frequency]
 
He gave an example, as did I.

However, I just tested this myself:

Crysis 2 Maximum Edition shows good GPU usage... till you look at the frequency: 500-700MHz. *Note: I haven't played around with settings; this is what I got at defaults.*

[Screenshot: Crysis 2 overlay showing the 6900xt core frequency]
I'll give that a try. His examples were mostly MMOs; I don't play those. Can't judge New World or Warframe or Fallout money-grab edition.
 