5800X3D gaming review

In the end, if you're the guy who buys cutting edge and rides it for a decade (the 2500K doesn't seem to indicate that), then waiting for AM5 might be the right choice. If not, you could build a fast, solid system for not a lot of money and have a huge upgrade over your 2500K right now.
Cutting edge perhaps, but never top tier. I have no need for an i7 over an i5, for example, nor do I need 32+ GB of RAM. I work on servers/networks all day long and don't need a powerhouse system at home for the little I use it these days. My played games for the last month were Bloons TD and ToeJam & Earl. Hah.

I have a hard time getting rid of hardware that works. So much e-waste in the world. I'll probably wait till AM5 hits the masses and buy AM4 on clearance.
 
If you're old enough to remember the days of FX, I would say history repeats itself yet again.

If you're old enough to remember the days of the K6, I'd say this could turn out like the K6-III getting released 4 months before the Athlon. Adding 256K of on-die L2 cache to the K6-2 meant that the 1-2 MB of motherboard L2 cache became an L3, and the combination allowed it to match or beat a higher-clocked P3 in many applications, even including some (but not all) games where its inferior FPU had otherwise placed it at a huge disadvantage. Then the Athlon was released as a completely new architecture with a combination of higher clock speeds, higher IPC, and higher memory bandwidth. That chip just decimated everything, and Socket 7's last hurrah was quickly forgotten.
 
Makes me wonder if the 7000 series will come with 3D cache at launch or if it will be saved for a refresh.
The rumor mill has been churning out plausible tales that Ryzen 7000 with V-Cache won't be seen until 2023, well after the initial launch, due to the extra time needed to tweak the 3D cache packaging on top of the regular new-product design work. It sounds like it won't be a default feature, but not quite a refresh either; more like a higher tier of products that launches later.
(edited for clarity)
 
The rumor mill has been churning out plausible tales that Ryzen 7000 with V-Cache won't be seen until 2023, well after the initial launch, due to the extra time needed to tweak the 3D cache packaging on top of the regular new-product design work. It sounds like it won't be a default feature, but not quite a refresh either; more like a higher tier of products that launches later.
TSMC has released their 3D V-Cache design notes, and it seems the tech is currently limited to 7nm; they don't yet have it working on their smaller nodes, as the direct bonding process has alignment issues at that scale. But 2023 is a very reasonable time frame to see consumer products with it. Q3 is a big time for enterprise sales, so much of Q2's production is aimed at that critical refresh window. Q4 is huge for the consoles, so much of Q3's production timetable is allocated to those, which leaves Q4 and Q1 as the consumer chip start time. Q2 2023 would therefore be a reasonable time to see 3D V-Cache consumer 7000-series processors in the retail space.
 
If you're old enough to remember the days of the K6, I'd say this could turn out like the K6-III getting released 4 months before the Athlon. Adding 256K of on-die L2 cache to the K6-2 meant that the 1-2 MB of motherboard L2 cache became an L3, and the combination allowed it to match or beat a higher-clocked P3 in many applications, even including some (but not all) games where its inferior FPU had otherwise placed it at a huge disadvantage. Then the Athlon was released as a completely new architecture with a combination of higher clock speeds, higher IPC, and higher memory bandwidth. That chip just decimated everything, and Socket 7's last hurrah was quickly forgotten.
Good pull from the history banks, and yes, oddly similar. Actual competition brings us these competitive efforts.
 
Being that he was a pro MSI OC'er, he had access to a chip (or chips) and plenty of mobos that may or may not have had some kind of secret sauce and black magic to pull it off, but it still smells like a suicide run to me.
Especially since none of those HWBOT OCs would last 20 seconds in the real world.
In the end, no matter the means, it still qualifies as an OC.
Well, an employee at MSI with a custom motherboard, custom BIOS, and brute-forcing with LN2 is a Grand Canyon-sized leap to then declare "it's overclockable." It certainly isn't a scenario that's going to be relevant to 99.9999% of the customer base. If anything, that minimal of an OC with that extreme of measures just tells us how overclockable it _isn't_.

The same guy "overclocked" a 12900KS to 7.5 GHz with LN2. It's fun clickbait fodder but not exactly practical for real-world users. https://wccftech.com/intel-core-i9-...records-with-asrock-z690-aqua-oc-motherboard/
 
I was thinking about picking one of these up to replace my 5950X, since 90% of my usage on this PC is gaming and my 2x360mm-radiator custom water-cooling loop is having a little trouble keeping my 5950X and 3090 cool while staying under a 40°C coolant temp. But from what I've seen, the 5800X3D doesn't appear to use much less power than my 5950X, so I'm not sure it's worth it.
 
I was thinking about picking one of these up to replace my 5950X, since 90% of my usage on this PC is gaming and my 2x360mm-radiator custom water-cooling loop is having a little trouble keeping my 5950X and 3090 cool while staying under a 40°C coolant temp. But from what I've seen, the 5800X3D doesn't appear to use much less power than my 5950X, so I'm not sure it's worth it.
It won't; it may drop you 1-2 degrees, but the OC you're probably maintaining on your 5950X likely puts it pretty close, performance-wise, to the 5800X3D. You may see slightly better minimum FPS, but nothing significant at 1440p and higher.
 
As of opening, the Dallas Microcenter reported "25+" 5800X3Ds in stock. Now? 2.
Denver still showing 25+.

They also have good prices on the rest of the stack. Frankly, it doesn't look like many are biting at $450. Edit: assuming the website numbers are correct.
 
I updated my Ryzen Master, and the all-core optimization worked, but the per-core one made my system BSOD with a machine check exception regardless of what I did. Any thoughts?
Did you clear all your BIOS Curve Optimizer values before you ran the per-core optimization? I set everything but memory timings and voltage back to default in the BIOS before running the optimizer, and it's been rock solid since. Just for kicks I tried adding 25 MHz to the boost level, and it crashes constantly, so the optimizer seems to push things right to the edge of stability. The CO values it spit out were far more negative than what I had been using; for simplicity I had set -20 on all cores. The optimizer set 5 cores at -29, 2 cores at -27, 2 cores at -24, 2 cores at -20, and 1 at -19. The changes equated to a small boost to single-core and a more significant boost to all-core performance according to my testing with Cinebench R20. I then manually entered the values in the BIOS.
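Just to lay those offsets out side by side (an illustrative snippet I put together, not anything the optimizer outputs):

```python
# The per-core Curve Optimizer offsets the optimizer reported, next to the
# flat -20 I had been running. Illustrative only; core order is arbitrary.
from statistics import mean

per_core = [-29] * 5 + [-27] * 2 + [-24] * 2 + [-20] * 2 + [-19] * 1
flat = -20

print(f"per-core average: {mean(per_core):.1f}")   # -25.5
print(f"flat all-core:    {flat}")
# On average ~5.5 counts more negative than the flat setting, which lines up
# with the small single-core gain and larger all-core gain seen in Cinebench R20.
```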
 
Did you clear all your BIOS Curve Optimizer values before you ran the per-core optimization? I set everything but memory timings and voltage back to default in the BIOS before running the optimizer, and it's been rock solid since. Just for kicks I tried adding 25 MHz to the boost level, and it crashes constantly, so the optimizer seems to push things right to the edge of stability. The CO values it spit out were far more negative than what I had been using; for simplicity I had set -20 on all cores. The optimizer set 5 cores at -29, 2 cores at -27, 2 cores at -24, 2 cores at -20, and 1 at -19. The changes equated to a small boost to single-core and a more significant boost to all-core performance according to my testing with Cinebench R20.
I tried that after the first BSOD and had no joy. The only thing I changed after first loading optimized values was setting the Q-code screen to show temperature after boot. I never screwed with Curve Optimizer in the UEFI. Even with no DOCP memory profile, it wouldn't work with the settings below.
[settings screenshot]
 
I tried that after the first BSOD and had no joy. The only thing I changed after first loading optimized values was setting the Q-code screen to show temperature after boot. I never screwed with Curve Optimizer in the UEFI. Even with no DOCP memory profile, it wouldn't work with the settings below.
[settings screenshot]
What do you have in there for the actual memory sticks? I've found that many, Corsair being the worst for it, are not stable at their stock DOCP profiles, and you need to up the memory voltage a little.
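For what it's worth, the kind of tweak I mean is tiny (example profile values, not your kit's actual numbers):

```python
# Hypothetical example of the tweak: keep the rated DOCP speed and timings,
# bump DRAM voltage one small step. All values here are made up for illustration.
docp = {"speed": "DDR4-3600", "timings": "18-22-22-42", "vdimm": 1.35}
tweaked = dict(docp, vdimm=1.37)   # +0.02 V is often enough for flaky kits
print(docp["vdimm"], "V ->", tweaked["vdimm"], "V")
```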
 


Some pretty impressive gains vs the non-X3D. Sometimes even at 1440p and 4K.
 


Some pretty impressive gains vs the non-X3D. Sometimes even at 1440p and 4K.

Seems pretty game dependent. At 1440p, definitely some that benefit as it approaches a 20 FPS increase, but at 4K it seems like a "what for?" to me. At 1080p, though, a nice boost.
 
Seems pretty game dependent. At 1440p, definitely some that benefit as it approaches a 20 FPS increase, but at 4K it seems like a "what for?" to me. At 1080p, though, a nice boost.
I am more tempted than I thought I would be to replace my 3900x with it...
 


Some pretty impressive gains vs the non-X3D. Sometimes even at 1440p and 4K.

Impressive gains, especially on the lows where it really counts. I wonder what the performance impact would be with a 6900XT? Don't know, kinda tempting in any case.
 
The Cities: Skylines results in the most recent HW Unboxed vid are the most interesting... I've been wondering if V-Cache would help in all CPU-bound gaming scenarios or just high-framerate ones, and that shows some respectable gains even in the sub-60fps zone. Anyone remember Supreme Commander? Maybe the 5800X3D is finally the CPU that will run an 81x81 map endgame with the max unit cap, heh

If AMD releases a 5900X3D after all, I swear I will dump my 5900X on Marketplace and go directly to Microcenter. Cyberpunk 2077 with lots of mods is hellish on the CPU, and even going from a 5800X to a 5900X wasn't quite enough for a locked 60fps, because it needs lots of threads and really high ST performance (yeah, I know the game does like 120fps in reviews, but that's with no mods or ReShade)
 
Dallas Microcenter is sold out. But Houston as of lunch time still had some.
 
What do you have in there for the actual memory sticks? I've found that many, Corsair being the worst for it, are not stable at their stock DOCP profiles, and you need to up the memory voltage a little.
Crucial
 
Hmmm, never had that problem with Crucial, but I have only ever used their stuff in Intel systems. I would see if the BIOS still lets you do a memory overclock via autotune and work from there.
 
Hmmm, never had that problem with Crucial, but I have only ever used their stuff in Intel systems. I would see if the BIOS still lets you do a memory overclock via autotune and work from there.
DOCP has worked fine since July 3rd, 2021, but Curve Optimizer has always thrown fits.
 
The Cities: Skylines results in the most recent HW Unboxed vid are the most interesting... I've been wondering if V-Cache would help in all CPU-bound gaming scenarios or just high-framerate ones, and that shows some respectable gains even in the sub-60fps zone. Anyone remember Supreme Commander? Maybe the 5800X3D is finally the CPU that will run an 81x81 map endgame with the max unit cap, heh

If AMD releases a 5900X3D after all, I swear I will dump my 5900X on Marketplace and go directly to Microcenter. Cyberpunk 2077 with lots of mods is hellish on the CPU, and even going from a 5800X to a 5900X wasn't quite enough for a locked 60fps, because it needs lots of threads and really high ST performance (yeah, I know the game does like 120fps in reviews, but that's with no mods or ReShade)

My son and I still play Supreme Commander: FA with the FAF community mods. Last weekend it locked up at the 1hr 1min mark; it tends to do that even with the core maximizer speedup util.
 
Impressive gains, especially on the lows where it really counts. I wonder what the performance impact would be with a 6900XT? Don't know, kinda tempting in any case.
I thought that myself, as all the reviewers seem to be using a 3090. I'd like to see it paired with a lesser video card like a 6600 XT or an RTX 3060, just to see if it still offers a performance uplift at 1080p/1440p for that type of buyer. AMD's website has it in stock at $449 right now.

We have people who have been paying that much just for the baseline of newer video cards, so it's only fair to show it.
 
Just dropped in my 5800X3D today and got over 50% uplift in the Warhammer 3 campaign benchmark compared to my 5600X:
[benchmark screenshot]


The battle benchmark was much less eventful: virtually identical at 62.2 fps versus 62.7 fps on the 5800X3D. Still very pleased with that massive campaign map uplift.
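For anyone who wants the percentage math spelled out (battle numbers are from above; the campaign figures live in the screenshot, so those are placeholders):

```python
# Percent FPS uplift; battle numbers are from the post, campaign numbers
# are hypothetical placeholders standing in for the screenshot values.
def uplift(old_fps: float, new_fps: float) -> float:
    """Percent FPS gain going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

print(f"battle:   {uplift(62.2, 62.7):.1f}%")   # ~0.8%, basically a tie
print(f"campaign: {uplift(40.0, 61.0):.1f}%")   # placeholder values -> ~52%
```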
 
Just dropped in my 5800X3D today and got over 50% uplift in the Warhammer 3 campaign benchmark compared to my 5600X:
[benchmark screenshot]

The battle benchmark was much less eventful: virtually identical at 62.2 fps versus 62.7 fps on the 5800X3D. Still very pleased with that massive campaign map uplift.

I hate to rain on your parade, but the campaign map is known to be rather buggy and to have graphical issues.
 
I'm guessing the situation is that the battle is completely GPU-limited while the campaign map was CPU-limited.

I'd love to see someone put a 5800X3D through Cities: Skylines or any of the Frontier simulation titles (Jurassic World Evolution, Planet Coaster, etc.), situations where I'm definitely limited by single-thread performance in some way.
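One crude way to check which case you're in (my own sketch with made-up numbers, not anything from the benchmark): sweep the resolution and see whether fps moves.

```python
# Crude CPU-vs-GPU-bottleneck heuristic: if fps barely changes when you lower
# the resolution, the GPU was never the limit, so the CPU/engine is the wall.
def likely_cpu_bound(fps_by_resolution: dict[str, float], tolerance: float = 0.05) -> bool:
    """True if fps varies by less than `tolerance` (fractional) across resolutions."""
    values = list(fps_by_resolution.values())
    return (max(values) - min(values)) / max(values) < tolerance

battle   = {"1080p": 110.0, "1440p": 85.0, "4K": 62.5}   # scales with res -> GPU-bound
campaign = {"1080p": 62.0,  "1440p": 61.5, "4K": 60.8}   # flat -> CPU-bound

print("battle CPU-bound?  ", likely_cpu_bound(battle))    # False
print("campaign CPU-bound?", likely_cpu_bound(campaign))  # True
```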
 
I'm guessing the situation is that the battle is completely GPU-limited while the campaign map was CPU-limited.

I'd love to see someone put a 5800X3D through Cities: Skylines or any of the Frontier simulation titles (Jurassic World Evolution, Planet Coaster, etc.), situations where I'm definitely limited by single-thread performance in some way.

Grab ImKibitz's Satisfactory save and see the difference. Makes my 5800X cry for uncle. Made the 3900XT it replaced cry for mommy.
 
Grab ImKibitz's Satisfactory save and see the difference. Makes my 5800X cry for uncle. Made the 3900XT it replaced cry for mommy.
No! Don't tell me that! Satisfactory might be the only "demanding" title I actually play...

The last thing my wife needs to hear is "Baby, I think I need a new CPU..."

On another note, I'd actually like to see if this helps in SimCity 4. I remember that game getting really laggy later on, due in part to being very single-threaded. Maybe that's become a non-issue on modern CPUs.
 
No! Don't tell me that! Satisfactory might be the only "demanding" title I actually play...

The last thing my wife needs to hear is "Baby, I think I need a new CPU..."

On another note, I'd actually like to see if this helps in SimCity 4. I remember that game getting really laggy later on, due in part to being very single-threaded. Maybe that's become a non-issue on modern CPUs.

Diablo 3 in higher GRs is quite punishing as well :D.
 
Just dropped in my 5800X3D today and got over 50% uplift in the Warhammer 3 campaign benchmark compared to my 5600X:
[benchmark screenshot]

The battle benchmark was much less eventful: virtually identical at 62.2 fps versus 62.7 fps on the 5800X3D. Still very pleased with that massive campaign map uplift.

Now I'm curious to see how other "unoptimized" games perform with a 5800X3D, like ARK, BeamNG with AI, PUBG, No Man's Sky, Rust, etc. I bet MMOs like Warframe (the open-world maps) get massive uplifts too. Does anyone have any benchmarks of games like that yet? I can't find any...
 