Offsets, Curves, And Overclocking, Oh My!

ShuttleLuv

Supreme [H]ardness
Joined
Apr 12, 2003
Messages
7,295
So what do you guys prefer? Offsets and curves, or dialing in settings like the old days? I am old school, I prefer the detailed dialing in, but I completely understand the curves and offsets. I think it makes a lot of sense for newbies and for bringing overclocking to the mainstream. It gives a certain satisfaction that's "good enough" for most entry-level overclockers. The only thing is, I would hate to see things dumbed down so badly that overclocking becomes completely irrelevant. I feel we may be headed in that direction, unfortunately. :(

Do you guys think it's the same level of fun, more fun, or just worse?
 
Good question.
I definitely felt more in control during the S370, K8 and C2D days when it came to overclocking.
This time around I had an almost 10 year long hiatus, and I'm basically a child in the fog with my Ryzen and Gigabyte board. PBO, XFR, BCLK, TMA*, man!

Like, imagine my bewilderment when I undervolted the Ryzen to some stupidly low voltage, got all excited when it booted and Primed for an hour, only to realize it was silently skipping clocks. I was not prepared.
It doesn't help that my board is low-end and really shouldn't overclock at all, but due to some weird BIOS quirk you can even get it to raise the multiplier.

My opinion on this? It's better, fo-sho :D I always got excited at any new BIOS option in the older days, so now the board is my oyster and I'll be studying it for a long time before I know what's up. The replay value is there :D


* TMA - Too many acronyms :p
 
I prefer old school, but I'm not sure how practical that is with today's thermally dense CPUs and those wonderful AVX instructions that give them split personalities.

If you do a manual overclock, you have to either leave a lot of performance on the table or accept that the chip is going to crash, thermally throttle, or degrade if you run something very heavy.

That being said, I really liked the behavior of the Ryzen 5800X I set up for someone. It would boost single core loads to [email protected]. All-core loads were almost universally [email protected], and Prime 95 would dip down to [email protected]. It seemed like it was capable of getting every ounce of performance out of the processor by adjusting on the fly, but that it was just a little detuned. I played with some manual overclocking (as much as I could on a machine that wasn't mine ;)), and I found that those voltages were about 0.05-0.1V higher than they had to be for those speeds and loads. The motherboard was a humble ASUS Prime B550 Plus, so I would imagine that on a better motherboard those voltages could come down even more (with correspondingly higher target clock speeds).

The problem is that the boost algorithm is never one-size-fits-all. If you have better than normal cooling, or you have a better than normal motherboard, or you have a better-than-normal CPU sample, chances are that you can pick up 50-200MHz across the board by playing with the curve. And that's on the 5800X, which is almost maxed out on the stock TDP. The 5600X, 5900X, and 5950X are all much more constrained by their respective TDPs, and there is much more performance left on the table with those chips, especially for all-core loads on the latter two. This is where choosing the right combination of PPT/TDC/EDC can be something of a "black art." I've followed threads where people spent weeks trying to find the perfect combination of variables. I don't need a CPU upgrade right now, but I want to buy one just to "take the challenge." It reminds me a little of setting up a custom BIOS on a Kepler or Maxwell card, and that's something that I've enjoyed doing, even though I was still making tiny changes for years before I finally got it perfect...
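Since each PPT/TDC/EDC combination is just one point in a small search space, the "black art" can at least be organized as a brute-force sweep. A toy sketch in Python - `run_benchmark` is a hypothetical stand-in for the real cycle of changing BIOS limits and re-running a benchmark, and the candidate values below are made up for illustration:

```python
from itertools import product

def best_power_limits(run_benchmark, ppt_opts, tdc_opts, edc_opts):
    """Sweep every PPT/TDC/EDC combination and keep the best scorer.

    `run_benchmark(ppt, tdc, edc)` stands in for a full BIOS-change +
    benchmark cycle and returns a score (higher is better).
    """
    best_score, best_combo = float("-inf"), None
    for ppt, tdc, edc in product(ppt_opts, tdc_opts, edc_opts):
        score = run_benchmark(ppt, tdc, edc)
        if score > best_score:
            best_score, best_combo = score, (ppt, tdc, edc)
    return best_combo, best_score
```

Of course, in practice every "call" is a BIOS change, a reboot, and a stress run, which is exactly why people spend weeks on this.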
 
Last edited:
I prefer old school over the modern "Curve, Offsets". With pretty much every rig I had, from AMD's Athlon to my last 7700K, I could dial it in with ease and get the max stable clocks. My 7700K was able to get 5.0GHz right out of the starting gate at 1.32V, rock-steady stable, and I never had to delid the chip.

Upgrading to the Ryzen 5800X, it was quite the challenge to even get a stable clock of 4.7GHz. No matter what I tried, I was rewarded with blue screens and reboots using the "Curve" or that horrible Ryzen Master proggy. I was about to give this POS to my son and get the 7700K back. Decided to go old school via BIOS (multi + voltage) and I managed to get 4.85GHz with 1.34V and RAM at 1866. Even with all the work, I'm not happy with this new AMD rig. I feel it was a waste of money. I'm gearing up to go back to Intel.
 

All of that stock boosting behavior is going to degrade these chips over time. I've already seen a 5800X lose 50MHz in 5 months with no overclocking at all, except for the 2 brief tests I ran at 1.30V fixed. When it was new, it could loop Cinebench at 4.70. 5 months later, it can only do 4.65. Of course, the stock all-core speed/voltage is only 4.45 at 1.35V, so the user would never notice until it got a whole lot worse... how long is the warranty again?
 
Well, that's troubling.
Did it become unstable and you had to lower something manually?
Or do you mean the automatic boost it chooses for itself?
I'm asking, because my noobish preliminary messing with my Ryzen shows a strict and synchronous relation of temperature and clock.
Pic attached, prime95 'hottest' test.
 

Attachments

  • temp vs clock.jpg (264.6 KB)

Automatic boost still works the same way it always did. There is no instability at default settings.

The problem is that when I first built it, I checked to see what kind of overclocking headroom the chip had. Sometimes I throw a mild overclock on computers I build for family members, if it seems worthwhile. I set a voltage of 1.30 and a multiplier of 47.0. That turned out to be the highest multiplier at which it could reliably run Cinebench at 1.3V. This was not really much faster than stock (nor is Cinebench a very difficult test, so a 24/7 stable OC would end up being lower), so I loaded BIOS defaults and decided not to bother. 5 months later, I'm very close to upgrading my own machine, so I decided to play around with her PC a little to see if maybe I could get a little more out of it if I spent a little more time. You know, as kind of a try before I buy. What I found was that it actually had 50MHz less headroom than it did before. Now if I try to run Cinebench at 4.7, it instantly reboots. At 4.675, Cinebench crashes during the second or third run, and if I go down another 25MHz, the test can finally pass. So the chip is still good enough to run at factory boost levels, but it's 50MHz worse than it was before if I try to throw a manual OC on it.
 
That's disconcerting. Good to know, thanks.
Maybe related, but one Gamers Nexus clip on YouTube mentioned that actual voltages (multimeter-tested) run even higher than reported, so a seemingly safe voltage can 1) actually be dangerous and 2) indeed cause degradation, which they apparently observed.

5 months is, well, nothing. At least to me, I run stuff for years.
 

That's a very good point. Another possible factor affecting this sample is motherboard quality. The board is an ASUS Prime B550 Plus. While it's not the worst, it's far from the best. I don't have the ability to measure it, but I would imagine that the voltage overshoot during transient loads is significantly worse on this board than it would be on something like a Dark Hero or Aorus Master. It is very likely that any degradation would be worse with a low-end board.
 
All of that stock boosting behavior is going to degrade these chips over time. [...] how long is the warranty again?
Warranty is 3 years on this. I might be overthinking the overclocking procedure for this particular chip; I just find it cumbersome that you have to jump through all these hoops just to do a simple overclock. PBO this and Curve that. Last time I had an AMD was an FX-55 (S939). I'm forced to leave it @ stock and let it auto overclock/boost or whatever it's called. To me this thing isn't in the same performance league as my last Intel. I let my "I want to upgrade" bug get to me and saw/read great things about Ryzen. However, I am not really impressed. I'd be happy to dump this thing for a Core i9-11900K & a Z590 board.

*Edit* Just found out MSI yanked the beta BIOS (1.72) and replaced it with 1.70. Maybe this will help fix the issues I'm having.
 

You feel like it's not in the same performance league as your 7700K? I would think that a 5800X would crush a 7700K a hundred different ways. I also think you'd find that overclocking an 11900K is just as "new school" as overclocking the Ryzen.
 
Trust me, it's not. My 7700K @ 5.0GHz will surpass the 5800X in several benchmarks, and vice versa. Maybe to you the 5800X is better than the 7700K, but to ME it's not. Yeah, more cores, blah blah. A buddy of mine has almost the exact same setup as mine, except he has a 5900X. He's been trying to get a stable moderate overclock since day one and he's just about ready to take a hammer to it. He gets the same issues as me, and he upgraded from a 5820K. He also prefers the 5820K over his 5900X. Not a coincidence.

I overclocked another buddy's 11700K with ease. I don't think it takes a rocket scientist to overclock an 11900K.

But thanks for the insight.
 
Yeah, I imagine a better board will at least be the better tested one.
In the case of my low-end A520 Aorus, if you look at the screens, the PBO limits are set in stone - PPT, TDC and EDC are all capped at 88W, 60A and 90A despite setting higher limits in the BIOS.
Doesn't stop me from setting a high voltage, cooling the crap out of it (so it maintains the higher clock) and thus experiencing degradation. Dunno - I'm very new to newer tech.
I like complicated TBH. I might get a better board for my Ryzen just to play around down the line.
 

Well, I came to the same conclusion you did - all-core is better than PBO. You end up with a tighter overclock going manual. You give up a little of the high-strung, high-voltage low-core boost, but it really doesn't cost any meaningful performance in anything but synthetic single-core tests. As for me, I'm going to stick with my i7-5960X for now. In some situations it's just as fast as the 5800X or slightly faster; in others it's as much as 30% slower. Zen 3's biggest bottlenecks are memory bandwidth and latency. I think I'll skip this generation and see what comes next with respect to ADL, Zen 3D, and the new TR.
 
I've been having very good results using PBO and Curve Optimizer on my 5900X. The Curve Optimizer is pretty amazing. I had a 3900X before (same motherboard), but the 3900X does not support the Curve Optimizer. PBO worked fine on the 3900X but it felt like the benefit over stock was much less without the curve optimizer. The curve optimizer with the 5900X really helps to maximize what you can get from your boost. I feel that with the curve optimizer, there is little reason to bother with an all-core overclock anymore. And since the best results from curve optimizer actually come from reducing the voltage, not increasing it, it seems to me that degradation is not going to be as big of a factor...

One of the reasons I'm so thrilled with Curve Optimizer is the amazing single-core performance. With PBO enabled, limits set to motherboard (Aorus X570 Ultra), +200 max boost, and Curve Optimizer set to -25 All Core, I'm seeing individual cores boost up to 5100MHz. That's absolutely amazing for certain games which aren't heavily multi-threaded. It hits 5100MHz while playing World of Warcraft, which helps tremendously considering that the game is very CPU-limited but not heavily multi-threaded. There are quite a few things where single-core performance still matters a lot. Curve Optimizer with PBO really helps you get the best of both worlds. Setting an all-core manual overclock is more likely to gimp you at this point, unless you are doing something that maxes out all the cores consistently. You're not going to get an all-core manual overclock of 5100MHz on Zen 3... and locking all cores to 4.7 or whatever is leaving untapped single-core performance on the table.

For Comparison:

In Cinebench R20:

3900X (PBO):
Multi: 7114
Single: 512

5900X (PBO. Curve Optimizer Disabled):
Multi: 8386
Single: 613

5900X (PBO w/ Curve Optimizer -25 All Core):
Multi: 8704
Single: 632

A buddy of mine has almost the same exact setup as mine except he has a 5900X. He's been trying to get a stable moderate overclock since day one and he's just about to get a hammer to it. He gets the same issues as me and he's upgraded from a 5820K. He also prefers the 5820K over his 5900X. Not a coincedence

I came from a 5820K also. While it was a nice chip, and it still does its job in my backup system, it's quite frankly not even in the same league as my 5900X. If your friend prefers the 5820K, he's either doing something wrong or has no idea what he's doing at all.

For Comparison to the above figures (Cinebench R20):

5820K (stock):
Multi: 2207
Single: 318

5820K (4.5GHz Manual All-Core Overclock):
Multi: 3136
Single: 426

When I went from the 5820K to the 3900X, it didn't seem like much of an upgrade. The single-core scores above reflect that, as they are still pretty close (426 for the 5820K vs 512 for the 3900X). But the 5900X is a massive bump in comparison: 426 for the 5820K vs 632 for the 5900X - we're talking about nearly a 50% increase in single-core performance at that point... not to mention the much higher multi-core performance...
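For what it's worth, the "nearly 50%" figure checks out if you run the arithmetic on the scores above (a quick sanity check, nothing more):

```python
def uplift_pct(old_score, new_score):
    """Percent improvement of new_score over old_score."""
    return (new_score / old_score - 1.0) * 100.0

# Cinebench R20 single-core scores quoted above
print(round(uplift_pct(426, 512)))  # 5820K (OC) -> 3900X: 20
print(round(uplift_pct(426, 632)))  # 5820K (OC) -> 5900X: 48, i.e. "nearly 50%"
```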
 
The thing with Cinebench is that it shows the performance of the cores and caches without hitting the RAM. The 5820K probably creams it in that department. Some games like SOTR and RDR2 are very sensitive to this.

My original 5820K with 6-year old Crucial 2400C16-rated RAM:

5820K 4.6.png


Upgraded later to a 5960X J-batch, same old RAM:

5960X at 4.75.jpg
 
I've been having very good results using PBO and Curve Optimizer on my 5900X. [...] If your friend prefers the 5820K, he's either doing something wrong or has no idea what he's doing at all.
Welp I'm happy for ya. :LOL: Anyhows, I swapped back to the 7700K and retired the 5800X. I might find a sucker to sell it off to. Maybe they would know better than I.

*Edit* Found a sucker. Just sold board, chip and waterblock. :D
 
Just to be fair... here is a benchmark where my sister's 5800X mopped the floor with my 5960X.

[email protected] all-core vs [email protected] in Y-Cruncher. I like this benchmark because not only does it hit the cores, cache, and RAM equally hard, it forces you to test with a 24/7-stable overclock, because it'll blue screen or melt your transistors if you try to cheat.

5800X 5960X y-cruncher.jpg
 
Welp I'm happy for ya. Funny how you would assume my buddy is doing something wrong or doesn't know what he's doing. I'm having the same issues, but thanks for that tidbit of your insight. :LOL:

Sounds like both of you are sticking to stubborn old habits regarding manual all-core overclocking - yet at the same time you're running into issues getting it to work well. Funny how that works. I had my PBO + Curve Optimizer set up within an hour of installing the 5900X. No need to endlessly mess with things. Just set the power limits according to your motherboard (usually a single setting where you just select "motherboard"), and bring the Curve Optimizer negative offset lower and lower until you become unstable, then raise it a few points. I got a reboot at a -30 offset, so I set it to -25 and it's been stable ever since. Eventually I'm going to dig in further and set a separate Curve Optimizer offset on each core, but that isn't really necessary to get the majority of the benefit from the Curve Optimizer.

But the thing is, even if you and/or your friend really don't want to learn a new way of doing things, it still wouldn't matter. Even 100% stock, the 5900X still dominates an overclocked 5820k. So even if you had a golden sample 5820k with a record-setting overclock, it would still not compare to a 100% stock 5900X. Why does your friend prefer the slower CPU? Just butt-hurt that he can't tweak it as much?
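That stepwise tuning (walk the negative offset down until something crashes, then back off a few points) reduces to a simple loop. A sketch in Python - `is_stable` is a hypothetical stand-in for a stress-test-and-watch-for-reboots cycle, since the real knob lives in the BIOS:

```python
def find_curve_offset(is_stable, step=5, floor=-30, margin=5):
    """Lower the Curve Optimizer all-core offset until instability,
    then back off by `margin` points (the procedure described above)."""
    offset = 0
    while offset > floor:
        candidate = offset - step
        if not is_stable(candidate):   # e.g. a reboot during stress testing
            return candidate + margin  # back off from the unstable point
        offset = candidate
    return offset  # hit the floor without ever going unstable
```

With a chip that reboots at -30, this lands on -25, matching the result described above.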
 
Actually, we both are butt-hurt plenty, thank you. Yes, I'm stubborn (maybe he is too). After months of reading guides and watching videos, I got fed the eff up. I tried helping him and he couldn't even get the damn thing stable at any clock. He said he preferred the 5820K as it didn't give him the brain tumors the AMD rig did. I'll be more than happy to go back to Intel. Yeah, I know it's a brand new learning curve, but at least I understand Intel more. One thing is this: I want a rig that won't crash, blue screen, or reboot whenever it decides to. I'm fed up and that's that. You can say whatever you like, and that's your opinion. Remember what they say about opinions. Thanks for the info tho.
 
my 7700K @ 5.0Ghz will surpass the 5800X in several benchmarks and vice-versa.

Which benchmark? The highest Cinebench R20 single-core score I've seen for the 7700K was 543, at 5.2GHz. Pretty amazing actually, but still below a stock 5800X or 5900X. You're obviously not talking about multi-threaded benchmarks while rocking a CPU with only 4 cores.

One thing is this: I want a rig that won't crash, blue screen or reboot whenever it decides to. I'm fed up and that's that.

If it's bringing you that much aggravation, have you ever considered simply running stock? A stock 5800X or 5900X would still outperform your old CPU. And you're right, everyone has an opinion. That's why I like to analyze actual data - so feel free to post some, like even one single benchmark that puts the 7700K above a 5800X or 5900X.
 

You AMD fan boys ignore the obvious sometimes...

intel vs amd.png
 
Which benchmark? [...] feel free to post some, like even one single benchmark that puts the 7700k above a 5800X or 5900X.
I never said the 7700K was better than the 5800X. I said I prefer the 7700K over the 5800X. I also said my friend prefers his 5820K over the 5900X. Never once did I say it was better. What's the point of posting benchmarks if the rig doesn't want to cooperate? If I'm gonna run stock, then why purchase a high-end (not top-end) board? Again, you keep missing my point. Fuck the benchmarks and the numbers. If the rig doesn't respond normally, then the benchmarks are useless. My Intel rig does exactly what I want it to do. The AMD rig does not. I thought I would enjoy my new upgrade, but in actuality it's been nothing but a nightmare. Change parts, swap this, check that. It's all the same. I dial in my Intel rig, and I can enjoy it. The damn AMD rig just gets all upset at any little tweak. Last time I had an AMD rig was an FX-55, and that didn't give me as much grief as this cursed thing.

So you want me to post benchmarks of what? The AMD rig? :ROFLMAO:
 
[...] 5900X (PBO. Curve Optimizer Disabled):
Multi: 8386
Single: 613
8386 seems very low for PBO on a 5900X. I am typically at 8500 +/- 80 at stock, depending on ambient (temps affect the standard boost clock quite a bit). With plain PBO I normally get around 9000.
 
I swapped back to the 7700K and retired the 5800X. [...]
Better get an AMD card if you plan to play newer games, because that 7700K will make anything from Nvidia above a 3060 Ti pointless :)
 
The 7700K is just a place holder until I decide which new intel chip I want. I'm gonna have to think about an AMD card. AMD has left a real bad taste in my mouth.
 
8386 seems very low for PBO on a 5900x. I am typically at 8500 +/- 80 on stock, depending on ambient (temps affect the standard boost clock quite a bit). With plain PBO I normally get around 9000.

It seems like I'm a bit thermal limited at the moment when all 12 cores are at full load. The 5900X will hit 90C during Cinebench (multi-core), which I believe is the default max temp for the 5900X. That is probably holding my multi-core score back a little bit, and probably part of the reason why I got such good results with the curve optimizer (lower voltage helps a thermal bottleneck). But the scores still seem pretty in-line with the numbers I see others posting on the internet, and my single-core score seems pretty good. I don't have real-world workloads that will load all 12-cores the way cinebench does, so the temps during the multi-core test are not something I'm too worried about at the moment. I think there is some untapped potential here and I look forward to exploring it, but since I've only had the 5900X for about a week I think I'm going to just enjoy it for a while before I dig in further.
 
It seems like I'm a bit thermally limited at the moment when all 12 cores are at full load. The 5900X will hit 90C during Cinebench (multi-core), which I believe is the default max temp for the 5900X. That is probably holding my multi-core score back a little, and probably part of the reason I got such good results with the curve optimizer (lower voltage helps a thermal bottleneck). Still, the scores seem pretty much in line with the numbers I see others posting, and my single-core score seems pretty good. I don't have real-world workloads that load all 12 cores the way Cinebench does, so the temps during the multi-core test aren't something I'm too worried about at the moment. I think there is some untapped potential here and I look forward to exploring it, but since I've only had the 5900X for about a week I'm going to just enjoy it for a while before I dig in further.
Most likely that is the problem; the boost will go higher with lower temps. When running Cinebench I am usually below 60 on stock and around the mid 70s with PBO. At 55 degrees the stock boost clock is north of 4300MHz in Cinebench, and it seems to drop around 50MHz for every 5 degrees of temperature. With PBO the amps will drop too, e.g. north of 220A in the low 70s but down to somewhere around 190A when the CPU is at 85 degrees.

It is quite normal to be in the 8300s on stock with the 5900x with a 360 AIO or a large tower cooler.
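The "about 50MHz per 5 degrees" behavior described above can be sketched as a toy linear model. The function name and parameters below are made up for illustration and fitted only to the numbers quoted in this thread (~4300MHz at 55C, ~10MHz lost per degree); the real Precision Boost algorithm also weighs PPT/TDC/EDC limits and load, so treat this as a rough mental model, not AMD's actual formula.

```python
# Hypothetical linear model of all-core Cinebench boost vs. temperature,
# fitted to the anecdotal numbers in this thread. Illustrative only.

def estimated_boost_mhz(temp_c, base_mhz=4300.0, ref_temp_c=55.0,
                        mhz_per_degree=10.0):
    """Rough all-core boost estimate at a given package temperature."""
    return base_mhz - mhz_per_degree * (temp_c - ref_temp_c)

for t in (55, 70, 85):
    print(f"{t} C -> ~{estimated_boost_mhz(t):.0f} MHz")
# 55 C -> ~4300 MHz, 70 C -> ~4150 MHz, 85 C -> ~4000 MHz
```

Plugging in the temperatures mentioned in the thread, a chip sitting in the mid 80s would be giving up roughly 300MHz versus one held at 55C, which lines up with why better cooling alone nets a few percent.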
 
All of that stock boosting behavior is going to degrade these chips over time. I've already seen a 5800X lose 50MHz in 5 months with no overclocking at all, except for the 2 brief tests I ran at 1.30V fixed. When it was new, it could loop Cinebench at 4.70. 5 months later, it can only do 4.65. Of course, the stock all-core speed/voltage is only 4.45 at 1.35V, so the user would never notice until it got a whole lot worse... how long is the warranty again?
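For scale, the degradation described above works out to about a 1% clock loss. This is just quick arithmetic on the numbers in the post (4.70GHz when new, 4.65GHz five months later), nothing new:

```python
# Quick arithmetic on the degradation anecdote: values come from the
# post above, not from new measurements.
old_ghz, new_ghz = 4.70, 4.65
loss_pct = (old_ghz - new_ghz) / old_ghz * 100
print(f"~{loss_pct:.1f}% clock loss in 5 months")
# ~1.1% clock loss in 5 months
```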
The stock boost clock depends on temp and you would get around 4.6ghz on stock PB in cinebench if your temps are around 70 degrees on the 5800x. It will of course be interesting to see how many people lose a lot of performance on their chips after running PBO constantly at 80-90 degrees over a few years. Personally I stay away from PBO, outside of testing it for fun and "OC" by cooling the chips. E.g. I get around 2-3% extra performance over a 360 AIO on my 5900x in my main rig by just keeping the chip very cool.
 
The stock boost clock depends on temp and you would get around 4.6ghz on stock PB in cinebench if your temps are around 70 degrees on the 5800x. It will of course be interesting to see how many people lose a lot of performance on their chips after running PBO constantly at 80-90 degrees over a few years. Personally I stay away from PBO, outside of testing it for fun and "OC" by cooling the chips. E.g. I get around 2-3% extra performance over a 360 AIO on my 5900x in my main rig by just keeping the chip very cool.

It looks like it hit 72C in R23, PBO off with a 280mm AIO. 4440MHz effective clock. This 5800X may not be the best sample, plus it's slightly degraded. I don't know if the chip has any way of knowing that it degraded....

5800X stock R23.jpg
 
Why are your watts at 114? Mine averaged 140W when I benched the 5800X in my secondary system, and it was around 4.6.

Nice catch - Now I'm realizing that was a run I did with a -0.05V offset when I was playing around. Same score and clocks as without, but lower temps for sure.

Unfortunately, I don't have regular access to the machine. When I was over at her house, I did a bunch of runs and uploaded them to HWBOT so I could review them later.

I'm surprised that I didn't save any other Cinebench runs. I guess I started to lose interest when I couldn't push the scores up no matter what I did. All I could do was make them worse.
 
8386 seems very low for PBO on a 5900x. I am typically at 8500 +/- 80 on stock, depending on ambient (temps affect the standard boost clock quite a bit). With plain PBO I normally get around 9000.
Definitely. I was seeing a full 4.2 (and change) during Prime95 in April; now it's more like 3.9.
It's starting to drop at around 70 deg in my case.

What I don't get in PBO is how come there is still boost headroom (utilized in low temps) even though Ryzen Master shows the PPT maxed out (and TDC almost maxed out).
 
Nice catch - Now I'm realizing that was a run I did with a -0.05V offset when I was playing around. Same score and clocks as without, but lower temps for sure.

Unfortunately, I don't have regular access to the machine. When I was over at her house, I did a bunch of runs and uploaded them to HWBOT so I could review them later.

I'm surprised that I didn't save any other Cinebench runs. I guess I started to lose interest when I couldn't push the scores up no matter what I did. All I could do was make them worse.
Did a run so you can see the boosts I get on my 5800X. It was a budget build, so the RAM in it isn't the greatest and the scores are so-so.

test2.png
 
Definitely. I was seeing a full 4.2 (and change) during Prime95 in April; now it's more like 3.9.
It's starting to drop at around 70 deg in my case.

What I don't get in PBO is how come there is still boost headroom (utilized in low temps) even though Ryzen Master shows the PPT maxed out (and TDC almost maxed out).
It runs higher clocks at the same voltage when temps are lower. Both regular PB and PBO run a certain clock at a given voltage depending on temp. Watts are roughly a function of amps and volts, but clocks are somewhat independent of these and depend on what the CPU can do without crashing at a given voltage. Lower temps mean it can reliably sustain higher clocks.
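A minimal sketch of the power arithmetic described here, using the illustrative amp figures quoted earlier in the thread (the 1.30V vcore and the function name are my assumptions for the example, not measured values). It shows how the package can sit near a power limit in both cases while the cooler chip sustains more current, and therefore more clock, at the same voltage:

```python
# Package power is (roughly) current times voltage. Clocks are decided
# separately, by what the silicon can sustain at that voltage and temp.
# Amp values are the illustrative ones from this thread; 1.30V is assumed.

def package_watts(amps, volts):
    """Rough package power from reported current and core voltage."""
    return amps * volts

cool = package_watts(220, 1.30)   # low 70s C: higher sustained amps
hot  = package_watts(190, 1.30)   # ~85 C: amps (and clocks) pulled back
print(f"cool: ~{cool:.0f} W, hot: ~{hot:.0f} W")
```

So even with PPT reading "maxed" in Ryzen Master, a drop in temperature frees up current headroom, and the boost algorithm spends it on higher clocks.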
 
I never said the 7700K was better than the 5800X. I said I prefer the 7700K over the 5800X. I also said my friend prefers his 5820K over the 5900X. Never once did I say it was better.

Trust me, it's not. My 7700K @ 5.0GHz will surpass the 5800X in several benchmarks, and vice versa.

LIES!

Anyway, obviously an 8-core AMD CPU has less of an advantage over an 8-core Intel CPU, but then AMD goes and adds more cores that Intel has no answer for. Plus, they are higher-binned parts. Intel's highest-binned parts have trouble keeping up with AMD's mid-binned parts. That's a problem for Intel (among many other things). The only reason I'd buy Intel right now is for the IGP.
 