How can I check the maximum power usage of my 4090?

xDiVolatilX

I am using FurMark, and on the 4K GPU stress test I am only pulling 315 W according to HWMonitor.

How can I pull the full 450 W? Which program? What settings?

My MSI Gaming X 4090 came with the 3x8 power adapter, not the 4x8 one. Any reason why? What is the max power a 3x8 4090 could/should be pulling in FurMark? Or is there another way to test max power draw/temps?

I'm sitting at 59°C right now on the 4K stress test in FurMark, with the 4090 pulling 320 W max.

Is this normal, or should I be hitting 450 W or more?
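
If you want an actual log of the peak instead of eyeballing a monitoring window, NVIDIA's NVML library can poll board power directly. Here's a minimal sketch, assuming the nvidia-ml-py bindings (pip install nvidia-ml-py) and a single GPU at index 0; the file name is just illustrative. Run it in a second window while the stress test loops, then Ctrl+C:

```python
# peak_power.py - poll NVML during a stress test and report the peak board power.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust if needed
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0  # mW -> W

peak_w = 0.0
try:
    while True:
        draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        peak_w = max(peak_w, draw_w)
        print(f"now {draw_w:6.1f} W | peak {peak_w:6.1f} W | "
              f"limit {limit_w:.0f} W | {temp_c} C", end="\r")
        time.sleep(0.5)
except KeyboardInterrupt:
    print(f"\npeak board power: {peak_w:.1f} W of a {limit_w:.0f} W limit")
finally:
    pynvml.nvmlShutdown()
```

Half-second polling can still miss millisecond transients, but it will catch the sustained peak that HWMonitor is showing you.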
 
With a 3x8 connector your max power should be 525 W, because you're forgetting about the 75 W it can pull from the PCIe slot.

It's likely FurMark is only using a portion of the GPU instead of all of it. I have no idea when it was last updated, but it is old and probably leaving large portions of the GPU unused. That's why you're not seeing high temps or large power draws. I think I fired up FurMark when I got my 6750 XT and saw similar results to yours; the power usage and temps weren't even remotely close to what I was expecting.

Since I'm still stuck at 1080p for the moment, I can't even challenge a 6750 XT graphically, but the one program I have used that tends to max out the GPU, or at least get close, is Folding@Home, a distributed computing project. So far that's the only thing that has pushed my GPU power and temps up.

Also, just because the power connectors for the card indicate a certain maximum wattage, it doesn't mean the card can or will pull that much. There are other things that can limit it. My old Sapphire Radeon RX 570 has an 8-pin connector, meaning it could use up to 225 W. However, the stock power limit is 125 W, and I never pushed it over 150 W. F@H was the only thing that would get close to or hit the max power usage; nothing graphical ever pushed the temps or power anywhere near what F@H would.

There may be graphical benchmark and stress-test software that will push the GPU to its max, but I don't know which programs those are anymore.
 
Try different resolutions and AA settings; FurMark will only drive a GPU to max power with certain combinations, and it's different for each GPU depending on things like shader compute power, fillrate, and bandwidth. IME, lower resolutions and/or no AA will drive power consumption higher.

Edit to add: I just checked on my shunt-modded 3090 Ti. FurMark 4K with MSAA only hit around 360 W at 99% usage, while 4K with no AA ran up to the 560 W power limit. Make sure you have AA disabled, or use at most 4x AA at a low resolution like 540p, and see if power consumption is higher. Does your monitoring tool say the 4090 is hitting a power limit? (GPU-Z reports this as the "Pwr" PerfCap reason.)
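
If you'd rather query that flag programmatically than squint at a sensor window, NVML exposes the same information as throttle-reason bits. A small sketch in the same vein as above (pynvml assumed; the software power-cap bit is what GPU-Z surfaces as the "Pwr" PerfCap reason):

```python
# throttle_check.py - is the GPU currently held back by its power limit?
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

reasons = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(handle)  # bitmask
if reasons & pynvml.nvmlClocksThrottleReasonSwPowerCap:
    print("GPU is at its power limit (GPU-Z would show PerfCap 'Pwr').")
else:
    print("GPU is not power-limit throttled right now.")
pynvml.nvmlShutdown()
```

Run it mid-stress-test: if the bit is clear while FurMark is maxed out, the workload simply isn't power-hungry enough, which supports the resolution/AA theory.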
 
Thanks for the responses, guys. I pulled over 400 W in Time Spy Extreme, so it does ramp up. I'm curious what the max is, though. If I pull higher I'll report back here.
Also, Little buddy, do you have the 3x8 connection or the 4x8 one?
 
I have a TUF 4090 and initially I had only three of the four plugs on the adapter plugged in.

I didn't check power draw, but I could clock both mem and core a good deal higher without apps crashing when I finally got around to adding a fourth GPU power cable to the adapter.

The highest power draw I see now is around 520-530 W, so I would recommend ordering a 4x8-pin adapter if you want to take full advantage of the card.
 
With a 3x8 connector your max power should be 525 W, because you're forgetting about the 75 W it can pull from the PCIe slot.
I don't believe that first statement is true. I don't have a 4090 to test, but from testing my own 4080, these cards no longer draw much power from the slot itself; it's almost all coming from the external plugs. So a 4090 with only three plugs should max out at 450 W-ish, which matches NVIDIA's claim that when only three plugs are sensed, the card gets power limited to 450 W (3 × 150 W external PCIe).

On my 4080 at full load I'm not seeing more than a couple of watts pulled from the slot; it's all from the external power source. In fact, I see more draw from the slot when the card is idling (7 W, versus 2.5 W at full load). This is why the 4080 will refuse to run off only two power plugs even though two plugs plus the slot would be enough for its 320 W max: by design, it can't/won't use the slot for any real power needs.
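
One way to settle what limit the card actually enforces, rather than inferring it from connector counts, is to ask the driver. A sketch using the same pynvml bindings as earlier; if the claim above holds, a 3x8 4090 should report an enforced limit around 450 W:

```python
# limits.py - print the enforced power limit and the range the vBIOS allows.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

enforced_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"enforced limit: {enforced_w:.0f} W")
print(f"allowed range:  {min_mw / 1000.0:.0f} W to {max_mw / 1000.0:.0f} W")
pynvml.nvmlShutdown()
```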
 
Side question that is very relevant: where are all the 4x8-pin adapters? I can't find any, not even from CableMod. Is there no way for me to get a good power supply? Even if I got a PCIe 4 one, are there no 4x8-pin cables that can be bought?
I have a 4090 and a 13900KS, so a 1000 W unit is cutting it too close. I want a 1500 W unit, but there is none?
And no 16-pin to 4x8-pin cables either? Let alone a dedicated one.

Help?
 
Some cards are golden samples. I've seen some channels test multiple 4090s: some cards will pull 450 W and others as little as 320 W on the same workload at stock voltages, without even needing to undervolt. (I.e., there are even more gains possible from undervolting.)
 
I am using FurMark, and on the 4K GPU stress test I am only pulling 315 W according to HWMonitor.

How can I pull the full 450 W? Which program? What settings?
I use GPU-Z

[GPU-Z screenshot]


Click in the field where the value is displayed to toggle between the current reading (default), min, max, and avg.
If the performance is good, I wouldn't worry about it. Try 3DMark Time Spy Extreme to put a full load on it. Run GPU-Z with the "GPU Load" field set to max, so you can see how hard FurMark pushes it.

I've found that exceeding the power target, even all the way to 133%, only added 1 or 2% performance. For my card the 100% power target is good; 95% only loses 0.9% performance, the efficiency sweet spot.
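
For what it's worth, that power-target experiment can also be scripted instead of dragged around in Afterburner. A hedged sketch via NVML's limit setter; it needs admin/root, the driver clamps values to the vBIOS range, and the 95% figure is just the sweet spot mentioned above:

```python
# set_power_limit.py - set the GPU power target programmatically (admin/root).
import pynvml

TARGET_PERCENT = 95  # the ~95% efficiency sweet spot described above

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
target_mw = int(default_mw * TARGET_PERCENT / 100)
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # clamped by vBIOS
print(f"power limit set to {target_mw / 1000.0:.0f} W "
      f"({TARGET_PERCENT}% of the {default_mw / 1000.0:.0f} W default)")
pynvml.nvmlShutdown()
```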
 
I've found that exceeding the power target, even all the way to 133%, only added 1 or 2% performance.
I did some testing in Metro Exodus; my card's seeing 4% for a 600 W limit. My guess is that when more games utilize RT fully, the 133% PL will be more useful. Even Metro can't completely pull 600 W on my card; maybe it could with a better CPU. The highest I've seen it pull is 589 W.

Power Limiting
Core OC (MHz) | VRAM OC (MHz) | Max Voltage | Power Limit (%) | Benchmark | Score | Avg. FPS | Peak Watts | Perf Diff (vs. PL133) | Perf Diff (vs. Stock)
--- | --- | --- | --- | --- | --- | --- | --- | --- | ---
Stock | Stock | Stock | Stock | Metro Exodus Enhanced 4K Extreme | 7958 | 76.01 | 441.82 | -8.04% | Baseline
135 | 1600 | 1100 mV | 133 | Metro Exodus Enhanced 4K Extreme | 8654 | 82.56 | 582.65 | Baseline | 8.75%
135 | 1600 | 1100 mV | 100 | Metro Exodus Enhanced 4K Extreme | 8287 | 79.13 | 441.84 | -4.24% | 4.13%
135 | 1600 | 1100 mV | 90 | Metro Exodus Enhanced 4K Extreme | 8103 | 77.44 | 402.69 | -6.36% | 1.82%
135 | 1600 | 1100 mV | 80 | Metro Exodus Enhanced 4K Extreme | 7826 | 74.81 | 361.34 | -9.57% | -1.65%
135 | 1600 | 1100 mV | 70 | Metro Exodus Enhanced 4K Extreme | 7390 | 70.71 | 315.69 | -14.61% | -7.14%
135 | 1600 | 1100 mV | 60 | Metro Exodus Enhanced 4K Extreme | 6370 | 61.04 | 269.98 | -26.39% | -19.95%
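
Reading that table another way: efficiency rises steadily as the limit drops. A quick worked calculation from the numbers above (average FPS divided by peak watts):

```python
# fps_per_watt.py - efficiency at each power limit, from the table above.
rows = [  # (power limit %, avg fps, peak watts)
    (133, 82.56, 582.65),
    (100, 79.13, 441.84),
    (90,  77.44, 402.69),
    (80,  74.81, 361.34),
    (70,  70.71, 315.69),
    (60,  61.04, 269.98),
]
for pl, fps, watts in rows:
    print(f"PL {pl:>3}%: {fps / watts:.3f} fps/W")
```

That works out to roughly 0.142 fps/W at 133% versus 0.224 fps/W at 70%, so the last few hundred watts buy very little.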
 
In Afterburner, for the core voltage, are the units percent?
It can be adjusted from 0 to 100, but double the core voltage seems dangerous. Does anyone have information on what is typically safe for air-cooled 4090s?

I don't think I would go more than 10%.
 
In Afterburner, for the core voltage, are the units percent?
It can be adjusted from 0 to 100, but double the core voltage seems dangerous. Does anyone have information on what is typically safe for air-cooled 4090s?

I don't think I would go more than 10%.
100 will give you 1.1 V maximum on the card, and yes, it's safe.
 
Can I run FurMark and, say, Cinebench at the same time to test the maximum power draw of the entire system? Or which two programs in combination would work best for max power? I'm getting ready to test the Kill A Watt meter on my 1050 W PC Power & Cooling FireStorm Gold PSU.
 
Some cards are golden samples. I've seen some channels test multiple 4090s: some cards will pull 450 W and others as little as 320 W on the same workload at stock voltages, without even needing to undervolt. (I.e., there are even more gains possible from undervolting.)
I saw a peak of about 453 W while gaming the other night. I had the power limit in Afterburner set to 106%, with the included triple 8-pin to 16-pin adapter that comes with the 4090 Gaming X.
 
I suppose you could play a PC game that loads your GPU, or overclock the GPU for higher wattage.

Took a screen grab from the PC game Scorn: it was at 555 W with 97% GPU usage. I have seen 600 W on different RTX 4090 cards I own, but I've also seen those cards run better than the 3080/3090 cards I own on fewer watts.
[screenshot: Scorn at 555 W, 97% GPU usage]
 