4090: almost no performance loss at the 75% power limit sweet spot

Yeah, I had messed with reducing the PL when I got my Suprim 4090. At 90% PL, I did an OC of +150 core and +1000 memory and saw around a 5-7% performance boost (depending on benchmark) while consuming 50W less power, at 400W peak. Not sure if it was 100% stable there, but I gamed for a few hours without issues playing the Dead Space remake. Haven't started playing with the voltage yet though.
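For anyone who'd rather script this than drag the Afterburner slider, here's a minimal sketch using the NVML Python bindings. It assumes the nvidia-ml-py (pynvml) package and admin/root rights, and the 75% figure just mirrors the thread title, not a recommendation. `nvidia-smi -pl <watts>` does the same from a shell.

```python
# Minimal sketch: cap the first GPU at 75% of its default power limit.
# Assumes the nvidia-ml-py (pynvml) package and admin/root privileges.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)  # milliwatts
target_mw = int(default_mw * 0.75)

print(f"Default limit: {default_mw / 1000:.0f} W -> setting {target_mw / 1000:.0f} W")
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs admin rights

pynvml.nvmlShutdown()
```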
 
I get why people have done this in the past with temps, or if rendering for work... but for gaming on these cards, what's the point? I maxed out my power slider and voltage, with 3045MHz core and +1250MHz on the memory, and it never gets above 54C gaming for hours on end at 4K. Even at this, it rarely goes above 450W and uses less power than my 3090 did... lol. TBH, as cool as it stays, I'd push it more if it let me and actually used it... 😆
 
I get why people have done this in the past with temps, or if rendering for work... but for gaming on these cards, what's the point? I maxed out my power slider and voltage, with 3045MHz core and +1250MHz on the memory, and it never gets above 54C gaming for hours on end at 4K. Even at this, it rarely goes above 450W and uses less power than my 3090 did... lol. TBH, as cool as it stays, I'd push it more if it let me and actually used it... 😆

I’ve done it slightly to reduce noise (or rather, to set a max temp target / fan curve), but yeah, on these cards that are quiet and cost so much I fail to see the reason.

It’s basically only useful to counter power usage complaints by non-owners. I’d be surprised to find an owner who cared on a card that’s almost $2k.
 
I get why people have done this in the past with temps, or if rendering for work... but for gaming on these cards, what's the point? I maxed out my power slider and voltage, with 3045MHz core and +1250MHz on the memory, and it never gets above 54C gaming for hours on end at 4K. Even at this, it rarely goes above 450W and uses less power than my 3090 did... lol. TBH, as cool as it stays, I'd push it more if it let me and actually used it... 😆
It's because in particular games where I don't need all the power, I can reduce the voltage so I'm not putting any additional voltage/heat/power into the card for no reason. Even though mine is watercooled, I still don't need to heat up my room or abuse the card when I am already hitting my target of 144hz and well above it. If you're not hitting your monitor's target fps, or you have it uncapped, that is different, but I always run with gsync/vsync at 144hz on 4k.
 
I’ve done it slightly to reduce noise (or rather, to set a max temp target / fan curve), but yeah, on these cards that are quiet and cost so much I fail to see the reason.

It’s basically only useful to counter power usage complaints by non-owners. I’d be surprised to find an owner who cared on a card that’s almost $2k.
A lot of people don't want the extra heat. For me it's a big factor. Also, keeping the card cooler with less voltage is generally better for its lifespan if you plan on keeping it for years and milking it.
You don't have to; you can run it at 100% or over if you want. But if you are already at the fps limit and/or want to keep the rig/card/room cooler, the option is there, and quite luxurious if you use it in your favor.
 
It's because in particular games where I don't need all the power, I can reduce the voltage so I'm not putting any additional voltage/heat/power into the card for no reason. Even though mine is watercooled, I still don't need to heat up my room or abuse the card when I am already hitting my target of 144hz and well above it. If you're not hitting your monitor's target fps, or you have it uncapped, that is different, but I always run with gsync/vsync at 144hz on 4k.
That's the thing though: in games I have that are not pushing the card hard, even at 4k, the power usage is well below 400W to begin with, sometimes 350W or thereabouts. I'd think on these cards you would only be limiting based on whether you are hitting your monitor's refresh rate, but just using an FPS cap keeps the draw down too, I have found. I am in a similar boat with 4K gsync 144hz.

A lot of people don't want the extra heat. For me it's a big factor. Also, keeping the card cooler with less voltage is generally better for its lifespan if you plan on keeping it for years and milking it.
I can see the heat argument for some people. I have yet to use this card in summer, but so far my 3090 ran waaaayyyyyy hotter.

However, I've never heard of lifespan ever being an issue on any modern video card. I've pushed every single one over the last decade at max power, voltage, and overclock, and they all still work at the same settings I have always used. I always get the flagship card, but the ones I had prior get used in other PCs and are run at the same settings. My 3090 is on my entertainment system now and my 2080Ti is in my wife's PC; all still work the same. Maybe if I ran it that way for 10+ years? I dunno. In the absence of voltage modding, maxing the sliders out is still well within NVIDIA spec anyway.
 
It's because in particular games where I don't need all the power, I can reduce the voltage so I'm not putting any additional voltage/heat/power into the card for no reason. Even though mine is watercooled, I still don't need to heat up my room or abuse the card when I am already hitting my target of 144hz and well above it. If you're not hitting your monitor's target fps, or you have it uncapped, that is different, but I always run with gsync/vsync at 144hz on 4k.

I just vsync (gsync) to 75/100/144 depending on the game, capping it at those framerates. I usually adjust settings to about 70-80% usage so my minimums stay consistent. Then I drop the PL when needed here and there. Just food for thought…
 
That's the thing though: in games I have that are not pushing the card hard, even at 4k, the power usage is well below 400W to begin with, sometimes 350W or thereabouts. I'd think on these cards you would only be limiting based on whether you are hitting your monitor's refresh rate, but just using an FPS cap keeps the draw down too, I have found. I am in a similar boat with 4K gsync 144hz.


I can see the heat argument for some people. I have yet to use this card in summer, but so far my 3090 ran waaaayyyyyy hotter.

However, I've never heard of lifespan ever being an issue on any modern video card. I've pushed every single one over the last decade at max power, voltage, and overclock, and they all still work at the same settings I have always used. I always get the flagship card, but the ones I had prior get used in other PCs and are run at the same settings. My 3090 is on my entertainment system now and my 2080Ti is in my wife's PC; all still work the same. Maybe if I ran it that way for 10+ years? I dunno. In the absence of voltage modding, maxing the sliders out is still well within NVIDIA spec anyway.
Well, I've had a few cards die on me before. I can't say what the exact cause was, but perhaps it was heat? Yea, I keep the normal vsync/gsync on 4k at 144hz at all times. I don't let it go above that, never uncapped. The thing about the 75% power limit is that it's more of a hard cap on the power at all times. You only lose less than 5% of performance in exchange for 25% less power draw. That's a hell of a fair trade for me, since I still don't use the maximum capabilities of my 4090.

Even my 13900KS I keep undervolted, and I turn off some cores because I don't need the extra power/OC yet. It feels nice to undervolt the entire system and still hit 144hz max settings in all the games I play. Even with an fps cap, the voltage isn't always lower even if the power draw is lower, and voltage is the biggest heat producer. When you undervolt the cpu/gpu, it guarantees the lower voltage and thus cooler temps inside and outside of the case/room.
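Putting rough numbers on that trade (both figures here are the post's claims, not measurements):

```python
# Back-of-envelope perf-per-watt for the "75% PL, <5% loss" claim above.
stock_w = 450                 # typical 4090 stock board power
limited_w = stock_w * 0.75    # 75% power limit -> ~338 W
perf_retained = 0.95          # "less than 5%" performance loss

eff_gain = perf_retained / 0.75
print(f"Draw at 75% PL: {limited_w:.0f} W")
print(f"Perf per watt vs stock: {eff_gain:.2f}x (~{(eff_gain - 1) * 100:.0f}% better)")
# -> about 1.27x, i.e. roughly 27% more performance per watt
```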
 
I just vsync (gsync) to 75/100/144 depending on the game, capping it at those framerates. I usually adjust settings to about 70-80% usage so my minimums stay consistent. Then I drop the PL when needed here and there. Just food for thought…
I try to keep it at 144 in all games. I even mod the files on all my Batman Arkham games to 144 from the native 60, because it's so much nicer with the higher refresh. Interestingly enough, the Batman games push my 4090 quite a lot, at almost 80% usage. For such old games I am surprised, lol. Can still easily get away with the 75% power limit.
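For anyone curious, in the UE3-era Arkham games that edit is usually a one-line config change. The file name and path vary by title (e.g. BmEngine.ini in the game's Config folder), so treat this as a hedged example using the standard UE3 frame-smoothing settings rather than gospel:

```ini
; Hedged example: raising the frame cap in a UE3 title.
; Exact file (e.g. BmEngine.ini) and path vary per game; verify per title.
[Engine.GameEngine]
bSmoothFrameRate=TRUE
MinSmoothedFrameRate=22
; raised from the stock cap of roughly 60
MaxSmoothedFrameRate=144
```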
 
I was running a lower power target when I first got my 4090, but I cranked it back up to its default settings. These cards aren't drawing nearly as much power as I thought they would, and they are among the coolest-running GPUs I've ever owned. Outside of benchmarks and stress tests it's pulling a little more power than a 3080 and about the same as a 3080Ti. Combine that with its massive cooler, and I don't think longevity of the card due to heat-related failures is really going to be a problem.
 
I try to keep it at 144 in all games. I even mod the files on all my Batman Arkham games to 144 from the native 60, because it's so much nicer with the higher refresh. Interestingly enough, the Batman games push my 4090 quite a lot, at almost 80% usage. For such old games I am surprised, lol. Can still easily get away with the 75% power limit.
You want to cap at 2-3fps below the refresh rate. Honestly though, if you have gsync set up properly (gsync on, vsync on in the nvidia CP, vsync off in game) + low latency at ultra, you typically don't even need to cap, as it stays several fps below the refresh rate anyhow.
 
A lot of people don't want the extra heat. For me it's a big factor. Also, keeping the card cooler with less voltage is generally better for its lifespan if you plan on keeping it for years and milking it.
You don't have to; you can run it at 100% or over if you want. But if you are already at the fps limit and/or want to keep the rig/card/room cooler, the option is there, and quite luxurious if you use it in your favor.
Yeah, the heat these things dump into the room it's in is no joke.
 
You want to cap at 2-3fps below the refresh rate. Honestly though, if you have gsync set up properly (gsync on, vsync on in the nvidia CP, vsync off in game) + low latency at ultra, you typically don't even need to cap, as it stays several fps below the refresh rate anyhow.
Yea, it naturally stays a few fps beneath the 144 cap with V-sync and G-sync on. Although the low latency mode from the Nvidia control panel caused a bunch of stuttering issues, whereas the in-game low latency mode is perfect.
 
Yea, it naturally stays a few fps beneath the 144 cap with V-sync and G-sync on. Although the low latency mode from the Nvidia control panel caused a bunch of stuttering issues, whereas the in-game low latency mode is perfect.
That's odd. With top end HW you shouldn't get stutter with that setting. Could just be that game though.
 
That's odd. With top end HW you shouldn't get stutter with that setting. Could just be that game though.
I thought that specific mode in the NVidia control panel was for DX11 games only? If you are on DX12, the game handles the frame buffering itself, and NVidia Reflex within the game is what handles all that with DX12...
 
I thought that specific mode in the NVidia control panel was for DX11 games only? If you are on DX12, the game handles the frame buffering itself, and NVidia Reflex within the game is what handles all that with DX12...
Yes, I'm pretty sure that's right, though a bunch of the Batman games aren't DX12, right?
 
I thought that specific mode in the NVidia control panel was for DX11 games only? If you are on DX12, the game handles the frame buffering itself, and NVidia Reflex within the game is what handles all that with DX12...
Yeah, the in-game Reflex mode works well. It's the Nvidia control panel's low latency ultra mode that is buggy for me.
 
I have never in my life ever ever ever felt the heat in my apartment from a video card. What, do you guys live in shoeboxes? I have had my PC on full tilt in a one-bedroom apartment and have been freezing my balls off. Where do you guys live, on the Sun?

Who buys a 2 THOUSAND DOLLAR video card and goes, gee, I can't wait to save money on electricity and have 0 heat output, 'cause where I live I feel the tiniest of thermal changes?????

Is this a real argument? I have built high-end rigs since 1990 and never once even thought about electricity costs or thermals that could affect an entire room??? What?! I have personally never felt any change in temp anywhere, ever, from my PC.
 
I have never in my life ever ever ever felt the heat in my apartment from a video card. What, do you guys live in shoeboxes? I have had my PC on full tilt in a one-bedroom apartment and have been freezing my balls off. Where do you guys live, on the Sun?

Who buys a 2 THOUSAND DOLLAR video card and goes, gee, I can't wait to save money on electricity and have 0 heat output, 'cause where I live I feel the tiniest of thermal changes?????

Is this a real argument? I have built high-end rigs since 1990 and never once even thought about electricity costs or thermals that could affect an entire room??? What?! I have personally never felt any change in temp anywhere, ever, from my PC.
Not gonna lie... my gaming room / office is on the 2nd floor of my house, and it will get warm in the summer when gaming at full load, but I don't buy a $2K video card and care about the costs. What did I do? I bought a portable AC for just that room and run it when I game in the summer... lol. MAX overclock and power settings or bust! :D I also get emails every month from my power company saying I use 200% or more electricity a month compared to my neighbors... IDGAF. :ROFLMAO:
 
Not gonna lie... my gaming room / office is on the 2nd floor of my house, and it will get warm in the summer when gaming at full load, but I don't buy a $2K video card and care about the costs. What did I do? I bought a portable AC for just that room and run it when I game in the summer... lol. MAX overclock and power settings or bust! :D I also get emails every month from my power company saying I use 200% or more electricity a month compared to my neighbors... IDGAF. :ROFLMAO:
That's funny, because National Grid sends me emails telling me my power usage is 20% under my neighbors', and I run a 3090 at full tilt, overclocked. My electric bill is no more than $60 per month here in Worcester, MA. You guys must be using the fuck outta your utilities to go 200% over your neighbors. Holy shit! From what I hear about power usage on here, you would think you are tapping into the main grid, juicing the shit out of the power with these video cards. That is simply not true at all.
 
That's funny, because National Grid sends me emails telling me my power usage is 20% under my neighbors', and I run a 3090 at full tilt, overclocked. My electric bill is no more than $60 per month here in Worcester, MA. You guys must be using the fuck outta your utilities to go 200% over your neighbors. Holy shit! From what I hear about power usage on here, you would think you are tapping into the main grid, juicing the shit out of the power with these video cards. That is simply not true at all.
In all fairness, my house is older and the insulation sucks upstairs, so it gets pretty toasty in the summer with a PC pumping out heat too. The wife and I like it cool, so the AC runs almost 24/7 in the summer anyway to keep it at 66F; then add a portable AC that runs to keep my office cool when I am using it, and it starts to add up with everything else that uses power. The +200% number sounds big, but the cost is less than what I would spend if I had to drive to work every day and get gas, so it's still a win for me with WFH.
 
Well, do whatever makes you happy; that's what's important. When I buy a video card, heat and power usage don't even register in my brain. All I care about is performance.
 
I care about the extra heat during the summer, and the noise needed to cool it. So power and heat are definitely aspects of the decision for me. :)
Absolutely agree with this, GoldenTiger. Word for word. Heat in the summer and the noise needed to cool it mean everything, because performance is a given; heat and noise are not, lol 💯
 
When I was young, I was aggressive. I could run and jump, and I really loved to overclock to boost performance a bit.
Now I'm old, I am conservative. I enjoy laying back and sleeping, and I'm more than happy to undervolt for more efficiency.

Joking aside, this really depends on the technology development.
The ASIC technology of years ago, such as 10nm or 14nm or even older, had headroom to overclock.
Now the process is down to 5nm, 4nm, and keeps shrinking, so the headroom for higher voltage is also thinner. Better to take the efficiency direction.

Just my 2c.
 
I have never in my life ever ever ever felt the heat in my apartment from a video card. What, do you guys live in shoeboxes? I have had my PC on full tilt in a one-bedroom apartment and have been freezing my balls off. Where do you guys live, on the Sun?

Who buys a 2 THOUSAND DOLLAR video card and goes, gee, I can't wait to save money on electricity and have 0 heat output, 'cause where I live I feel the tiniest of thermal changes?????

Is this a real argument? I have built high-end rigs since 1990 and never once even thought about electricity costs or thermals that could affect an entire room??? What?! I have personally never felt any change in temp anywhere, ever, from my PC.
A box that is pumping out 600W of heat is going to affect the temperature of the room it's running in, regardless. I don't care so much about the power usage as I care about managing the room temperature. I don't enjoy swimming in ball soup.
 
I have never in my life ever ever ever felt the heat in my apartment from a video card. What, do you guys live in shoeboxes? I have had my PC on full tilt in a one-bedroom apartment and have been freezing my balls off. Where do you guys live, on the Sun?

Who buys a 2 THOUSAND DOLLAR video card and goes, gee, I can't wait to save money on electricity and have 0 heat output, 'cause where I live I feel the tiniest of thermal changes?????

Is this a real argument? I have built high-end rigs since 1990 and never once even thought about electricity costs or thermals that could affect an entire room??? What?! I have personally never felt any change in temp anywhere, ever, from my PC.
Maybe not everyone lives in their parents' unfinished basement. /s
 
Maybe not everyone lives in their parents' unfinished basement. /s
Your generation surely does love running its mouth. I worked hard my whole life and did it on my own, and I'm not some f****** teenager. Yet you talk as if you know me. Unreal.

Anyways, to the other posters in this thread, not the prick that just posted: I just bought a 4090 and I noticed no difference in temperature in my apartment. Anyways, good luck with whatever you buy.
 
Your generation surely does love running its mouth. I worked hard my whole life and did it on my own, and I'm not some f****** teenager. Yet you talk as if you know me. Unreal.
He was being sarcastic. Also, if you are not a teen, why did you asterisk out the f-word? Only teens tend to do that... note I did not put a "/s" on this as the other poster did.
 
Yeah, I didn't see the sarcasm mark. I just don't talk s*** regardless of whether it's sarcasm or not. I'm dictating into my phone; that's why the curse words are coming out with asterisks. I have been on this site for nearly 20 years, so the de facto truth and logic would be that I'm not a teenager at all. I'm actually 48 years old. Anyways, have a good day, take care.
 
Yeah, I didn't see the sarcasm mark. I just don't talk s*** regardless of whether it's sarcasm or not. I'm dictating into my phone; that's why the curse words are coming out with asterisks. I have been on this site for nearly 20 years, so the de facto truth and logic would be that I'm not a teenager at all. I'm actually 48 years old. Anyways, have a good day, take care.
Your mom clearly registered an account for you before you were born. :)
 
I literally do not get the "heat is important in my house" gang or the "electricity bill is too high" gang when buying a $2,000 video card that is meant strictly for gaming. I live in Massachusetts and need to turn the heat up every day because it's so freaking cold. These cards run cool as s***. Is anyone actually complaining that the 4090 is dumping loads of heat into their house, making it hot? Do you guys live in the Sahara desert? I owned a 3090 and am now on a 4090 and cannot heat up my house enough with the gas heat I have installed in my apartment. Please bring me your 4090 so I can heat up my house.
 
I literally do not get the "heat is important in my house" gang or the "electricity bill is too high" gang when buying a $2,000 video card that is meant strictly for gaming. I live in Massachusetts and need to turn the heat up every day because it's so freaking cold. These cards run cool as s***. Is anyone actually complaining that the 4090 is dumping loads of heat into their house, making it hot? Do you guys live in the Sahara desert?
The upstairs bedroom I have my PC in got noticeably hotter when running high-wattage cards pumping out 600+W total system wattage. Think about it, it's like a space heater... I have to run the AC more to make up for it. I'm not complaining about that part really, so much as the heat in the first place.
 
The upstairs bedroom I have my PC in got noticeably hotter when running high-wattage cards pumping out 600+W total system wattage. Think about it, it's like a space heater... I have to run the AC more to make up for it. I'm not complaining about that part really, so much as the heat in the first place.
Yeah man, that sucks that you noticed the heat difference. I can't heat up my apartment enough. I literally notice absolutely no difference between my 3090 and 4090 here in Massachusetts. I can't feel the heat at all. It's like an icebox right now.
 
Yeah man, that sucks that you noticed the heat difference. I can't heat up my apartment enough. I literally notice absolutely no difference between my 3090 and 4090 here in Massachusetts. I can't feel the heat at all. It's like an icebox right now.
Nice... Yeah my bedroom is on a third floor so it gets warm up there when it isn't winter. I also am in the northeast :).
 
These cards run cool as s***
That could be more because of the cooling solution than the amount of watts they dump into the room; look at power usage and you have the actual amount of heat, not temperature. To get an idea of how different heat and temperature can be: when you put your hand in an oven, the temperature is higher than in boiling water, but it does not feel that way (way more heat energy is transferred to your hand in boiling water despite the significantly lower temperature).
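To put a number on that: essentially every watt the PC draws ends up as heat in the room, regardless of what the GPU temperature sensor reads. A back-of-envelope sketch, assuming a made-up 40 m³ room and zero heat escaping (so it deliberately overstates what you'd actually feel):

```python
# Rough estimate: air temperature rise from a 600 W PC in a sealed room.
# The room size and zero-loss assumption are illustrative only.
pc_watts = 600.0
room_m3 = 4 * 4 * 2.5        # assumed 40 m^3 room
air_density = 1.2            # kg/m^3
air_cp = 1005.0              # J/(kg*K), specific heat of air

energy_j = pc_watts * 3600   # one hour of gaming
air_mass_kg = room_m3 * air_density
delta_t = energy_j / (air_mass_kg * air_cp)
print(f"~{delta_t:.0f} K rise per hour if no heat escaped")
# -> ~45 K/h; real rooms leak heat, but the space-heater comparison stands
```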

Yeah man that sucks that you noticed the heat difference. I can't heat up my apartment enough. I literally notice absolutely no difference between my 3090 or 4090 here in massachusetts. I can't feel the heat at all. Like an icebox right now.
Obviously no one minds the heat in cold weather; that is trivial. Do you not know that a lot of people have AC systems in their houses because they sometimes feel it is too hot?
 
Even in the hot months I don't notice the heat, though. And I've been building PCs since the 1990s. Maybe my body type is just indifferent to temperature changes; I can understand people being sensitive. I guess I'm blessed that I don't notice stuff like this. Also, I don't notice text abnormalities on monitors as much as people here seem to. I'm glad that I don't see these things; it makes me happier as a technician, I guess, and a technical person at that.
 
Some heaters have a 750-watt mode when set to low:
https://honeywellpluggedin.ca/heaters/honeywell-hce317bc-slim-ceramic-tower

That's not that different from a 4090 + 13900K system with the monitor. Imagine having a small heater going in a smallish room during a hot summer day; obviously, if you have an AC going strong enough, it is possible to not notice the temperature change (as there would not be one), it will just cost more than usual to run it.

It really depends where you live as well when talking cost. In California the average is around 30 cents per kilowatt-hour; if you game 1,000 hours over a video card's life (1 hour a day on average for 3 years), that's $30 USD per 100 watts of the video card, before the added AC cost. In the UK/Germany, prices got a bit crazy in 2022; they saw 50-60 cents. Imagine a 3,000-hour gamer: a 200-watt difference could end up costing nearly $200 (not sure how to calculate the strain on the AC; does it more than double the cost during most of the year?). Our brain feels it is little because of the distribution over time, even though a much lower $100 on the price tag when we buy would be a consideration.
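For what it's worth, that arithmetic checks out; a quick sketch using the rates and hours quoted above:

```python
# Quick sanity check on the electricity-cost figures quoted above.
def gaming_cost(extra_watts, hours, usd_per_kwh):
    """Lifetime energy cost of a given extra power draw."""
    return extra_watts / 1000 * hours * usd_per_kwh

# 100 W for 1,000 hours at the ~$0.30/kWh California average cited above
print(f"${gaming_cost(100, 1000, 0.30):.0f}")   # -> $30

# 200 W difference for a 3,000-hour gamer at the same rate
print(f"${gaming_cost(200, 3000, 0.30):.0f}")   # -> $180, i.e. nearly $200
```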

In colder parts of the world with low electricity prices, those elements are barely a factor, if at all. In some European countries and US states it makes some sense to think about it; maybe not a big deal, but if the decision is close it can be a tiebreaker.

And for some people with gamer kids who inherit dad's hardware over time, 2-3 systems running at the same time can start to add up.
 