I get why people have done this in the past with temps, or if rendering for work... but for gaming on these cards, what's the point? I maxed my power slider and voltage to the max, with 3045 MHz core and +1250 MHz on the memory, and it never gets above 54C gaming for hours on end at 4K. Even at this, it rarely goes above 450W and uses less power than my 3090 did... lol. TBH, as cool as it stays, I'd push it more if it let me and actually used it...
It's because on particular games where I don't need all the power, I can reduce the voltage to not put any additional voltage/heat/power into the card for no reason. Even though mine is watercooled, I still don't need to heat up my room or abuse the card when I am already hitting my target 144hz and well above it. If you're not hitting your monitor's target fps or you have it uncapped, that is different, but I always run with gsync/vsync 144hz on 4k.
I've done it slightly to reduce noise (or rather, set a max temp target / fan curve), but yeah, on these cards that are quiet and cost so much I fail to see the reason.

A lot of people don't want the extra heat. For me it's a big factor. Also, keeping the card cooler with less voltage is generally better for its lifespan if you plan on keeping it for years, milking it.
It’s basically only useful to counter power usage complaints by non-owners. I’d be surprised to find an owner who cared on a card that’s almost $2k.
That's the thing though, in the games I have, even at 4k, that are not pushing the card hard, the power usage is well below 400W to begin with, sometimes 350W or thereabouts. I'd think on these cards you would only be limiting based on whether you are hitting your monitor's refresh rate, but just use an FPS cap and that keeps the draw down too, I have found. I am in a similar boat with 4K gsync 144hz.
I can see the heat for some people, I have yet to use this card in summer, but so far, my 3090 ran waaaayyyyyy hotter.
Well, I've had a few cards die on me before. I can't say what the exact cause was, but perhaps it was from heat? Yea, I keep the normal vsync/gsync on 4k at 144hz at all times. I don't let it go above that, never uncapped. The thing about the 75% power limit is it's more like a hard cap for the power at all times. You only lose less than 5% of performance in exchange for 25% less power draw. That's a hell of a fair trade for me since I still don't use the maximum capabilities of my 4090. Even my 13900KS I keep undervolted and turn off some cores because I don't need the extra power/oc yet. It feels nice to undervolt the entire system and still hit 144hz max settings in all the games I play. Even if you have an fps cap, the voltage isn't always lower even when the power draw is lower. Voltage is the biggest heat producer. When you undervolt the cpu/gpu it guarantees the lower voltage and thus cooler/lower temps inside and outside of the case/room.
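For anyone who wants to script that 75% cap instead of dragging a slider, here's a minimal sketch using NVIDIA's NVML Python bindings (`pip install nvidia-ml-py`). The 75% figure and the 450 W ballpark come from the posts above; the set call needs admin/root rights, and the allowed range is whatever your vBIOS reports. The quadratic voltage term in dynamic power (P ≈ C·V²·f) is also why undervolting cuts heat faster than it cuts performance.

```python
# Sketch: apply a 75% power limit through NVML (pip install nvidia-ml-py).
# Values are illustrative; nvmlDeviceSetPowerManagementLimit needs admin/root.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)  # milliwatts
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)

target_mw = int(default_mw * 0.75)               # e.g. 450 W stock -> ~338 W
target_mw = max(min_mw, min(target_mw, max_mw))  # clamp to the vBIOS range

print(f"default {default_mw / 1000:.0f} W -> capping at {target_mw / 1000:.0f} W")
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```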
However, I've never heard of lifespan ever being an issue on any modern video card. I've pushed every single one over the last decade at max power, voltage and overclock, and they all still work at the same settings I have always used. I always get the flagship card, but the ones I had prior get used in other PCs and are run at the same settings. My 3090 is on my entertainment system now and my 2080Ti is in my Wife's PC; all still work the same. Maybe if I ran it that way for 10+ years? I dunno. In the absence of voltage modding, maxing sliders out is still well within nvidia spec anyway.
I just vsync (gsync) to 75/100/144 depending on the game, capping it at those framerates. I usually adjust settings to about 70-80% usage so my minimums stay consistent. Then I add the PL when needed here and there. Just food for thought….

I try to keep it at 144 on all games. I even mod the files on all my Batman Arkham games to 144 from the native 60 because it's so much nicer with the higher refresh. Interestingly enough, the Batman games push my 4090 quite a lot, at almost 80% usage. For such an old game I am surprised lol. Can still easily get away with the 75% power limit.
You want to cap at 2-3fps below refresh rate. Honestly though, if you have gsync set up properly (gsync on, vsync on in nvidia CP, vsync off in game) + low latency at ultra, you typically don't even need to cap, as it stays several fps below refresh rate anyhow.
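To put numbers on the 2-3 fps rule: capping just below refresh keeps every frame time a touch longer than the monitor's refresh interval, so frames never pile up and trigger V-Sync's latency-adding queue. A quick worked example (144 Hz because that's what this thread runs; the cap offset is an assumption taken from the post above):

```python
# Why you cap a few fps below refresh with G-Sync: the cap forces frame
# times slightly above the refresh interval, keeping G-Sync in control.
refresh_hz = 144
cap_fps = refresh_hz - 3                 # the "2-3 fps below" rule -> 141 fps

refresh_interval_ms = 1000 / refresh_hz  # ~6.94 ms between refreshes
capped_frame_ms = 1000 / cap_fps         # ~7.09 ms between frames

headroom_ms = capped_frame_ms - refresh_interval_ms
print(f"cap {cap_fps} fps -> {headroom_ms:.2f} ms headroom per frame, "
      f"so the GPU never outruns the {refresh_hz} Hz panel")
```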
Yeah, the heat these things dump into the room it's in is no joke.
You don't have to; you can run it at 100% or over if you want. But if you are already at the fps limit and/or want to keep the rig/card/room cooler, etc., the option is there, and it's quite luxurious if you use it in your favor.
Yea, it naturally stays a few fps beneath the 144 cap with V-sync and G-sync on. Although low latency mode from the Nvidia control panel caused a bunch of stuttering issues, whereas the in-game low latency mode is perfect.
That's odd. With top end HW you shouldn't get stutter with that setting. Could just be that game though.
Thought that specific mode in the Nvidia control panel was for DX11 games only? If you are on DX12, the game handles the frame buffering, and Nvidia Reflex within a game is what handles all that with DX12...
Yes, I'm pretty sure that's right, though a bunch of the Batman games aren't DX12, right?
Yeah the Batman games are DX11.
Yeah, the in-game Reflex mode works well. It's the Nvidia control panel's low latency ultra mode that is buggy for me.
I have never in my life ever ever ever felt the heat in my apartment from a video card? What do you guys live in, shoeboxes? I have had my PC on full tilt in a one-bedroom apartment and have been freezing my balls off. Where do you guys live, on the Sun?

Who buys a 2 THOUSAND DOLLAR video card and goes gee, I can't wait to save money on electricity and have 0 heat output, cause where I live I feel the tiniest of thermal changes?????

Is this a real argument? I have built high end rigs since 1990 and never once even thought about electricity cost or thermals that could affect an entire room??? What?! I have personally never felt any change in temp anywhere ever from my PC.

Not gonna lie... my gaming room / office is on the 2nd floor of my house, and it will get warm in the summer when gaming at full load, but I don't buy a $2K video card and care about the costs. What did I do? I bought a portable AC for just that room and run it when I game in the summer... lol. MAX overclock and power settings or bust! I also get emails every month from my power company saying I'm using 200% or more electricity a month than my neighbors... IDGAF.
That's funny, because National Grid sends me emails telling me I'm 20% under the power usage of my neighbors, and I run a 3090 at full tilt, overclocked. My electric bill is no more than $60 per month here in Worcester, MA. You guys must be using the fuck outta your utilities to go 200% over your neighbors. Holy shit! From what I hear about power usage on here, you would think you are tapping into the main grid, juicing the shit out of the power with these video cards. That is simply not true at all.
In all fairness, my house is older and the insulation sucks upstairs, so it gets pretty toasty in the summer with a PC pumping out heat too. The Wife and I like it cool, so the AC runs almost 24/7 in the summer anyway to keep it at 66F; then add a portable AC that runs to keep my office cool when I am using it, and it starts to add up with everything else that uses power. The +200% number sounds big, but the cost is less than what I would spend on gas if I had to drive to work every day, so it's still a win for me with WFH.
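Since the bill keeps coming up, it's easy to put rough numbers on it. A back-of-envelope sketch where the wattage, hours, and electric rate are all assumptions to swap for your own:

```python
# Back-of-envelope monthly cost of GPU gaming power draw.
# Every input below is an assumption; plug in your own numbers.
gpu_watts = 450        # maxed-out 4090 under load, per the posts above
hours_per_day = 4      # assumed gaming time
days_per_month = 30
usd_per_kwh = 0.25     # assumed residential rate

kwh_per_month = gpu_watts / 1000 * hours_per_day * days_per_month  # 54 kWh
cost = kwh_per_month * usd_per_kwh                                 # ~$13.50

print(f"{kwh_per_month:.0f} kWh/month -> about ${cost:.2f} for the GPU alone")
```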
Well, do whatever makes you happy, that's important. When I buy a video card, heat and power usage don't even register in my brain. All I care about is performance.

I care about the extra heat during the summer, and the noise needed to cool it. So power and heat are definitely aspects of the decision for me.
Absolutely agree with this, GoldenTiger. Word for word. Heat in the summer and the noise needed to cool it mean everything, because performance is a given; heat and noise are not lol
A box that is pumping out 600W of heat is going to affect the temperature of the room it's running in, regardless. I don't care so much about the power usage as I care about managing the room temperature. I don't enjoy swimming in ball soup.
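The 600 W point survives a basic physics sanity check. An idealized worked example for a sealed, perfectly insulated 12 x 12 x 8 ft room; real rooms leak heat through walls and HVAC, so this is an upper bound, not a prediction:

```python
# Idealized heating rate of a sealed room from a 600 W PC.
# Room size is an assumed example; no heat loss is modeled.
FT3_TO_M3 = 0.3048 ** 3

room_m3 = 12 * 12 * 8 * FT3_TO_M3    # ~32.6 m^3 of air
air_density = 1.2                    # kg/m^3
air_cp = 1005.0                      # J/(kg*K), specific heat of air

air_mass_kg = room_m3 * air_density  # ~39 kg
watts = 600.0
joules_per_hour = watts * 3600       # 2.16 MJ dumped per hour

delta_t_per_hour = joules_per_hour / (air_mass_kg * air_cp)
print(f"~{delta_t_per_hour:.0f} °C rise per hour with zero heat loss")  # ~55
```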
Maybe not everyone lives in their parents' unfinished basement. /s
Your generation surely does love running its mouth. I worked hard my whole life and did it on my own, and I'm not some f****** teenager. Yet you talk as if you know me. Unreal.
He was being sarcastic. Also, if you are not a teen, why did you asterisk out the f word? Only teens tend to do that... note I did not put a "/s" on this as the other poster did.
Yeah, I didn't see the sarcasm mark. I just don't talk s*** regardless of whether it's sarcasm or not. I'm talking into my phone; that's why the curse words are coming out with asterisks. I have been on this site for nearly 20 years, so the de facto truth and logic would be that I'm not a teenager at all. I'm actually 48 years old. Anyways, have a good day, take care.

Your mom clearly registered an account for you before you were born.
I literally do not get the "heat is important in my house" gang or the "electricity bill is too high" gang when buying a $2,000 video card that is meant strictly for gaming. I live in Massachusetts and need to turn the heat up every day because it's so freaking cold. These cards run cool as s***. Is anyone actually complaining that the 4090 is dumping loads of heat into their house, making it hot? Do you guys live in the Sahara desert?

The upstairs bedroom I have my PC in got noticeably hotter when running high-wattage cards pumping out 600+W total system wattage. Think about it, it's like a space heater... I have to run the AC more to make up for it. I'm not complaining about that part really so much as the heat in the first place.
Yeah man, that sucks that you noticed the heat difference. I can't heat up my apartment enough. I literally notice absolutely no difference between my 3090 or 4090 here in Massachusetts. I can't feel the heat at all. Like an icebox right now.
Nice... Yeah, my bedroom is on a third floor, so it gets warm up there when it isn't winter. I'm also in the Northeast.
The "these cards run cool" part could be more because of the cooling solution than the amount of watts they dump into the room. Look at power usage and you have the actual amount of heat, not temperature. To get an idea of how different heat and temperature can be: when you put your hand in an oven, the temperature is higher than in boiling water, but it does not feel that way (way more heat energy is transferred to your hand in boiling water despite a significantly lower temperature).
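The oven vs boiling water point checks out with rough convection numbers: heat flux is q = h · A · ΔT, and water's convection coefficient h is orders of magnitude higher than air's, so a lower temperature can still mean far more heat transferred. The h values and hand area below are order-of-magnitude assumptions:

```python
# Heat flux into a hand: hot oven air vs boiling water, q = h * A * dT.
# Convection coefficients and hand area are order-of-magnitude assumptions.
hand_area_m2 = 0.04   # rough surface area of a hand
skin_c = 35.0

h_air = 15.0          # W/(m^2*K), free convection in air (assumed)
q_oven = h_air * hand_area_m2 * (200.0 - skin_c)        # ~99 W

h_water = 1000.0      # W/(m^2*K), convection in water (assumed, conservative)
q_boiling = h_water * hand_area_m2 * (100.0 - skin_c)   # ~2600 W

print(f"200 °C oven air: ~{q_oven:.0f} W; 100 °C boiling water: ~{q_boiling:.0f} W")
```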
Obviously no one minds the heat when it's cold out; that is trivial. Do you not know that a lot of people have AC systems in their houses because they sometimes feel it is too hot?