Just bought an EVGA RTX 3070 Ti FTW3. Can I run it with my current EVGA 500W Bronze power supply for 1080p use?

BigZuesz

n00b
Joined
Jan 11, 2022
Messages
17
I plan on upgrading to a whole new system in the next month, but I want to use this setup temporarily till I get the rest of my parts. I'm going from micro-ATX to mini-ITX, and correct me if I'm wrong, but I'm not sure the smaller power supply that mini-ITX builds use will fit in a big case.
My current specs are:
B450 motherboard (micro-ATX),
16 GB DDR4-3100 Corsair Vengeance RAM,
1 TB M.2 SSD,
GTX 1050 Ti GPU (my RTX 3070 Ti FTW3 is coming in a few days),
Ryzen 5 2600X,
EVGA 500 BR Bronze power supply
 
If you don't game, or only play low-demand games, it might be alright to get by on.

 
Yeah, I definitely need to upgrade then. Thanks.
No prob. I know people will say you can get by on lower PSUs, but I always recommend going with at least what the GPU maker recommends, and adding more if you're overclocking. Especially with the 3000 series, as they like to trip up PSUs. I think GN said it's the "transient loads," whatever that means. Spikes, maybe?
 
I don't know the power needs of the Ti. I run my 3070, which is a 250-watt card in 3D mode, on the Corsair CX650M that Best Buy carries; they run sales on that unit a lot, and it was $59 when I bought my first one.

I have two of them in different systems, going on 3 years so far without any issues, with the other unit powering an RX 580 8GB in this small case:

https://www.newegg.com/red-raijintek-styx-series-micro-atx/p/2AM-002C-00030

It has an MSI B350M Gaming Pro board with a Ryzen 1600 AF and 32GB of memory. It's never had any heat issues in that case with 4 fans since 2019, and that RX 580 is a hot card by design.

Edit: I missed Pendragon 1's post. 750W is 100 more than what the 3070 needs.
 
The 3070 Ti can pull pretty close to 300W gaming, and I'm not at all sure you can count on a lower resolution pulling consistently less. Add 100+ for the CPU and you only have 100W left for everything else. I think your 500W supply will work, until it doesn't...

650W would probably do just fine. No harm in going to 750W if you aren't spending a bunch more for it.
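The budget in that post works out like this. A quick back-of-the-envelope sketch; the wattage figures are the rough estimates from this thread, not measured values:

```python
# Rough PSU headroom estimate using the figures discussed above.
gpu_peak_w = 300     # 3070 Ti while gaming (approximate)
cpu_peak_w = 100     # CPU under load (approximate)
rest_w = 100         # board, RAM, SSD, fans ("everything else")
psu_rated_w = 500

total_w = gpu_peak_w + cpu_peak_w + rest_w
headroom_w = psu_rated_w - total_w
load_pct = 100 * total_w / psu_rated_w

print(f"Estimated draw: {total_w} W on a {psu_rated_w} W unit "
      f"({load_pct:.0f}% load, {headroom_w} W to spare)")
```

With these numbers the 500W unit lands at 100% of its rating with zero watts to spare, which is why "it'll work until it doesn't" is a fair summary.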
 
No prob. I know people will say you can get by on lower PSUs, but I always recommend going with at least what the GPU maker recommends, and adding more if you're overclocking. Especially with the 3000 series, as they like to trip up PSUs. I think GN said it's the "transient loads," whatever that means. Spikes, maybe?
You’d be surprised how quickly power requirements go up when you OC.

As for OP's question: a) yes, you should get a higher-capacity PSU; b) compared to the value of your 3070 Ti, a PSU is a pittance.
 
Low res doesn't make your GPU work less, but possibly more. But as the advice is going, I agree on no. Would be a cool test, though. Will it crash? Kill the PSU? Or throttle?
 
FWIW, I've been watching my power consumption closely on the first system in my sig (3070 Ti, 5800X), and total system power consumption on the AC side (measured by my UPS, connected to HWiNFO) is between 450W and 550W in-game. The GPU alone is 250-290W stock, or 300-380W overclocked, depending heavily on the game and graphics settings. The 3070 Ti has higher power requirements than the regular 3070 thanks to GDDR6X.
I have a feeling that the power spikes on GA104 cards aren't quite as extreme as on GA102, but it's absolutely something to keep in mind, since the 3070 Ti is running GA104 as hard as it'll go. One of these days I'm going to hook up one of my oscilloscopes across the shunts on my card and see for myself how spiky GA104 is, since all the tests I've seen are on GA102 cards.
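On those transient loads: a rough way to reason about whether a millisecond-scale spike trips a PSU's protection is to compare the spiked draw against the unit's over-power protection (OPP) trip point. This is an illustrative sketch only; the 1.8x spike multiplier and the 650W trip point are assumptions for the example, not measurements:

```python
# Illustrative check: does a transient GPU spike exceed the PSU's
# over-power protection (OPP) threshold?
def trips_opp(gpu_sustained_w: float, spike_multiplier: float,
              rest_of_system_w: float, opp_limit_w: float) -> bool:
    """True if the momentary spike would exceed the OPP trip point."""
    spike_w = gpu_sustained_w * spike_multiplier + rest_of_system_w
    return spike_w > opp_limit_w

# A 290 W card spiking to ~1.8x, plus ~150 W for the rest of the system,
# against a 500 W unit assumed to trip OPP around 650 W:
print(trips_opp(290, 1.8, 150, 650))  # 290*1.8 + 150 = 672 W -> True
```

The point of the sketch is that a card whose sustained draw fits the budget can still shut the system off if its spikes clear the OPP threshold, which matches what the scope tests on GA102 cards showed.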
 
Low res doesn't make your GPU work less, but possibly more. But as the advice is going, I agree on no. Would be a cool test, though. Will it crash? Kill the PSU? Or throttle?
Not if it's capped. Most likely it will trip OCP over and over.
 
Low res doesn't make your GPU work less, but possibly more. But as the advice is going, I agree on no. Would be a cool test, though. Will it crash? Kill the PSU? Or throttle?
I don't know where you're getting that nonsense from, but that's completely untrue.
 
Yeah, the problem lots of people had with the EVGA cards was the New World (IIRC) black screen, where the game let the FPS go completely uncapped.
That was my theory too: something in the output circuit couldn't handle the insane FPS spike.
 
It can if you let it rip to 1000 FPS...
Again, a GPU does not use more power at lower resolution. That's about as dumb as saying that if you increase the resolution, your CPU will use more power. Gee, do I really have to go home and make screenshots showing power usage being higher at 4K than at 1080p for any game you can name?
 
Again, a GPU does not use more power at lower resolution. That's about as dumb as saying that if you increase the resolution, your CPU will use more power. Gee, do I really have to go home and make screenshots showing power usage being higher at 4K than at 1080p for any game you can name?
OK, but base it on what I've said: 720p at 1000+ FPS vs. 4K at 60.
 
OK, but base it on what I've said: 720p at 1000+ FPS vs. 4K at 60.
Okay, but what I originally quoted was someone saying that the GPU does not work less at lower resolution, and if anything would work harder. That is the complete opposite of reality if all the other conditions are the same.
 
Okay, but what I originally quoted was someone saying that the GPU does not work less at lower resolution, and if anything would work harder. That is the complete opposite of reality if all the other conditions are the same.
It could if you let the FPS run away with no cap. If your issue is with what he said, quote him, I guess.
 
There is a common phrase used in these parts: "CPU or GPU limited?" When one is NOT bottlenecked by the other, it is most likely to reach the fullest of its performance.
And power draw may not be a proper meter for this, as the modern GPU is sophisticated and has so many controls.
 
Again, a GPU does not use more power at lower resolution. That's about as dumb as saying that if you increase the resolution, your CPU will use more power. Gee, do I really have to go home and make screenshots showing power usage being higher at 4K than at 1080p for any game you can name?
That would be great, as I think all power users would like to know that data. Thanks. There are so many variables (brand to brand, and drivers), so I wish you luck. Seriously looking forward to the data, though, as I don't recall anyone doing this, and I will gladly be proven wrong on my hypothesis.
 
I almost always recommend an 850W power supply these days. The sweet spot for efficiency is about 50% load anyway, and it gives you headroom if you upgrade or add on later, too.

If you undervolt the card, and maybe run the CPU at stock or undervolt it too, you could probably scrape by; it depends on the quality of the PSU you're using as well. But again, I would just upgrade anyway to an 850W 80 Plus PSU; the lower 80 Plus tiers are getting very cheap now.
 
I don't know the power needs of the Ti. I run my 3070, which is a 250-watt card in 3D mode, on the Corsair CX650M that Best Buy carries; they run sales on that unit a lot, and it was $59 when I bought my first one.

I have two of them in different systems, going on 3 years so far without any issues, with the other unit powering an RX 580 8GB in this small case:

https://www.newegg.com/red-raijintek-styx-series-micro-atx/p/2AM-002C-00030

It has an MSI B350M Gaming Pro board with a Ryzen 1600 AF and 32GB of memory. It's never had any heat issues in that case with 4 fans since 2019, and that RX 580 is a hot card by design.

Edit: I missed Pendragon 1's post. 750W is 100 more than what the 3070 needs.
I guess my question is: why do you want to trust your expensive system to a $60 PSU that is a low-end Corsair model, when for like $40-$50 more you can get a solid EVGA PSU?
 
I almost always recommend an 850W power supply these days. The sweet spot for efficiency is about 50% load anyway, and it gives you headroom if you upgrade or add on later, too.

If you undervolt the card, and maybe run the CPU at stock or undervolt it too, you could probably scrape by; it depends on the quality of the PSU you're using as well. But again, I would just upgrade anyway to an 850W 80 Plus PSU; the lower 80 Plus tiers are getting very cheap now.

Isn't the reason for the "80+" name that your PSU is more efficient at around 80% load?...
 
Isn't the reason for the "80+" name that your PSU is more efficient at around 80% load?...
Nope. The name 80 Plus came about because the goal was to increase PSU efficiency beyond 80%, meaning 20% or more was being wasted in the conversion, with heat as a byproduct:

It takes energy to convert the AC power from your outlet into the multiple DC voltages your computer needs (primarily 12V and 5V, along with 3.3V). Old PSUs were very inefficient and would waste a bunch of power. The rating starts with plain 80 Plus (white label) and then goes Bronze, Silver, Gold, Platinum, and Titanium as the best. Most of us are just fine with Gold or Silver; the price difference to go higher or lower doesn't make sense right now.

The 80 Plus rating requires meeting efficiency targets at 20%, 50%, and 100% load, and you get more efficiency running on 230V power instead of 115V, usually about 2% more.
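To put numbers on that: efficiency is DC watts delivered divided by AC watts drawn, so the wall draw for a given load is simply load / efficiency. A small sketch; the 85% figure is roughly the Bronze mid-load level at 115V, and the 450W load is just an example:

```python
# AC wall draw needed to deliver a given DC load at a given efficiency.
def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    """AC watts pulled from the outlet; the difference becomes heat."""
    return dc_load_w / efficiency

dc = 450                     # example system load, in DC watts
ac = wall_draw_w(dc, 0.85)   # ~85%: Bronze-ish at mid load on 115V
print(f"{ac:.0f} W from the wall, {ac - dc:.0f} W lost as heat")
```

So roughly 80W of the wall draw in this example never reaches the components at all, which is why the higher tiers pay off as both lower bills and less heat to exhaust.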
 
It depends on where the over-power protection (OPP) limit is on your PSU. I actually ran an RTX 3090, mined on it for a year on a 450W PSU, and played some games occasionally. But it was a model with a very high OPP, so the 3090 never ended up tripping it.
 
It depends on where the over-power protection (OPP) limit is on your PSU. I actually ran an RTX 3090, mined on it for a year on a 450W PSU, and played some games occasionally. But it was a model with a very high OPP, so the 3090 never ended up tripping it.

My 3090 was fine mining, but as soon as I played certain sections of AC: Valhalla, it would shut off (EVGA 850W Gold something or other). I ended up getting an RMA replacement and it was fine.
 
It depends on where the over-power protection (OPP) limit is on your PSU. I actually ran an RTX 3090, mined on it for a year on a 450W PSU, and played some games occasionally. But it was a model with a very high OPP, so the 3090 never ended up tripping it.
Mining doesn't really hit your computer hard. You shouldn't be pulling more than 250W while mining, and the rest of the system is really not doing much. You must not have been pushing the card much in games, by limiting the FPS, if you managed it on a 450W unit.
 
I guess my question is: why do you want to trust your expensive system to a $60 PSU that is a low-end Corsair model, when for like $40-$50 more you can get a solid EVGA PSU?
It's not a high-end system: a 3700X on X470, in an old case from when the supply mounted at the top. I modded it to flip the supply, and a hole was cut in the top of the case for a 120mm fan blowing directly on the supply. I don't think the factory fan on the supply has ever come on, so it stays cool even during desktop work, and the cooler you can keep a power supply, the more it can deliver. It has 2 fans when needed under load.

Also, it rocks dual rails, and you will need both to power, say, an RX 5700 that was flashed and overclocked 25%. I had already seen that card pulling 225-230 watts under load without ever giving a whiff of any issue. I cap my frames at 144 to match the display over HDMI, and the 3070 is no different; it's given me a great time coming back over to team green.

I do have an EVGA supply running a different RX 5700 (single 8-pin) with a 5600X on B550; it's been a great cheap 600-watt unit so far at $44.

This is what a 3070 looks like on PCI Express 3 with that unit: https://www.3dmark.com/fs/26803164
 
Thanks, lol. Yeah, unfortunately it's bottlenecked by my RAM and CPU; I plan to upgrade in a few months, hopefully. My RAM is clocked at 3200 MHz and my CPU is that Ryzen 2600X.
Maybe a little.
If you feel like playing with it, see if you can bump your RAM speed up: give it 1.4V, drop to C18/18/18/38-ish, and go up to 3600, or 3400 at C16/16/16/32.
 
EVGA B-Stock has 1000W 80 Plus Gold units for 80 bucks regularly, 'nuff said. Ping me if you need a code for an additional 10% off.
 