Best 'Budget' 4080 Card. Quiet & Cool. No manual OC.

ElevenFingers · Limp Gawd · Joined: May 30, 2008 · Messages: 189
Hi,

With the rather lackluster specs of the 4070 and 4070 Ti (VRAM specifically), I'm considering buying a 4080 instead. I may wait until AMD releases a few newer cards; however, the 7900 XT and XTX are not suitable due to their high power-consumption-to-performance ratios. I'm also considering a few other cards, but I want to limit this thread to discussion of 4080 models.

Electricity costs €0.50 per kWh in my area, and my desktop runs in a shared home office. For these reasons, silence and efficiency are key. I don't care what the card looks like, just how it performs.

In my area, the following cards are available for less than 1300 euros, the absolute most I'm willing to spend on a card. Links to the manufacturers' product pages are provided:

Inno3D GeForce RTX 4080 16GB X3
Inno3D GeForce RTX 4080 16GB X3 OC

Is it safe to assume the only difference between these two cards is a 30 MHz factory overclock for a 20 euro price increase? If so, I think I can rule out the OC.
I haven't found any professional reviews of these cards, only user reviews. Anyone have any additional insights?

PNY GeForce RTX 4080 16GB XLR8 Gaming VERTO EPIC-X RGB Triple Fan
PNY GeForce RTX 4080 16GB VERTO Triple Fan Edition

Is the only difference between these two the size and the RGB? I found reviews of the EPIC-X on TechPowerUp and PCGamer.
I saw it mentioned elsewhere that the cooling on the non-EPIC-X is inferior, but I haven't found much beyond that.
The EPIC-X is actually 10 euros cheaper, so does that make the non-EPIC-X a non-starter, or am I missing something?

MSI GeForce RTX 4080 16GB VENTUS 3X OC

This card seems to be fairly popular based on its ranking on PCPartPicker. Is there something that makes it stand out?

Gigabyte GeForce RTX 4080 16GB Windforce

I am slightly biased towards this card due to it having the same name as my favorite D2LoD item.
It seems to be the newest release, and I can't find much about it. It's considerably smaller than the Eagle, which would not fit in my case. Does anyone know more?
 
That electricity price is ouch, but your actual costs probably come out to around $10 a month for the PC; that's a wild-ass guess without knowing your average PC usage.

But if the electricity price is really a big concern, you can set a lower power limit and still get great performance out of the 4xxx cards. I benched a 4090 at a 70% power target; it went from 450 W to 370 W under full load from running a benchmark, and it only lost about 6% performance. I would think a 4080 would show similar behavior, and if you search YouTube you can probably find someone with a 4080 who has already done it and reported their results.
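If you'd rather script the power limit than drag a slider in a GUI tool like Afterburner, something like this rough Python sketch (just shelling out to nvidia-smi, which needs admin/root rights to actually change the limit) shows the idea. The 70% factor is just the target from my test above; the exact sweet spot will vary per card:

Code:
import subprocess

def query_gpu(fields, gpu=0):
    """Ask nvidia-smi for the given fields and return them as floats (values in watts)."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=" + ",".join(fields),
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return dict(zip(fields, (float(v) for v in out.split(","))))

vals = query_gpu(["power.default_limit", "power.draw"])
target = round(vals["power.default_limit"] * 0.70)  # 70% of the stock limit, as in my test

# Changing the limit requires admin/root rights; a GUI power-limit slider does
# the same thing without any scripting.
subprocess.run(["nvidia-smi", "-i", "0", "-pl", str(target)], check=True)
print(f"current draw: {vals['power.draw']} W, new limit: {target} W")

Note the limit set this way doesn't survive a reboot unless you run it again at startup, so treat it as an experiment rather than a permanent setting.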

Those two PNYs have identical specs, so just choose whichever you like best.
 
As usual, I made a quick spreadsheet to estimate costs. A delta of 200 watts is worth around 200 euros per year, assuming 4 hours of intensive use, 5 days per week, 52 weeks per year. This is on the high side of usage; in reality, my intensive usage might be half that.
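In case anyone wants to check the math or plug in their own rate, the underlying calculation boils down to a few lines. Here's a quick Python version; the inputs are illustrative, and the yearly figure swings a lot depending on how many hours of load you assume:

Code:
def annual_delta_cost(delta_watts, hours_per_day, days_per_week, eur_per_kwh, weeks_per_year=52):
    """Yearly cost (in EUR) of an extra `delta_watts` of draw during intensive use."""
    kwh_per_year = (delta_watts / 1000.0) * hours_per_day * days_per_week * weeks_per_year
    return kwh_per_year * eur_per_kwh

# A 200 W delta at 0.50 EUR/kWh, at a few different usage levels:
for hours in (2, 4, 8):
    print(f"{hours} h/day: ~{annual_delta_cost(200, hours, 5, 0.50):.0f} EUR/year")
# Prints roughly 52, 104 and 208 EUR/year respectively.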

Over the life of a card (3+ years minimum), I'm looking at 300-600 euros in savings when comparing a 4090 at 450 watts against a yet-to-be-released 250-watt 16 GB VRAM card (7800 XT or 7700 XT, maybe?).

For each card on the market that meets my minimum requirements, I calculate the total cost of ownership over 3 years (including estimated resale value) and then base my performance per euro on those estimates. Whichever is highest is the one I buy.
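For completeness, here's the shape of that comparison as a small Python sketch. The card numbers below are made up purely to show the method, not real prices, wattages, or benchmark scores:

Code:
EUR_PER_KWH = 0.50
HOURS_PER_YEAR = 4 * 5 * 52  # 4 h/day, 5 days/week, 52 weeks

def tco(purchase_eur, avg_watts, resale_eur, years=3):
    """Purchase price plus energy cost over `years`, minus estimated resale value."""
    energy_eur = (avg_watts / 1000.0) * HOURS_PER_YEAR * years * EUR_PER_KWH
    return purchase_eur + energy_eur - resale_eur

def perf_per_euro(perf_index, purchase_eur, avg_watts, resale_eur):
    return perf_index / tco(purchase_eur, avg_watts, resale_eur)

# Hypothetical big 320 W card vs. hypothetical small 250 W card:
print(perf_per_euro(100, 1300, 320, 550))
print(perf_per_euro(70, 650, 250, 250))
# Whichever prints the higher number wins on my metric.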

For this reason I went with the RX 6600 for now. It's cost-effective and quiet, and it will get me through the next year or so until affordable and efficient cards become available (Intel Battlemage, AMD, or a cheaper 4080, perhaps?).

So an answer to the OP is no longer needed, though I welcome a discussion.
 
I feel the pain; my electricity bill works out to 0.52 per unit and it's going up again soon. I ended up getting a solar array for my house since it's only going to increase from here. That being said, I run my 4090 at an 80% power target and it basically loses no performance while running much cooler and quieter. I would guess the 4080 could do so as well.
 
Yes, price/performance/efficiency with 16 GB+ of VRAM points to the 4080/4090.

The issue is they cost a quarter black-market kidney. I'm hoping things look better in 6 - 12 months, though I may eat my words if humanity decides to collectively burn a mid-sized country's annual energy budget producing crypto-AI-doge-GPT-coins using home-made expensive radiators running 24/7 in people's homes.

In the meantime the 6600 is performing great compared to my 970, so I'm a happy camper.
 
Yes, price/performance/efficiency with 16 GB+ of VRAM points to the 4080/4090.

The issue is they cost a quarter black-market kidney. I'm hoping things look better in 6 - 12 months, though I may eat my words if humanity decides to collectively burn a mid-sized country's annual energy budget producing crypto-AI-doge-GPT-coins using home-made expensive radiators running 24/7 in people's homes.

In the meantime the 6600 is performing great compared to my 970, so I'm a happy camper.

Yeah the 6600 is a good little card. I have an XFX model and it works great even at 1440p.
 
Does anyone know whether it's the card design or the drivers that causes the high power consumption on the RX 7900 series for multi-monitor and video playback? Its power consumption at other tasks is comparatively high next to the new Nvidia cards, but not noticeably bad; for those two tasks, though, it's strangely high. I know the multi-monitor power consumption has provoked discussion, but I'm wondering if it's been solved yet.
 
I'm hoping things look better in 6 - 12 months

Maybe for the 4080, but there's an almost 0% chance that the 4090 will drop in price; they just haven't had any need to do that with their last three generations of flagships, since there are more than enough people out there willing to pay the price. It's more likely that a Ti variant would come out to supplant the existing 4090, but don't hope for any discounts on existing stock.
 
Does anyone know whether it's the card design or the drivers that causes the high power consumption on the RX 7900 series for multi-monitor and video playback? Its power consumption at other tasks is comparatively high next to the new Nvidia cards, but not noticeably bad; for those two tasks, though, it's strangely high. I know the multi-monitor power consumption has provoked discussion, but I'm wondering if it's been solved yet.
It's not solved, it's part of their known issues in their driver release from today.
 
I hate the 4080 being the successor to the 3080, as it only offers 4 GB more RAM than the last model and is nearly twice the MSRP.
That said, my number one app is VR, and until AMD provides some reliable competition in this realm I am stuck waiting for a price drop as well.
Feels like paying $50 for a burger! Yeah, it is a fucking tasty burger to be sure, but my common sense says WTF! 16 GB for $1200, really?
Sigh...
 
I hate the 4080 being the successor to the 3080, as it only offers 4 GB more RAM than the last model and is nearly twice the MSRP.
That said, my number one app is VR, and until AMD provides some reliable competition in this realm I am stuck waiting for a price drop as well.
Feels like paying $50 for a burger! Yeah, it is a fucking tasty burger to be sure, but my common sense says WTF! 16 GB for $1200, really?
Sigh...
I am also a believer that AMD hit a rushed production wall, or had tech issues with the advancements they are trying to carry over in concept from the CPU division to the GPU division, hit a roadblock of sorts, and it affected performance. Although, oddly, the 7000-series CPU line is also pumping out high core temps even when the chip is otherwise fine.
Is this a design conflict between what TSMC promised and what was actually manufactured falling far short of the goal? Nvidia did right the time they had to go with Samsung: it reduced performance and drew more watts overall, but the GPUs ran stable and fine. Good choice, and they were still more than competitive.
Personally I think the 7000-series GPUs should have been a spring 2023 launch, but the powers that be saw that as a sign of weakness in the brand and pushed ahead with the advertised target release. And it fucked them. Some parallels to the 7000-series CPU launch, especially the 3Ds, I think.
 
I hate the 4080 being the successor to the 3080, as it only offers 4 GB more RAM than the last model and is nearly twice the MSRP.
That said, my number one app is VR, and until AMD provides some reliable competition in this realm I am stuck waiting for a price drop as well.
Feels like paying $50 for a burger! Yeah, it is a fucking tasty burger to be sure, but my common sense says WTF! 16 GB for $1200, really?
Sigh...
For my use case and similar ones where energy is a significant factor, the 4080's value proposition over its lifetime is actually in line with the rest of the lineup and better value than the 30 series. It's only when you exclude efficiency, or when your energy is negligibly cheap, that it stands out as a poor card for its price. I think this is why the 4070 and 4080 are more popular in Europe than in the US. It might also be why the 6700 non-X CPU is much more expensive and harder to get here than in the US. In Europe we're much more sensitive to energy cost relative to performance, not just purchase price relative to performance.

If parallel markets develop where the 4080 becomes significantly cheaper in the US over the next year, it would be one additional reason to visit the US. A few hundred bucks in savings covers a good portion of the flights.
 
Maybe for the 4080, but there's an almost 0% chance that the 4090 will drop in price; they just haven't had any need to do that with their last three generations of flagships, since there are more than enough people out there willing to pay the price. It's more likely that a Ti variant would come out to supplant the existing 4090, but don't hope for any discounts on existing stock.

The only way we’ll see discounts on existing stock is if Nvidia and AMD issue rebates to their distribution and retail chain, and based on the most recent strategy of “last year’s GPU is now this year’s lower tier, so pay more for this year’s”, I don’t see that happening unless there is an extreme oversupply issue, and even then I can see them simply delaying consumer releases and waiting it out for a bit at this point.

My hope is that we get a repeat of Ampere, where they realized Turing was overpriced due to a misguided attempt to sell it all to miners, but that remains to be seen. This looks like a potential repeat of that though.
 
For my use case and similar ones where energy is a significant factor, the 4080's value proposition over its lifetime is actually in line with the rest of the lineup and better value than the 30 series. It's only when you exclude efficiency, or when your energy is negligibly cheap, that it stands out as a poor card for its price. I think this is why the 4070 and 4080 are more popular in Europe than in the US. It might also be why the 6700 non-X CPU is much more expensive and harder to get here than in the US. In Europe we're much more sensitive to energy cost relative to performance, not just purchase price relative to performance.

If parallel markets develop where the 4080 becomes significantly cheaper in the US over the next year, it would be one additional reason to visit the US. A few hundred bucks in savings covers a good portion of the flights.
Even including higher energy costs, it’s a poor card for its price. It’s double the price of the card it’s supposed to be replacing.
 
I hate the 4080 being the successor to the 3080, as it only offers 4 GB more RAM than the last model and is nearly twice the MSRP.
That said, my number one app is VR, and until AMD provides some reliable competition in this realm I am stuck waiting for a price drop as well.
Feels like paying $50 for a burger! Yeah, it is a fucking tasty burger to be sure, but my common sense says WTF! 16 GB for $1200, really?
Sigh...
A used 3090 is how much $$ in your area market? I wouldn't even look at a new 30 series at this point in time. Yeah, there's some risk looking at used but there's so many out there on the used market, too. I really like the Asus Tuf series - very quiet and cool running card. The 3080 Tuf supposedly uses the 3090 cooler. A 3090 Ti is probably a bit more $$ - at least, here. They have 24gb - wouldn't they offer what you need?
 
It's not solved, it's part of their known issues in their driver release from today.
I guess that's why some buyers stick with Nvidia? AMD needs to get their **** together; it seems that every gen they have these kinds of issues, and they have a rep for problematic drivers. I suppose AMD is working on it.
 
FWIW, I have an MSI 4080 Ventus 3X OC and run it at 93% power limit on stock clocks. It's operated fine like this for months for both compute and gaming. It pulls < 300W, which is about a 15% reduction in power usage compared to the 3080 it replaced. This is in my upstairs office, so it's my daily driver, and it is noticeably cooler and quieter than the 3080 was. The 3080 was a great GPU in that I rarely heard it, and only noticed the heat when gaming for longer periods of time in the afternoon when my office is hottest. The 4080 is even better. Its only downside was its cost but I got it for just under $1,000 so it was an easier purchase for me to make.

I also just picked up a 4070 FE and have it running at 95% power limit on stock clocks. Again, works fine for both compute and gaming. It pulls < 200W, about a 20% reduction in power usage compared to the 3070 FE it replaced. Again, its only downside was its cost, but I paid $480 for it, and at that price, I think it was a great value.
 
I guess that's why some buyers stick with Nvidia? AMD needs to get their **** together; it seems that every gen they have these kinds of issues, and they have a rep for problematic drivers. I suppose AMD is working on it.
The 3000 series had idle power issues at launch as well; it took a few months to sort out. This is a pretty big architectural change, so I'm not surprised.
 
A used 3090 is how much $$ in your area market? I wouldn't even look at a new 30 series at this point in time. Yeah, there's some risk looking at used but there's so many out there on the used market, too. I really like the Asus Tuf series - very quiet and cool running card. The 3080 Tuf supposedly uses the 3090 cooler. A 3090 Ti is probably a bit more $$ - at least, here. They have 24gb - wouldn't they offer what you need?
Thanks for your reply Pavel. But that wasn't the point I was getting at.
 
That electricity price is ouch, but your actual costs probably come out to around $10 a month for the PC; that's a wild-ass guess without knowing your average PC usage.

But if the electricity price is really a big concern, you can set a lower power limit and still get great performance out of the 4xxx cards. I benched a 4090 at a 70% power target; it went from 450 W to 370 W under full load from running a benchmark, and it only lost about 6% performance. I would think a 4080 would show similar behavior, and if you search YouTube you can probably find someone with a 4080 who has already done it and reported their results.

Those two PNYs have identical specs, so just choose whichever you like best.
Yep, they're great cards to lower power limits on. Unlike Ampere, Lovelace cards are voltage limited, not power limited, so lowering the power usage a lot hardly loses anything performance-wise on the 4090. I imagine the 4080 behaves similarly.
 
Thanks for your reply Pavel. But that wasn't the point I was getting at.
I can guess the point you were making, but Nvidia has been criticized by gamers for the amount of VRAM their cards get; the jump from 12 GB to 16 GB is pretty typical? If you look at the next tier up (12 GB -> 16 GB, with the 4090 keeping 24 GB), it's a good bet that Nvidia's strategy is: if you want more VRAM, you need to go up a level toward the flagship card?
If you want high-VRAM choices for the GPU, you have to go with AMD: the 7900 XT has 20 GB and the XTX has 24 GB.

https://www.reddit.com/r/radeon/comments/12yuf4t/amd_chose_vram_over_rtai_cores_hindsight_great/

Many speculate that this is an intentional 'omission' of VRAM on Nvidia's part, 'to force you to buy a new card in x months/years time.' It sounds like a plausible theory to me. I guess it doesn't help gamers who want cards for VR (Nvidia is better for that right now?).
 
Does anyone know whether it's the card design or the drivers that causes the high power consumption on the RX 7900 series for multi-monitor and video playback? Its power consumption at other tasks is comparatively high next to the new Nvidia cards, but not noticeably bad; for those two tasks, though, it's strangely high. I know the multi-monitor power consumption has provoked discussion, but I'm wondering if it's been solved yet.
The 6000 series does not have this problem. I got a 6950 XT on sale and undervolted it; I went from a stock 325 W max to it now going up and down based on the game/load, usually around the 250 W mark. I think that's comparable to a 4070 Ti.
 
The 6000 series does not have this problem. I got a 6950 XT on sale and undervolted it; I went from a stock 325 W max to it now going up and down based on the game/load, usually around the 250 W mark. I think that's comparable to a 4070 Ti.
He's talking about multimonitor desktop wattage, not gaming. Also, a 4070ti can be undervolted for gaming too.
 
He's talking about multimonitor desktop wattage, not gaming. Also, a 4070ti can be undervolted for gaming too.
Yeah, but the 4070 Ti is $800 and the 6950 XT is $600 to $649.99 depending on the model. That could be a savings of $150-$200, for the same level of performance.
 
Yeah, but the 4070 Ti is $800 and the 6950 XT is $600 to $649.99 depending on the model. That could be a savings of $150-$200, for the same level of performance.
I've read that the 6950 XT is often recommended over the 4070 Ti UNLESS... you want RT and prefer/favor DLSS?
 
Yeah, I don't think the RT performance of the 4070 Ti is worth the extra money. I can run Cyberpunk 2077 at 1440p with all graphics settings maxed out and RT on ultra and get a constant 60 fps on the 6950 XT. I know the 4070 Ti might hit 80-90 fps, not sure, but that's the only game I can see getting the 4070 Ti over the 6950 XT for. I think the 4070 Ti is a good card; the main problem is that Nvidia is trying to charge $800 for a 12 GB VRAM card in May of 2023.
 
Yeah, I don't think the RT performance of the 4070 Ti is worth the extra money. I can run Cyberpunk 2077 at 1440p with all graphics settings maxed out and RT on ultra and get a constant 60 fps on the 6950 XT. I know the 4070 Ti might hit 80-90 fps, not sure, but that's the only game I can see getting the 4070 Ti over the 6950 XT for. I think the 4070 Ti is a good card; the main problem is that Nvidia is trying to charge $800 for a 12 GB VRAM card in May of 2023.
Have you heard the rumor of Nvidia releasing a 16 GB version of the 4070 Ti? It probably wouldn't be $800 USD, but they really need to release that; it would be a decent card. However, they're probably hesitant because what does that mean for their 'regular' 4070 Ti cards? Customers who bought one would feel 'swindled' or something? :)
I think a 16 GB 4070 Ti would be an 'okay' buy (meaning gritting your teeth and mumbling but still satisfied you have it), although it'd have to be priced well below a 4080. The 4060 should be 12 GB and the 4070 Ti should be 16 GB, and lots of people are saying this?

Edit #2: Nevermind. I think the rumor is 4070 Super? Or?

 
However, they're probably hesitant because what does that mean for their 'regular' 4070 Ti cards? Customers who bought one would feel 'swindled' or something? :)
I can promise you nvidia doesn't give a shit about how any of their customers feel. It's also part of the cost of basically any hobby but especially electronics: early adopters pay a premium. Always have, always will.
 
Asus Strix OC 4080, if you can get it at a good price
Yes, but I bet that card won't ever be in the 'budget' range anywhere. :) Perhaps when there's a large enough used market for the 4080? I will suggest the PNY card. Why? Because it seems to be the cheapest 4080 in the USA and Canada. Also, I read that Nvidia uses it for the workstation builds it ships to people? It seems to get pretty decent reviews, and there seem to be issues with almost every brand/model, or at least reports of some sort: coil whine, overheating, noise. The PNY has enough positive reviews that if I were on a budget, or could only afford the cheapest 4080, I'd probably get one.

Edit: (added links)
https://www.techpowerup.com/review/pny-geforce-rtx-4080-verto-epic-x-rgb-oc/

https://www.pcgamer.com/pny-geforce-rtx-4080-xlr8-verto-review-performance/

https://www.techradar.com/reviews/pny-geforce-rtx-4080-xlr8-oc

I'd get it, learn how to undervolt it, and undervolt it...
 
Used here is $1300-$1400 CAD, equivalent to about $1060 USD. Common used 4080s include the Asus Tuf and MSI Trio. Other cards don't have a transferable warranty; I probably don't want to take a chance on other brands solely for that reason? If you buy a dud, you're out of luck?
Supposedly these ASUS and MSI cards are relatively quiet, if they don't have coil whine?
I'm kinda saving up some $$ and I might think about selling my 3080 and getting one of these 4080s.
The PNY and Zotac are among the cheapest/most budget cards new, and coil whine complaints/reports are low, but they are super expensive compared to the used ones I just mentioned: $1500 plus tax (equivalent to about $1135 USD; bad deals, eh?).
In comparison, the Gigabyte Eagle and PNY Verto are $1150 each.
 
Pny for $1000 nib is not bad
Pretty good.
Here's a sample of prices in my country:
Zotac Twin Edge RTX 4070: $799 ($603 USD)
Zotac RTX 4070 Ti Trinity OC 12GB: $980 ($740 USD)
Gigabyte RTX 4080 16GB Eagle OC: $1430 ($1080 USD)
Note: most 4080s are starting at $1500 CAD ($1133 USD).
Here, tax adds $200, so a 4080 costs a minimum of $1700 total.

Hence why I'm watching the second-hand market; I've seen 4080s listed as low as $1350 CAD.
I'd probably have to sell my 3080. The 'budget' 4080s here, as you can see, are the Zotac and Gigabyte (Eagle) cards.
Although the second-hand cards so far have been the Asus Tuf, MSI Trio/Suprim (most expensive), Zotac, and MSI Ventus.

I was mostly watching for MSI and ASUS cards for the transferable warranty, although I guess I can ask whether the seller has the receipt?

We don't have Microcenters, but even at the stores that do open-box, the cards aren't discounted much.
 
Buildzoid did a PCB breakdown on a Zotac 4090 yesterday, and basically it's "a reference board/design maxed out". So it's better than reference, but not by much.
Yeah, it's probably fine and will work. It's just that some cards have better coolers on them.
 