How many 4080/4090 owners plan to upgrade to the respective 50 series card when available?

Median household income was $57K in 2016 in the USA. Using new Nvidia math (2x the cost), today's median income must be $114K, but alas, no, it's only about $78K.

I remember when the cost of technology came down over time, even for "advances": a top tier computer cost less than it used to, and so on. I really miss those days.

Looking forward to the "Nvidia days". But sadly, my salary is likely to roughly follow cost of living, i.e. inflation. Sometimes it might go up more for merit, but probably not 2x.



This has literally been the case for decades now. Wages have not kept up with the cost of living over time, and it's only going to get worse from the looks of it. So yeah, people can complain about GPUs being overpriced and expensive, but so is everything else these days.
 
I know this thread is about upgrading to a 5090 but my question is... where the hell are the 4090s? There's zero stock out there unless you're willing to pay scalper prices. :banghead:
I heard that China is actively buying up 4090s and then taking them apart to use for AI in data centers! It's pretty insane. How do consumers compete with an entire country?

Article:
https://www.datacenterdynamics.com/...

Maybe Nvidia needs to limit video cards per market segment so normal consumers actually stand a chance?
 
IMHO, this thread has proven that Nvidia could take the price to "infinity" and people here would have a "perfect justification" for buying it. Indeed, what a world we live in.

(I think I'll sit back and watch)
I don't like Nvidia's greed. But currently there is nobody making high end graphics cards to compete.

AMD has chosen to "settle" for 2nd place by pursuing profitability instead. They are going for the console market mainly.

Their desktop GPUs only aspire to compete with Nvidia's 2nd tier and they have decided not to pursue the performance crown at all.

Intel is a newcomer to the GPU race. Maybe they'll bring out something special for their next Arc generation but it feels for now like they are just 'happy to be here' and eking out a bit of mainstream share here and there.

We don't really have an alternative at the moment. I carry no special loyalty to Nvidia.

I've used their products before. The 1080 was probably my favorite video card I've used by them.

I skipped the 20 series and 30 series and got an AMD RX 6800 to get by, and it's a good little card.

But if I'm anticipating running anything at 4K 120 Hz+ or even 3440×1440 175 Hz+, I'm going to need to go with Nvidia for their next gen.

That probably means I have to pony up for a 5090 and then hope it lasts a good 3 years+ before I need to even consider an upgrade.
 
That probably means I have to pony up for a 5090 and then hope it lasts a good 3 years+ before I need to even consider an upgrade.
I also think the idea of complete replacement every 3 years is interesting....

Again, I'll keep a watch.... why not every 2 years? Every year?
 
But if I'm anticipating running anything at 4K 120 Hz+ or even 3440×1440 175 Hz+, I'm going to need to go with Nvidia for their next gen.

That probably means I have to pony up for a 5090 and then hope it lasts a good 3 years+ before I need to even consider an upgrade.
4K has demanded a max-dollar video card (or two) ever since those monitors came out; that's why it's such a tiny % of the market. Not only that, but you need a pretty big monitor, and to sit close to it, to even notice the difference from, say, a 1440p monitor. I ran 1080p for a long time before I finally upgraded my monitor to 1440p, once cards finally had the power to run it well without needing to be the top video card.
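
For anyone wondering how much heavier 4K actually is, here's a quick back-of-the-napkin Python sketch of raw pixel counts (illustrative only; real GPU load also depends on the game, settings, and any upscaling):

```python
# Raw pixel counts per frame for common monitor resolutions.
# Illustrative only: actual GPU load also depends on the game,
# settings, and upscaling (DLSS/FSR).

resolutions = {
    "1080p":     (1920, 1080),
    "1440p":     (2560, 1440),
    "3440x1440": (3440, 1440),
    "4K":        (3840, 2160),
}

base = resolutions["1440p"][0] * resolutions["1440p"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>10}: {pixels / 1e6:5.2f} MP ({pixels / base:.2f}x vs 1440p)")
```

4K pushes 2.25x the pixels of 1440p every frame, which is roughly why it has always demanded the top card.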
 
I also think the idea of complete replacement every 3 years is interesting....

Again, I'll keep a watch.... why not every 2 years? Every year?
Deciding to upgrade will vary depending on the user's own needs at the time.

As for WHEN to upgrade?

In the early days there was Moore's Law, which observed that transistor count, and thus computing power, would roughly double every 2 years.

Article: https://ourworldindata.org/moores-law

Add development time and process maturation to the recipe as well: manufacturers ramp up their factories to produce the product, then gain efficiency in price-to-production the more they sell and as their processes mature.

So one might theorize that it would actually take 3 years or more for a product to become most profitable for the manufacturer.

As for the buyer, the product rarely goes below MSRP (manufacturer's suggested retail price) until a new product comes out. So in theory, buying the second-to-latest generation is the most efficient bang for the buck: the prior generation gets price cuts when the newest one comes out.

However there's also a case to be made for buying the latest product and then SELLING your previous product on the second hand market to recoup your costs.

Thus upgrades become cheaper, because you are selling a still relatively attractive prior-generation product at the highest value a second-hand product can fetch.

Simple Math exercise:

Customer X buys a 3090 on release date September 24, 2020 for $1499.

He uses it for 2 years enjoying top of the line performance.

Customer X then buys a 4090 on release date October 12, 2022 for $1599.

He sells his second hand 3090 for $800 on the second hand market.

His upgrade cost for the 4090 is effectively halved or only $799.
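
For anyone who wants to plug in their own numbers, here's the same exercise as a small Python sketch (the $800 resale figure is the poster's assumption, not a quoted market price):

```python
# The "buy latest, sell previous" upgrade math from the exercise above.
msrp_3090 = 1499    # 3090 launch MSRP, September 24, 2020
msrp_4090 = 1599    # 4090 launch MSRP, October 12, 2022
resale_3090 = 800   # assumed second-hand sale price

effective_upgrade_cost = msrp_4090 - resale_3090
print(f"Effective 4090 upgrade cost: ${effective_upgrade_cost}")  # $799

# Net cost of the two years of 3090 ownership:
net_3090 = msrp_3090 - resale_3090
print(f"Net cost of owning the 3090: ${net_3090} (~${net_3090 / 2:.0f}/year)")
```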
 
AMD has chosen to "settle" for 2nd place by pursuing profitability instead. They are going for the console market mainly.

Their desktop GPUs only aspire to compete with Nvidia's 2nd tier and they have decided not to pursue the performance crown at all.
AMD shareholders view the PC enthusiast market as a dead end, or at least dying slowly. Therefore there's no need to compete in this space; their desktop cards are just beta-test devices for consoles, handhelds, and other gaming platforms where the real profit is to be made, not PC.
 
AMD shareholders view the PC enthusiast market as a dead end, or at least dying slowly. Therefore there's no need to compete in this space; their desktop cards are just beta-test devices for consoles, handhelds, and other gaming platforms where the real profit is to be made, not PC.
I literally already said that…

-quote-
AMD has chosen to "settle" for 2nd place by pursuing profitability instead. They are going for the console market mainly.
-unquote-

My point still stands that Nvidia stands alone in producing bleeding-edge, top-of-the-line discrete GPUs. Therefore enthusiast video card buyers literally have no other choice but to pick Nvidia for high-end solutions right now, which allows them to act as greedy as they want.
 
I don't like Nvidia's greed. But currently there is nobody making high end graphics cards to compete.

AMD has chosen to "settle" for 2nd place by pursuing profitability instead. They are going for the console market mainly.
If AMD competed with NVIDIA in the enthusiast tier then they would be pricing those cards close to what NVIDIA is charging. I know people like to say "NVIDIA bad," but you need to accept the reality of materials cost, especially when it comes to the silicon.
 
If AMD competed with NVIDIA in the enthusiast tier then they would be pricing those cards close to what NVIDIA is charging. I know people like to say "NVIDIA bad," but you need to accept the reality of materials cost, especially when it comes to the silicon.
Material cost is not that much higher, which is why profits have surged. Auto industry profits have surged for the same reason: they are charging way more and making much higher profits.
 
Material cost is not that much higher, which is why profits have surged. Auto industry profits have surged for the same reason: they are charging way more and making much higher profits.
4/3nm wafers from TSMC are 50% more expensive than 5nm. 4/3nm wafers are 100% more expensive than the Samsung 8nm node used in Ampere. TSMC 2nm is expected to be 33% more expensive than 4/3nm.
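
Chaining those percentages together (taking them at face value; the Samsung 8nm base price below is a placeholder, since actual contract prices aren't public):

```python
# Relative wafer prices implied by the ratios above.
# Only the ratios come from the post; the $8,000 base is an assumption.

samsung_8nm = 8_000              # hypothetical base price per wafer
tsmc_4nm = samsung_8nm * 2.00    # "100% more expensive than Samsung 8nm"
tsmc_5nm = tsmc_4nm / 1.50       # 4/3nm is "50% more expensive than 5nm"
tsmc_2nm = tsmc_4nm * 1.33       # "expected to be 33% more expensive"

for node, price in [("Samsung 8nm (Ampere)", samsung_8nm),
                    ("TSMC 5nm", tsmc_5nm),
                    ("TSMC 4/3nm (Ada)", tsmc_4nm),
                    ("TSMC 2nm (projected)", tsmc_2nm)]:
    print(f"{node:25s} ~${price:,.0f}/wafer")
```

Whatever the real base price is, those ratios mean wafer cost has roughly doubled in two generations.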
 
nV couldn't give three shits what we think. They're probably like, "Keep on bitching and we'll raise prices even more. Who cares if you can't afford it? We'll do what we want."
 
Buying quadros next time. Moved from PC gaming to consoles and man what an improvement. The quadros are better for what makes me money anyways.
 
The last Quadro was in 2018; I doubt they will be better by then, if the drivers are even still being updated (Kepler Quadro support stopped in 2022, I think).

Do you mean the RTX 6000 Ada / L40 stuff that replaced them?

Where are you seeing improvement?
CAD software suites would be a common case:
https://techgage.com/article/nvidia-quadro-rtx-4000-review/4/
https://www.servethehome.com/nvidia-rtx-6000-ada-graphics-card-review-pny/4/

[Charts: viewport performance for the NVIDIA Quadro RTX 4000, and SPECviewperf 2020 SolidWorks and Siemens NX results for the NVIDIA RTX 6000 Ada]
 
The 5000 series will be interesting. I suspect the main question will change from "how many fps will I get in Crysis/Avatar/etc" to "how many tps will I get from Mixtral-8x7B".
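
If that becomes the question, a rough ceiling is easy to sketch: single-GPU decode is mostly memory-bandwidth-bound, so tokens/sec tops out around bandwidth divided by bytes read per token. A napkin-math sketch in Python (Mixtral-8x7B activates roughly 13B of its ~47B parameters per token; everything else here is a simplifying assumption):

```python
# Bandwidth-bound ceiling for tokens/sec with a MoE model like Mixtral-8x7B.
# ~13B params are touched per token; at 4-bit that's ~6.5 GB of weight
# reads per generated token (ignores KV cache, activations, and kernel
# overheads; note the full ~24 GB of 4-bit weights barely fits in 24 GB VRAM).

active_params = 13e9
bytes_per_param = 0.5  # 4-bit quantization
bytes_per_token = active_params * bytes_per_param

gpus = {
    "RTX 3090 (936 GB/s)":  936e9,
    "RTX 4090 (1008 GB/s)": 1008e9,
}

for name, bandwidth in gpus.items():
    print(f"{name}: ~{bandwidth / bytes_per_token:.0f} tokens/s ceiling")
```

So a big VRAM and bandwidth bump on the 5090 would move that ceiling directly.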
 
The 5000 series will be interesting. I suspect the main question will change from "how many fps will I get in Crysis/Avatar/etc" to "how many tps will I get from Mixtral-8x7B".

I'm expecting an even bigger leap in the newest games, since it looks like they are going to require RT, and Nvidia should be making another solid leap in RT performance with the 5000 series. Perhaps I'm wrong, but I think Avatar uses some kind of RTGI that you can't disable? I know with Spider-Man 2 the devs have RT enabled for every single mode on PS5, so when it comes to PC, RT may always be on as well. If the newest games are always going to be using RT, then the 5000 series should show a bigger leap in those titles.
 
The last Quadro was in 2018; I doubt they will be better by then, if the drivers are even still being updated (Kepler Quadro support stopped in 2022, I think).

Do you mean the RTX 6000 Ada / L40 stuff that replaced them?


CAD software suites would be a common case:
https://techgage.com/article/nvidia-quadro-rtx-4000-review/4/
https://www.servethehome.com/nvidia-rtx-6000-ada-graphics-card-review-pny/4/
I was referring to his comment about the consoles being such an improvement, not RTX vs. Quadro.
 
I was planning on upgrading from my 6900 to either a 5090 or AMD's competitive RDNA4 equivalent. But now that the rumors that RDNA4 will have no high-end chip have become near fact, NVidia will almost certainly overcharge further while sandbagging performance. Looks like I'll be waiting for RDNA5 or the 6090.

4/3nm wafers from TSMC are 50% more expensive than 5nm. 4/3nm wafers are 100% more expensive than the Samsung 8nm node used in Ampere. TSMC 2nm is expected to be 33% more expensive than 4/3nm.

And yet despite that NVidia seems to have much fatter margins. The extra money charged over and above increased die cost is going somewhere.
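
Rough numbers on that: a standard dies-per-wafer estimate with the commonly cited ~608 mm² AD102 die suggests the bare silicon is a modest slice of the $1,599 MSRP. (Both the wafer price and the yield below are assumptions, since TSMC's actual pricing isn't public.)

```python
import math

# Dies-per-wafer estimate for AD102 (the 4090's GPU, ~608 mm^2) on a
# 300 mm wafer. Wafer price and yield are assumptions, not public figures.

wafer_diameter = 300   # mm
die_area = 608         # mm^2
wafer_price = 16_000   # assumed $/wafer for a 5nm-class node
yield_rate = 0.7       # assumed fraction of usable dies

radius = wafer_diameter / 2
# Common approximation: wafer area over die area, minus an edge-loss term.
dies = (math.pi * radius**2 / die_area
        - math.pi * wafer_diameter / math.sqrt(2 * die_area))
good_dies = dies * yield_rate

print(f"~{dies:.0f} candidate dies, ~{good_dies:.0f} good")
print(f"Silicon cost per good die: ~${wafer_price / good_dies:,.0f}")
```

Under those assumptions the die comes out around $250; memory, board, cooler, and R&D add plenty on top, but that still leaves a lot of room under $1,599.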
 
I was planning on upgrading from my 6900 to either a 5090 or AMD's competitive RDNA4 equivalent. But now that the rumors that RDNA4 will have no high-end chip have become near fact, NVidia will almost certainly overcharge further while sandbagging performance. Looks like I'll be waiting for RDNA5 or the 6090.

And yet despite that NVidia seems to have much fatter margins. The extra money charged over and above increased die cost is going somewhere.

Rumors are nothing more than that, rumors. Rumors also said RDNA3 would be 3x faster than RDNA2.
 
Rumors are nothing more than that, rumors. Rumors also said RDNA3 would be 3x faster than RDNA2.
Well, typically when there's a rumor that a Radeon card will have higher than expected performance, it's usually false; rumors of lower than expected performance are typically true. Then there's the very real fact that AMD has stated it's pursuing mainstream rather than flagship-performance GPUs for the foreseeable future.
 
Well, typically when there's a rumor that a Radeon card will have higher than expected performance, it's usually false; rumors of lower than expected performance are typically true. Then there's the very real fact that AMD has stated it's pursuing mainstream rather than flagship-performance GPUs for the foreseeable future.

Where did they explicitly state that they were only pursuing mainstream performance? Not pursuing flagship is kinda sorta what they already did this gen by only competing with a 4080 at the very top and leaving the 4090 uncontested so I can see them doing it again next gen where they end up only competing with up to a 5080. But that still covers just about everyone except for the ultra enthusiast.
 
Where did they explicitly state that they were only pursuing mainstream performance? Not pursuing flagship is kinda sorta what they already did this gen by only competing with a 4080 at the very top and leaving the 4090 uncontested so I can see them doing it again next gen where they end up only competing with up to a 5080. But that still covers just about everyone except for the ultra enthusiast.
Tell you what. If they create a 5090 equivalent, I'll buy you that card. If they don't, you buy me a 5090. Let's see if your faith in AMD is as real as you suggest or just BS.
 
Tell you what. If they create a 5090 equivalent, I'll buy you that card. If they don't, you buy me a 5090. Let's see if your faith in AMD is as real as you suggest or just BS.
Maybe I do not follow, but you quoted someone who just predicted that AMD's 2024 top end will not go past 5080 performance; they did not sound confident AMD will have a 5090 equivalent.
 
Tell you what. If they create a 5090 equivalent, I'll buy you that card. If they don't, you buy me a 5090. Let's see if your faith in AMD is as real as you suggest or just BS.

Wut? I don't expect AMD to make a 5090 competitor seeing as they didn't even bother competing with the 4090 this gen. All I'm suggesting is that people need to stop taking rumors as the gospel about AMD only sticking to mid range and somehow that's BS or some AMD mega faith? Ooooooooookay.
 
Maybe I do not follow, but you quoted someone who just predicted that AMD's 2024 top end will not go past 5080 performance; they did not sound confident AMD will have a 5090 equivalent.

My point exactly lol. I clearly said I can see them not competing with a 5090 at all yet he's accusing me of having some ultra faith in AMD or something and making a 5090 competitor.
 
My point exactly lol. I clearly said I can see them not competing with a 5090 at all yet he's accusing me of having some ultra faith in AMD or something and making a 5090 competitor.
So you’re saying the same thing as everyone else yet you’re quoting everyone saying it’s just rumor and “when did AMD say this” as if to cast doubt. Makes a lot of sense.
 
So you’re saying the same thing as everyone else yet you’re quoting everyone saying it’s just rumor and “when did AMD say this” as if to cast doubt. Makes a lot of sense.

Uhh no everyone is saying AMD will only compete in the mainstream while I'm saying maybe they can compete with a 5080. So a 5080 is mainstream? And yeah when did AMD say they were only going to compete in the mainstream? You yourself listed it as "very real FACTS" that they said it. So go ahead and show me the facts. I'm waiting.

Well, typically when there's a rumor that a Radeon card will have higher than expected performance, it's usually false; rumors of lower than expected performance are typically true. Then there's the very real fact that AMD has stated it's pursuing mainstream rather than flagship-performance GPUs for the foreseeable future.
 
Uhh no everyone is saying AMD will only compete in the mainstream while I'm saying maybe they can compete with a 5080. So a 5080 is mainstream? And yeah when did AMD say they were only going to compete in the mainstream? You yourself listed it as "very real FACTS" that they said it. So go ahead and show me the facts. I'm waiting.
You're right, they didn't say it. They just cancelled their high-end SKU. While you're waiting, wait for their releases, and I'll be back to say I told you so.
 
You're right, they didn't say it. They just cancelled their high-end SKU. While you're waiting, wait for their releases, and I'll be back to say I told you so.

Whatever makes you happy. I'll be getting a 5090 anyway so whatever AMD ends up making won't affect any of my purchasing decisions. I just personally think they can at least do better than mainstream GPUs like a 5060 Ti, but I definitely believe they won't be touching a 5090.
 