Remembering the disaster that was crypto mining and its detrimental effects on GPU prices.

Old-generation cards like Nvidia's 10xx and 20xx series have dropped a lot in price, since mining is now irrelevant to newcomers and the cards are gradually being left behind in drivers and game technologies.
 
Detrimental is a word used by those who didn't make money with their GPU.

That's OK!

I needed a Linus Sebastian to tell me how easy it is to make money before I believed it.
So I went into the crypto game very late and got out early.

I spent $900 on a 3080.
I made enough to incrementally buy four more.
Sold them and got a 3080 Ti for $2,000 and a 3070 Noctua for $1,500, with money to spare.
Sold both, and now I'm writing from a V3000+, a Z790 Apex with a 13900KS, and a 4090.

Just because I spent $900 a few years ago and listened to a YouTuber.

You see, the word detrimental doesn't always hit the mark, since I only spent $900 for all of it. That's how I see it.
 
Detrimental is a word used by those who didn't make money with their GPU.
What? Increased prices are detrimental to everyone on the consumer side, whether you were mining or not. The question wasn't whether mining, overall, was detrimental to you.
 
Yeah, but I still want a 4080 or higher because I'm at 4K. Anything below a 4080 is unattractive to me. I sold a 3080 Ti for $650, so that put anything that isn't a 4080 or 4090 into perspective. The 12GB of VRAM on the 3080 Ti wasn't enough for me. Only two choices, and their prices aren't the best; in fact, I feel they are still a few hundred dollars overpriced.
 
Paid $120 for my 6600XT used (no mining) a couple of months ago, sold my old 960XT 4GB for $40.. same as what I paid for it used in 2018 or so.. no complaints at all.

There have been huge advancements in CPU and GPU capability, along with overall inflation in everything else..

I buy / trade value (and often used), not the latest and greatest / top tier.. This always works out best in the end for me... but I am a very occasional gamer (only running dual 1920x1200 monitors).. to each their own.

If you can resist the hype train, appreciate what you have, and let the market come to you instead of chasing performance like a heroin addict, your wallet will remain fatter. This is a personal discipline choice for me, and not money.. I can afford anything I want, but spending on other things is more important to me.
 
My last GPU (hell, my last computer upgrade) was bought right before my kid was born about 8 years ago, that "last present for myself". It was a GTX 970, and I think the 10 series was already coming out, but that was back in the days when previous-gen cards were sold at massive discounts once the new generation arrived (anyone memba when that happened? Pepperidge Farm remembers!). Anyway, that 970 did perfectly fine for my gaming. Recently, a person on here was selling a 3080 pulled out of an Aorus external GPU enclosure for a good price, so I bit. The thing was even water-cooled, although it took a little extra re-plumbing to lengthen the pipes so the radiator could sit somewhere sensible in a reasonably sized case. I also needed to upgrade the whole system, since I was still rocking a 4790K (that's DDR3 RAM, people!), and there was what I considered a "good enough" discount on Ryzen 7k, so I upgraded that too, and hopefully it'll be relevant for another 8 years.
 
Paid $120 for my 6600XT used (no mining) a couple of months ago, sold my old 960XT 4GB for $40.. same as what I paid for it used in 2018 or so.. no complaints at all.

There have been huge advancements in CPU and GPU capability, along with overall inflation in everything else..

I buy / trade value (and often used), not the latest and greatest / top tier.. This always works out best in the end for me... but I am a very occasional gamer (only running dual 1920x1200 monitors).. to each their own.

If you can resist the hype train, appreciate what you have, and let the market come to you instead of chasing performance like a heroin addict, your wallet will remain fatter. This is a personal discipline choice for me, and not money.. I can afford anything I want, but spending on other things is more important to me.
Honestly, staying at 1080P/1200P seems like a really good way to keep reasonable expectations and enjoy gaming without spending much money. I love how games look on my 4K panel when I really get the visuals dialed in, but gods, the amount of money I've poured into this rig and the amount of time I spend tuning graphics and upscaling to get things looking and performing well...
Occasionally I'll play something at 1080P on a secondary machine or my laptop and I'm astounded at how easy it is to get excellent performance at any settings without ridiculous hardware. It's almost enough to throw me into an existential crisis about my priorities, lol
 
Honestly, staying at 1080P/1200P seems like a really good way to keep reasonable expectations and enjoy gaming without spending much money. I love how games look on my 4K panel when I really get the visuals dialed in, but gods, the amount of money I've poured into this rig and the amount of time I spend tuning graphics and upscaling to get things looking and performing well...
Occasionally I'll play something at 1080P on a secondary machine or my laptop and I'm astounded at how easy it is to get excellent performance at any settings without ridiculous hardware. It's almost enough to throw me into an existential crisis about my priorities, lol
I think it's easy to forget that going from 1080p to 4K is not 2 times but 4 times more demanding in terms of raw pixel count, not to mention all the extra overhead on the memory subsystem to push all those pixels. Graphics cards haven't been delivering anywhere near the generation-over-generation improvement to match that increase in demand.

Think about it: if your current GPU does 1080p @ 60fps in whatever game, then to go to 4K you'd need a GPU literally 4 times as fast. Not a 30-40% increase, not even 100%, but a 300% performance increase to match the same fps!

Forget 8K gaming, just forget it.
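
A quick sanity check of that pixel math, as a minimal Python sketch (standard 16:9 resolutions assumed):

# Pixel counts of common 16:9 resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.1f}x the pixels of 1080p")

# 4K works out to 4x the pixels of 1080p, and 8K to 16x.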
 
If you factor in inflation, or rather the diminishing buying power of the dollar, you might find GPUs are quite a bit cheaper than they were 5 years ago.
 
I think it's easy to forget that going from 1080p to 4K is not 2 times but 4 times more demanding in terms of raw pixel count, not to mention all the extra overhead on the memory subsystem to push all those pixels. Graphics cards haven't been delivering anywhere near the generation-over-generation improvement to match that increase in demand.
While it depends a little on what we mean by that, games have become way more complex over time (so maybe just a precision to your point: if a game does twice as much work per frame and you want 4 times the pixels, for some things you need 8 times the hardware performance).

A 4080 would probably play most games of the Pascal era at 4K really well; it seems to have 4x or more the power of a 1080 in most things, only about 2.3x the memory bandwidth, but the cache got much better.

FP32 TFLOPS (no single number is a really good representation, especially when the architecture changes, so this is just one of many):

980: 4.98 TFLOPS
1080: 8.87 TFLOPS
2080: 10.07 TFLOPS
3080: 29.77 TFLOPS
4080: 48.74 TFLOPS


In 4K benchmarks the 4080 seems to be about 2.5-2.6x a 2080 and around 4x a 1080.
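
To make the gap between raw FLOPS and the quoted 4K results explicit, here is a quick ratio check in Python (the TFLOPS figures are the ones listed above; the benchmark multipliers are the rough numbers from this thread, not new measurements):

# Theoretical FP32 throughput (TFLOPS) of the xx80-class cards listed above.
tflops = {"980": 4.98, "1080": 8.87, "2080": 10.07, "3080": 29.77, "4080": 48.74}

# Raw FP32 ratio of each card to the GTX 1080.
for card, tf in tflops.items():
    print(f"{card}: {tf / tflops['1080']:.2f}x the 1080 in raw FP32")

# On paper the 4080 is ~5.5x a 1080, but only ~4x in the quoted 4K benchmarks,
# so raw FLOPS alone overstate the real-world scaling.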
 
Is anyone besides me worried about another phase of that? Also, now there is talk of AI causing price increases.
I just sold my 3080 and am looking for another card (replacement/upgrade), and I have bad luck, too. I had an RX 580 years ago, just before the crypto thing went big, and that card wasn't the best for mining, but its value still went way up. So I am kind of scared to be in limbo here.
 
Is anyone besides me worried about another phase of that? Also, now there is talk of AI causing price increases.
All indications are that the impact on regular people's GPUs will be more of the "companies not being aggressive on MSRP and model lineups" kind, with the release date of the next generation of gaming cards having to wait for the data-center Hopper replacement instead of coming at the same time or a bit before, and not the "prices going above MSRP, cards hard to find new at the store" kind.

Unlike last time, regular demand did plummet, because PC demand has been trending down since the mid-'00s, but also because a lot of people bought in 2020-2022 and are set for a while, and:

1) Regular GPUs are a bigger step down for AI training than they were for crypto mining: pooling their memory and the networking between them matter more for that workload, and the infrastructure and software around the GPU count for quite a bit; it isn't just about stacking as many cards as you can.
2) A regular person's ability to make money from AI training or inference is much more complicated than simply letting the hashrate do its thing. Unlike crypto, demand comes from niche specialists, companies, and academics, often renting cloud time: "I want 24 A100s for a week, a couple of times" kind of thing, with only a single card for the prototyping beforehand.

AI has maybe kept prices, which had been falling since early 2023, from dropping further than they otherwise would have, and it pushed the 5000 series to Q1 2025 instead of Q4 2024. At least that seems to be the direction, and we may already have passed the peak of that demand.
 
All indications are that the impact on regular people's GPUs will be more of the "companies not being aggressive on MSRP and model lineups" kind, with the release date of the next generation of gaming cards having to wait for the data-center Hopper replacement instead of coming at the same time or a bit before, and not the "prices going above MSRP, cards hard to find new at the store" kind.

Unlike last time, regular demand did plummet, because PC demand has been trending down since the mid-'00s, but also because a lot of people bought in 2020-2022 and are set for a while, and:

1) Regular GPUs are a bigger step down for AI training than they were for crypto mining: pooling their memory and the networking between them matter more for that workload, and the infrastructure and software around the GPU count for quite a bit; it isn't just about stacking as many cards as you can.
2) A regular person's ability to make money from AI training or inference is much more complicated than simply letting the hashrate do its thing. Unlike crypto, demand comes from niche specialists, companies, and academics, often renting cloud time: "I want 24 A100s for a week, a couple of times" kind of thing, with only a single card for the prototyping beforehand.

AI has maybe kept prices, which had been falling since early 2023, from dropping further than they otherwise would have, and it pushed the 5000 series to Q1 2025 instead of Q4 2024. At least that seems to be the direction, and we may already have passed the peak of that demand.
Yeah, I think the AI thing is fundamentally different from cryptomining re: market forces. With crypto, as long as coin prices were high, any GPU with more than 4GB of memory was fair game, and it's trivially easy to scale up by adding more GPUs. With AI there seems to be a much higher performance floor, and of course the VRAM requirements are much higher.

The market risk I see from AI isn't on the same scale as people buying up hundreds of midrange GPUs for their ETH sweatshop; it's Nvidia restricting the availability/specs of top-end cards to push "Pro" users to the A-series, and AMD ignoring high-end gaming altogether to focus on midrange gaming and ultra-high-end datacenter.

I think it's easy to forget that going from 1080p to 4K is not 2 times but 4 times more demanding in terms of raw pixel count, not to mention all the extra overhead on the memory subsystem to push all those pixels. Graphics cards haven't been delivering anywhere near the generation-over-generation improvement to match that increase in demand.

Think about it: if your current GPU does 1080p @ 60fps in whatever game, then to go to 4K you'd need a GPU literally 4 times as fast. Not a 30-40% increase, not even 100%, but a 300% performance increase to match the same fps!

Forget 8K gaming, just forget it.
Yep, the pixel count is important to keep in mind. Having a photography background, I think of it in terms of megapixels; it helps put things in perspective.
720P = 0.9MP
1080P/2K = 2.1MP
1440P/2.5K = 3.7MP
3K = ~4.7-4.9MP (not fully standardized)
2160P/4K = 8.3MP
2880P/5K = 14.7MP
4320P/8K = 33.2MP(!)

That's a lot of pixels added by going up in rez. That puts a higher load on shader cores, memory subsystem, ROPs, TMUs, RT cores, the whole thing.

While it depends a little on what we mean by that, games have become way more complex over time (so maybe just a precision to your point: if a game does twice as much work per frame and you want 4 times the pixels, for some things you need 8 times the hardware performance).

A 4080 would probably play most games of the Pascal era at 4K really well; it seems to have 4x or more the power of a 1080 in most things, only about 2.3x the memory bandwidth, but the cache got much better.

FP32 TFLOPS (no single number is a really good representation, especially when the architecture changes, so this is just one of many):

980: 4.98 TFLOPS
1080: 8.87 TFLOPS
2080: 10.07 TFLOPS
3080: 29.77 TFLOPS
4080: 48.74 TFLOPS


In 4K benchmarks the 4080 seems to be about 2.5-2.6x a 2080 and around 4x a 1080.
Calculating a correction factor can be helpful to make these kinds of abstract comparisons work. Comparing Pascal to Turing FLOPS works well enough because the architectures are similar, same with Ampere to Ada, but not Turing to Ampere, because the architecture changed more significantly. Back in 2020 I correctly estimated the performance of the entire leaked Ampere and Navi2 stacks based only on rumored shader counts/clocks, calibrated against "3070 = 2080 Ti" and "6900XT = 3090", so estimating perf from theoretical compute throughput can be done, but it can also be misleading if that correction factor is ignored.
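
In rough sketch form, the idea is something like this (plain Python; the shader counts and boost clocks are the public reference specs, and the output is only an estimate, not a benchmark):

def fp32_tflops(shaders, boost_mhz):
    # Theoretical FP32 throughput: shaders x 2 ops per clock x clock speed.
    return shaders * 2 * boost_mhz * 1e6 / 1e12

turing_anchor = fp32_tflops(4352, 1545)   # RTX 2080 Ti
ampere_anchor = fp32_tflops(5888, 1725)   # RTX 3070, roughly a 2080 Ti in games

# Correction factor: what an "Ampere TFLOP" is worth in "Turing TFLOPs",
# calibrated on the single known cross-architecture match-up above.
correction = turing_anchor / ampere_anchor

# Reuse the factor to estimate another Ampere card against the 2080 Ti.
est_3080 = fp32_tflops(8704, 1710) * correction / turing_anchor
print(f"Estimated 3080 vs 2080 Ti: ~{est_3080:.2f}x (estimate, not a benchmark)")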
 
Yeah, I think the AI thing is fundamentally different from cryptomining re: market forces. With crypto, as long as coin prices were high, any GPU with more than 4GB of memory was fair game, and it's trivially easy to scale up by adding more GPUs. With AI there seems to be a much higher performance floor, and of course the VRAM requirements are much higher.

The market risk I see from AI isn't on the same scale as people buying up hundreds of midrange GPUs for their ETH sweatshop; it's Nvidia restricting the availability/specs of top-end cards to push "Pro" users to the A-series, and AMD ignoring high-end gaming altogether to focus on midrange gaming and ultra-high-end datacenter.


Yep, the pixel count is important to keep in mind. Having a photography background, I think of it in terms of megapixels; it helps put things in perspective.
720P = 0.9MP
1080P/2K = 2.1MP
1440P/2.5K = 3.7MP
3K = ~4.7-4.9MP (not fully standardized)
2160P/4K = 8.3MP
2880P/5K = 14.7MP
4320P/8K = 33.2MP(!)

That's a lot of pixels added by going up in rez. That puts a higher load on shader cores, memory subsystem, ROPs, TMUs, RT cores, the whole thing.


Calculating a correction factor can be helpful to make these kinds of abstract comparisons work. Comparing Pascal to Turing FLOPS works well enough because the architectures are similar, same with Ampere to Ada, but not Turing to Ampere, because the architecture changed more significantly. Back in 2020 I correctly estimated the performance of the entire leaked Ampere and Navi2 stacks based only on rumored shader counts/clocks, calibrated against "3070 = 2080 Ti" and "6900XT = 3090", so estimating perf from theoretical compute throughput can be done, but it can also be misleading if that correction factor is ignored.
I have a question about the MP figures you mentioned, in reference to cameras.

Some cell phones list their cameras at 40MP and more.. are they taking photos at higher than 8K? I ask because you put an exclamation mark next to the 33.2MP even though, to the layman, it wouldn't seem that large.
 
Some cell phones list their cameras at 40MP and more.. are they taking photos at higher than 8K? I ask because you put an exclamation mark next to the 33.2MP even though, to the layman, it wouldn't seem that large.
Afaik cellphones don't use most of the pixels due to pixel binning. Phone camera sensors are too tiny to actually use that many pixels like a normal camera.
 
Afaik cellphones don't use most of the pixels due to pixel binning. Phone camera sensors are too tiny to actually use that many pixels like a normal camera.
Is that why when I zoom in 20x on my Pixel 6 Pro the pictures look like I drew them in watercolours? Lol
 
I remember looking for an R9 290X during the 2014 mining craze. MSRP/SEP was $549, and I could not find one for less than $800. I was going to get an AMD video card for the first time, but cryptomining ruined it. I ended up with a pair of GTX 780 reference models purchased at MSRP instead.
 
Is that why when I zoom in 20x on my Pixel 6 Pro the pictures look like I drew them in watercolours? Lol

Optical vs digital zoom

Digital zoom is cheaper and looks like shit

Optical zoom actually magnifies the image you're seeing using the lens, digital zoom just stretches out the jpeg essentially
 
All phones are bad cameras rigged up with software to make them look passable in post.

Yeah, people send me amazing-looking phone pics, and then you zoom in just a little and... blotchy mush!

My 2013 Canon 6D...just keep zooming in and seeing the detail...
 
All indications are that the impact on regular people's GPUs will be more of the "companies not being aggressive on MSRP and model lineups" kind, with the release date of the next generation of gaming cards having to wait for the data-center Hopper replacement instead of coming at the same time or a bit before, and not the "prices going above MSRP, cards hard to find new at the store" kind.

Unlike last time, regular demand did plummet, because PC demand has been trending down since the mid-'00s, but also because a lot of people bought in 2020-2022 and are set for a while, and:

1) Regular GPUs are a bigger step down for AI training than they were for crypto mining: pooling their memory and the networking between them matter more for that workload, and the infrastructure and software around the GPU count for quite a bit; it isn't just about stacking as many cards as you can.
2) A regular person's ability to make money from AI training or inference is much more complicated than simply letting the hashrate do its thing. Unlike crypto, demand comes from niche specialists, companies, and academics, often renting cloud time: "I want 24 A100s for a week, a couple of times" kind of thing, with only a single card for the prototyping beforehand.

AI has maybe kept prices, which had been falling since early 2023, from dropping further than they otherwise would have, and it pushed the 5000 series to Q1 2025 instead of Q4 2024. At least that seems to be the direction, and we may already have passed the peak of that demand.
I hope you are right. The way it is going, AI looks like it will explode, and although I think that is bad, I am starting to include potential AI needs in my 'conditions' for a GPU: I already want high VRAM (16GB minimum) and a somewhat wide bus, and whether it would be half decent for AI is more of a 'bonus' than a requirement.

But it sounds like AI needs higher VRAM, and it seems that Nvidia and AMD are both interested, so this shouldn't be a 'Nvidia is better for this' or 'AMD is...' situation; either could be.

In crypto, Nvidia cards were often sought out more than AMD cards, so their prices in particular went insanely high.
 
I hope you are right. The way it is going, AI looks like it will explode, and although I think that is bad, I am starting to include potential AI needs in my 'conditions' for a GPU: I already want high VRAM (16GB minimum) and a somewhat wide bus, and whether it would be half decent for AI is more of a 'bonus' than a requirement.

But it sounds like AI needs higher VRAM, and it seems that Nvidia and AMD are both interested, so this shouldn't be a 'Nvidia is better for this' or 'AMD is...' situation; either could be.

In crypto, Nvidia cards were often sought out more than AMD cards, so their prices in particular went insanely high.

Why worry about supposed AI capability in a card right now? Very realistically, both companies will keep the benefits of AI limited to their professional series of cards. Plus, you need to run something that actually needs it.

AMD cards had lower memory bandwidth on the newer cards, which is why people wanted the Nvidia ones more. It's also why many mining cards have serious issues with their memory chips: they were run way past spec and likely damaged by heat and voltage. In the early days of crypto everyone wanted the 290X; I was lucky to get one early into the launch. Damn good card.
 
Why worry about supposed AI capability in a card right now? Very realistically, both companies will keep the benefits of AI limited to their professional series of cards. Plus, you need to run something that actually needs it.

AMD cards had lower memory bandwidth on the newer cards, which is why people wanted the Nvidia ones more. It's also why many mining cards have serious issues with their memory chips: they were run way past spec and likely damaged by heat and voltage. In the early days of crypto everyone wanted the 290X; I was lucky to get one early into the launch. Damn good card.
The problem is that R&D and manufacturing resources will be aimed toward AI development and deployment, so Nvidia will not be producing GPUs aimed at the desktop gaming segment, or at least not as many as in the past. AMD can step in and fill a void, but I don't trust them to advance the field; they don't have the R&D resources.
 
The problem is that R&D and manufacturing resources will be aimed toward AI development and deployment, so Nvidia will not be producing GPUs aimed at the desktop gaming segment, or at least not as many as in the past. AMD can step in and fill a void, but I don't trust them to advance the field; they don't have the R&D resources.

Even so, they don't have enough money, or money for enough capacity, to produce enough to fill the entire void.

They can just do what they can
 
I got into crypto when you could still mine Bitcoin on a GPU and create your own pool. Bitcoin was worth roughly $80. Those days felt like the Wild West compared to today. If only I could go back in time and just buy those extra blades I should have gotten.
 
I remember paying $600-$700 for an MSI 1080 Gaming X and thinking it was insane. Now, looking at 4090 prices, I would gladly pay $800 like back then.
 
I was able to acquire a bunch of RTX GPUs at retail prices during the mining craze.
[Attached photos: IMG_1627.JPEG, IMG_0411.JPEG]
 
I remember paying $600-$700 for an MSI 1080 Gaming X and thinking it was insane. Now, looking at 4090 prices, I would gladly pay $800 like back then.
I bought my EVGA 1080 Ti FTW3 for $675 in 2017 and sold it in 2021 for $650.
 