The supply chain problems were far worse than the crypto mining. It's just that crypto is a convenient punching bag for some.
What? Increased prices are detrimental to everyone on the consumer side, whether you were mining or not. The question wasn't whether mining overall was detrimental to you.

"Detrimental" is a word used by those who didn't make money with their GPU.
Honestly, staying at 1080P/1200P seems like a really good way to keep expectations reasonable and enjoy gaming without spending much money. I love how games look on my 4K panel when I really get the visuals dialed in, but gods, the amount of money I've poured into this rig and the amount of time I spend tuning graphics and upscaling to get things looking and performing well...

Paid $120 for my 6600 XT used (no mining) a couple of months ago, and sold my old 960XT 4GB for $40, the same as I paid for it used in 2018 or so. No complaints at all.
There have been huge advancements in CPU and GPU capability, along with overall inflation in everything else.
I buy/trade for value (and often used), not the latest and greatest top tier. That always works out best for me in the end, but I am a very occasional gamer (only running dual 1920x1200 monitors). To each their own.

If you can resist the hype train, appreciate what you have, and let the market come to you instead of chasing performance like a heroin addict, your wallet will remain fatter. For me this is a choice of personal discipline, not money; I can afford anything I want, but spending on other things is more important to me.
I think it's easy to forget that going from 1080p to 4K is not two times but four times more demanding in terms of raw pixel count, not to mention all the extra overhead on the memory subsystem to push all those pixels. Graphics cards haven't been delivering anywhere near the generation-over-generation improvement needed to match that increase in demand.
Occasionally I'll play something at 1080p on a secondary machine or my laptop and am astounded at how easy it is to get excellent performance at any settings without ridiculous hardware. It's almost enough to throw me into an existential crisis about my priorities, lol.
While it depends a bit on what we mean by that, games have become way more complex over time. So, just a precision to your point: if a game does twice as much work per pixel and you want four times the pixels, for some stuff you need eight times the hardware performance.
Is anyone besides me worried about another phase of that? Also, now there is talk of AI causing price increases.

All indications are that the impact on regular people's GPUs will, at most, be of the "companies not being aggressive on MSRPs and models" kind, with the release date of the next generation of gaming cards having to wait for the data-center Hopper replacement instead of coming at the same time or a bit before, not the "prices going above MSRP, cards hard to find new at the store" kind.
Yeah, I think the AI thing is fundamentally different from crypto mining re: market forces. With crypto, as long as coin prices were high, any GPU with more than 4GB of memory was fair game, and it was trivially easy to scale up by adding more GPUs. With AI there seems to be a much higher performance floor, and of course the VRAM requirements are much higher.
Unlike the previous time, regular demand did plummet, because PC demand has been trending down since the mid-00s, and also a lot of people bought in 2020-2022 and are set for a while. On top of that:

1) Regular GPUs seem a bigger step down for AI training than for crypto mining: the ability to pool their memory and the networking between them matter more for that workload, and the infrastructure and software around the GPU matter quite a bit; it is not just a case of stacking as many as you can.

2) Regular people's ability to make money from AI training or inference is much more complicated than simply letting the hashrate do its thing. Unlike crypto, demand comes from niche specialist, company, and academic users, often renting cloud time: "you want 24 A100s for a week, a couple of times" type of things, with only a single one for the prototyping before that.

AI maybe prevented prices, which had been coming down since early 2023, from falling further than they otherwise would have, and pushed the 5000 series to Q1 2025 instead of Q4 2024. At least that seems to be the direction, and we could already have passed the peak of that demand.
Think about it: if your current GPU does 1080p @ 60fps in whatever game, then going to 4K you'd need a GPU literally four times faster. Not a 30-40% increase, not even 100%, but a 300% performance increase to match the same fps!
Forget 8K gaming, just forget it.
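The arithmetic above is easy to sanity-check; this is pure pixel math, nothing vendor-specific:

```python
# Raw pixel counts: 4K is exactly 4x 1080p, and 8K is 16x.
def pixels(width, height):
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600 pixels
p4k = pixels(3840, 2160)    # 8,294,400 pixels
p8k = pixels(7680, 4320)    # 33,177,600 pixels

print(p4k / p1080)  # 4.0  -> the 300% jump mentioned above
print(p8k / p1080)  # 16.0 -> why 8K gaming is off the table
```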
Calculating a correction factor can be helpful to make these kinds of abstract comparisons work. Comparing Pascal to Turing FLOPS works well enough because the architectures are similar, same with Ampere to Ada, but not Turing to Ampere, because the architecture changed more significantly. Back in 2020 I correctly estimated the performance of the entire leaked Ampere and Navi 2 stacks based only on rumored shader counts and clocks, calibrated against "3070 = 2080 Ti" and "6900 XT = 3090". So estimating performance from theoretical compute throughput can be done, but it can also be misleading if that correction factor is ignored.
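A minimal sketch of that calibration idea, with made-up numbers (the function names and the 30/13.4 TFLOPS figures are illustrative, not actual card specs):

```python
# Anchor a new architecture against one known cross-architecture result,
# then scale the rest of the stack by paper FLOPS times that factor.

def correction_factor(new_tflops, old_tflops, observed_perf_ratio):
    """How much real performance one paper TFLOP of the new architecture
    buys, relative to the old one."""
    paper_ratio = new_tflops / old_tflops
    return observed_perf_ratio / paper_ratio

def estimate_perf_ratio(new_tflops, old_tflops, k):
    """Estimated real-world ratio of a new-arch card vs. an old-arch card."""
    return (new_tflops / old_tflops) * k

# Suppose a leaked 30-TFLOPS card benchmarks equal (1.0x) to a known
# 13.4-TFLOPS card: each new-arch paper TFLOP is "worth" ~0.45 old ones.
k = correction_factor(30.0, 13.4, 1.0)
print(round(k, 2))  # 0.45

# A hypothetical 36-TFLOPS sibling in the same stack then lands near 1.2x.
print(round(estimate_perf_ratio(36.0, 13.4, k), 2))  # 1.2
```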
A 4080 would probably play most Pascal-era games really well at 4K: it seems to have 4x or more the power of a 1080 in most things, and only about 2.3x the memory bandwidth, but the cache got much better.
FP32 TFLOPS (no single number is a really good representation, especially if the architecture changes, so this is just one of many):
980: 4.98 TFLOPS
1080: 8.87 TFLOPS
2080: 10.07 TFLOPS
3080: 29.77 TFLOPS
4080: 48.74 TFLOPS
In 4K benchmarks, the 4080 seems to be about 2.5-2.6x a 2080 and around 4x a 1080.
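Dividing out the numbers above (same caveat: these are paper FLOPS, so cross-architecture ratios overstate real gains):

```python
# FP32 TFLOPS figures from the list above, expressed relative to the 1080.
tflops = {"980": 4.98, "1080": 8.87, "2080": 10.07, "3080": 29.77, "4080": 48.74}

for card, t in tflops.items():
    print(f"{card}: {t / tflops['1080']:.2f}x the 1080 on paper")

# The 4080 is ~5.5x the 1080 in paper FLOPS but only ~4x in measured 4K
# performance: exactly the kind of gap a correction factor has to absorb.
```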
I have a question about the MP you mentioned in reference to cameras. Some cell phones list their cameras at 40MP and more; are they taking photos at higher than 8K? I ask because you put an exclamation mark next to the 33.2MP even though, to the layman, it wouldn't seem that large.
The market risk I see from AI isn't on the same scale as people buying up hundreds of midrange GPUs for their ETH sweatshop; it's Nvidia restricting the availability and specs of top-end cards to push "pro" users to the A-series, and AMD ignoring high-end gaming altogether to focus on midrange gaming and ultra-high-end datacenter.
Yep, the pixel count is important to keep in mind. Having a photography background, I think of it in terms of megapixels; that helps put things in perspective.
720P = 0.9MP
1080P/2K = 2.1MP
1440P/2.5K = 3.7MP
3K = ~4.7-4.9MP (not fully standardized)
2160P/4K = 8.3MP
2880P/5K = 14.7MP
4320P/8K = 33.2MP (!)
That's a lot of pixels added by going up in rez. That puts a higher load on shader cores, memory subsystem, ROPs, TMUs, RT cores, the whole thing.
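The list above falls straight out of width times height; a quick check (the ambiguous "3K" tier is skipped since it isn't standardized):

```python
# Reproduce the megapixel figures above from the pixel dimensions.
resolutions = {
    "720P": (1280, 720),
    "1080P/2K": (1920, 1080),
    "1440P/2.5K": (2560, 1440),
    "2160P/4K": (3840, 2160),
    "2880P/5K": (5120, 2880),
    "4320P/8K": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name} = {w * h / 1e6:.1f}MP")  # e.g. 4320P/8K = 33.2MP
```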
Afaik cellphones don't use most of those pixels, due to pixel binning. Phone camera sensors are too tiny to actually make use of that many pixels the way a normal camera does.
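The binning arithmetic itself is simple. A sketch, assuming the common 2x2 and 3x3 binning patterns rather than any specific phone's behavior:

```python
# Pixel binning: the sensor merges an n x n block of photosites into one
# output pixel, trading resolution for per-pixel light gathering.
def binned_megapixels(sensor_mp, bin_factor):
    return sensor_mp / bin_factor ** 2

print(binned_megapixels(48, 2))   # 12.0 -> a "48MP" sensor shooting 12MP
print(binned_megapixels(108, 3))  # 12.0 -> a "108MP" sensor, 3x3 binned
```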
Is that why, when I zoom in 20x on my Pixel 6 Pro, the pictures look like I drew them in watercolours? Lol
All phones are bad cameras rigged up with software to make them look passable in post.
I take it you are/were a crypto miner?

Part of the supply chain problem, perhaps? Because guys buying pallets of GPUs can put quite a dent in your chain of supply.
I hope you are right. The way it is going, AI looks like it will explode, and although I think that is bad, I am starting to include potential AI needs in my conditions for a GPU. I already want high VRAM (16GB minimum) and a somewhat wide bus; whether it would be half decent for AI too is more a bonus than a requirement.
But it sounds like AI needs higher VRAM, and it seems that Nvidia and AMD are both interested, so this shouldn't be a case of "Nvidia is better for this" or "AMD is..."; either can be.

During crypto, Nvidia cards were often sought out more than AMD cards, so their prices in particular were going insanely high.
Why worry about supposed AI capability in a card right now? Very realistically, both companies will keep the benefits of AI limited to their professional series of cards. Plus, you need to run something that actually needs it.

The problem is that R&D and manufacturing resources will be aimed toward AI development and deployment, so Nvidia will not be producing GPUs aimed at the desktop gaming segment, or at least not as many as in the past. AMD can step in and fill the void, but I don't trust them to advance the field; they don't have the R&D resources.
AMD cards had lower memory bandwidth on the newer generations, which is why people wanted the Nvidia ones more. It is also why many mining cards have serious issues with their memory chips: they have been run way past spec and were likely damaged by heat and voltage. In the early days of crypto, miners wanted the 290X; I was lucky to get one early in the launch. Damn good card.
I remember paying $600-$700 for an MSI 1080 Gaming X and thinking it was insane. Now, looking at 4090 prices, I would gladly have paid $800 back then.

I bought my EVGA 1080 Ti FTW3 for $675 in 2017 and sold it in 2021 for $650.