Is future-proofing real?

Does future-proofing a computer purchase by getting the best of the best at the time of purchase actually mean the computer will last longer without upgrades? For example, back in 2011/2012 when I built the PC in my sig, I was deciding between the 2500K, 2600K, and 3930K. I picked the 3930K, and it is still going these days; the only reason I got it over the others mentioned is that I thought it would last longer. Had I picked the 2500K or 2600K, would they still be doing as well as the 3930K these days? I did, however, continue to upgrade the GPU over time with every generation, but stopped with the GTX 1080 Ti.

So, is future-proofing real or is it just better to get something that aligns with your requirements regardless of how long you plan to keep it?
 
It's real in some cases. My 2600K was still viable a couple of years longer than a 2500K simply from the hyperthreading. It was time to move on for me once the Ryzen 3900X came along, though.
 
Does future-proofing a computer purchase by getting the best of the best at the time of purchase actually mean the computer will last longer without upgrades? For example, back in 2011/2012 when I built the PC in my sig, I was deciding between the 2500K, 2600K, and 3930K. I picked the 3930K, and it is still going these days; the only reason I got it over the others mentioned is that I thought it would last longer. Had I picked the 2500K or 2600K, would they still be doing as well as the 3930K these days? I did, however, continue to upgrade the GPU over time with every generation, but stopped with the GTX 1080 Ti.

So, is future-proofing real or is it just better to get something that aligns with your requirements regardless of how long you plan to keep it?
Depends on requirements. MS Office, web, etc.? Yeah, you're fine. Latest AAA title and VR? No.
You are asking a sloppy question, and you have already outlined your answer. Of course it is a sliding scale of dollars spent versus longevity, but the sweet spot is often hard to pick, and new tech can throw it all out the window depending on use case.
It also doesn't help that tech is a passion/hobby for so many people that they will spend silly money to maintain their "tech supremacy" (e-peen) and live vicariously as an imaginary tech god in place of a proper life.
 
I find it is always appropriate to choose the fastest CPU available for a given socket.

Having said that, most of the computer retirements I've done were because of a lack of RAM expansion options (unregistered RAM with only 4 DIMM slots).
 
I am running an AMD X570 board, a 5900X CPU, a 3070, a 1000 W power supply, and 64 GB of RAM. With the exception of the video card, I can see getting another two years out of this rig. I upgrade based on price drops for the top-of-the-line previous generation. Looking forward to a 3090 Ti next year.
 
I am running an AMD X570 board, a 5900X CPU, a 3070, a 1000 W power supply, and 64 GB of RAM. With the exception of the video card, I can see getting another two years out of this rig. I upgrade based on price drops for the top-of-the-line previous generation. Looking forward to a 3090 Ti next year.
Why do you want a 3090ti next year? Surely there will be better cards available then, assuming continued technological progress.
 
Why do you want a 3090ti next year? Surely there will be better cards available then, assuming continued technological progress.
Yeah, there are better cards available now, but I may be able to afford a used 3090 Ti by next year. Per the original topic, I find that many times an older top-of-the-line unit has better build quality and is far less expensive than the newest latest-and-greatest; not faster, but it still makes a good buy.
 
There's never really been anything truly future-proof over the decades of computing that I've witnessed; the most you can hope for is not unwittingly buying in at the wrong time, just before a major technology gets released, and buying into a platform with unusually long legs.

Case in point: my 4770K/Z87 build. I put it together because I needed better single-threaded performance than my aging Q6600 build could offer at the time, but little did I know that just one year later, this little thing called NVMe would start shaking up the SSD market and become the ideal choice for boot drives within a few years. Z87 boards can't boot from NVMe without injecting the appropriate boot modules into the UEFI and flashing the result to your motherboard, and even then, it's a bit janky.

As for the platform thing, let's compare two platforms for the same architecture: AM4, and sTR4 (really just workstation SP3). Both started on the original Zen microarchitecture that finally gave AMD another fighting chance against Intel after that whole Bulldozer/FX letdown.

Even though AMD didn't want to enable it originally, even the oldest AM4 boards can, theoretically, accept a Ryzen 5800X3D for massive performance gains over any Ryzen 1000 series, no new motherboard needed - just make sure you update your UEFI first while you've got the old CPU in there.

sTR4/X399 systems have no such luck, though. Threadripper was left hanging, with 1950X owners left to upgrade to maybe a 2990WX, and that's just Zen+, quite a bit short of Zen 3 in terms of single-threaded performance. At least you won't have to be left hurting for sheer PCIe lane count, I suppose.

Nobody could have predicted that disparity in platform support; if anything, AM4 is an anomaly given how Intel was conditioning everyone to replace their motherboards every two years by that time.

On the GPU side of things, it used to be that you couldn't even go two years without some major DirectX or OpenGL revision showing up that added new features that new games were eager to take advantage of, and sometimes even make part of their minimum requirements. Didn't take long for pixel/vertex shader support to become mandatory, then SM 3.0 (good for GeForce 6800 owners, not so good for Radeon X800 owners), then unified shaders in DX10/11, and now we're seeing this big push for real-time raytracing and even path tracing.

This isn't even taking into account new port standards; HDMI 2.1 and DisplayPort 2.0 are such massive leaps over prior versions that they're becoming a must if you want 4K 120 Hz with HDR or better.
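To put some rough numbers behind that, here is a small back-of-the-envelope Python sketch. The effective link rates are approximate published figures, and the pixel math ignores blanking overhead, so treat it as an illustration of why the newer ports matter rather than a spec reference.

```python
# Rough link-bandwidth check for 4K 120 Hz with 10-bit (HDR) color.
# The pixel math ignores blanking intervals, so real requirements are a bit higher.

def pixel_data_rate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Uncompressed video data rate in Gbit/s (pixels only, no blanking)."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

needed = pixel_data_rate_gbps(3840, 2160, 120, 10)   # ~29.9 Gbit/s minimum

# Approximate effective payload rates after line coding (assumed round figures):
links = {
    "HDMI 2.0 (18 Gbit/s TMDS, 8b/10b)":    14.4,
    "DisplayPort 1.4 (HBR3, 8b/10b)":       25.9,
    "HDMI 2.1 (48 Gbit/s FRL, 16b/18b)":    42.7,
    "DisplayPort 2.0 (UHBR20, 128b/132b)":  77.4,
}

print(f"4K 120 Hz 10-bit RGB needs at least {needed:.1f} Gbit/s (before blanking)")
for name, capacity in links.items():
    verdict = "OK" if capacity >= needed else "needs DSC or chroma subsampling"
    print(f"  {name}: {capacity:.1f} Gbit/s -> {verdict}")
```

The shape of the result is the point: the older links fall short of uncompressed 4K 120 Hz HDR, which is why HDMI 2.1 and DisplayPort 2.0 keep showing up in high-refresh 4K requirements.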

Despite all of this advancement, I will say that unlike the '80s and '90s, it is perfectly possible to get by with a decade-old computer for casual Web browsing and Office documents these days, thanks to the advent of multiple cores, generally more computing resources than anyone knows what to do with, and the proliferation of SSDs. That holds at least until all the modern Web bloat starts choking out older CPUs and platforms with limited RAM.
 
Depends what you mean by "future proof".

You can buy something that will let you play pretty much anything in the next couple of years. It's not going to be at the highest settings or frame rates, because there are always games targeting the current best hardware, or even future best hardware, for their highest settings.

If you buy the current highest end hardware you'll probably get a couple more years of being able to play games over buying mid range hardware, but you're not getting any extra years of highest end settings.

It's never a good deal. If you spend $1,000 on a CPU instead of $250, you could have just spent the $250, saved the difference, and put another $250 toward a better CPU five years later.
So IMO no, you can't "future proof".
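To illustrate the arithmetic behind that argument, here is a minimal Python sketch. The prices and relative-performance numbers are made-up placeholders, not benchmarks, and the assumption that a later mid-range part overtakes the old flagship is exactly the gamble being debated in this thread.

```python
# Toy comparison: one $1000 flagship CPU at year 0 versus a $250 mid-range
# CPU at year 0 plus another $250 mid-range CPU at year 5.
# Every number here is a hypothetical placeholder for illustration only.

def flagship_once(year):
    """Relative performance of the $1000 flagship bought at year 0."""
    return 100  # it never gets faster, it only falls behind

def midrange_with_refresh(year):
    """Relative performance of the $250 part, replaced at year 5."""
    # Assumes a later mid-range generation lands above the old flagship,
    # which is the optimistic (but historically common) scenario.
    return 60 if year < 5 else 130

total_flagship = 1000
total_midrange = 250 + 250

for year in range(10):
    print(f"Year {year}: flagship {flagship_once(year):>3} "
          f"vs mid-range w/ refresh {midrange_with_refresh(year):>3}")

print(f"Total spent: flagship ${total_flagship} vs mid-range ${total_midrange}")
```

The specific numbers don't matter; the shape of the trade does. The up-front flagship buys a few years of headroom, while the staggered spend buys a second bite at a newer architecture for the same or less money.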
 
I think the last post shows two different things.

Can you future-proof at all, and was it financially worth it to spend the money up front versus spending less twice, with a refresh later on? That second question is what complicates things.

Imagine buying a 3900X to future-proof for gaming, thinking more than 8 cores would become important soon; it ends up getting beaten by a cheap 5600X that costs less than the price difference between a 3600X and a 3900X. Likewise, the superb high-end motherboard bought for extra longevity can end up costing more than the cheaper option plus a cheaper-but-better refresh later on.

It also depends on how much of a chore (or how much fun) doing the upgrade is; not having to do it at all can be worth the extra money to some.

GPUs tend to be a different story. I am not sure there was ever a "too strong for now but worth it for the future" GPU; games always push them right now. If someone had bought a 3090 thinking "there's very little difference versus a 3080 now, but one day there will be," that is an obvious case of it not being worth it versus a 6800 XT/3080 plus a refresh later (say, selling it for an XTX).

A computer can always last longer the more you spend on the purchase; that's not really the interesting question. Whether it was worth it is where there is something to think about. Certainly any CPU weaker than a 3930K would not be doing as well, which is trivially true, but using the money saved by buying a 2600K instead to upgrade to a 6700K years later would have beaten that 3930K by a lot.

There's no rule, but "it needs to already be better now" rarely tends to be a bad way to go. Anyone who spent a lot on drive performance, expecting it to matter a lot for games or DirectStorage (or worse, also bought an expensive motherboard to support Gen 4), maybe saved themselves a drive-cloning step, but they could get a much better drive at a much better price now, and it was never a big deal (and has barely started to matter even today).
 
No one can predict the future. The best you can do is gamble/guess based on what is currently known.

Prior to my RTX 4080 I had an RTX 2080. Before I got the RTX 2080 I was really on the fence about whether to go with the 2080 or a 1080 Ti. At the time, both cards were pretty close to the same price and gave similar performance. The 2080 had the edge on features, but the 1080 Ti had the edge on VRAM (8 GB vs 11 GB). I ended up going with the 2080, and it's still going strong in my backup computer. Looking back, I think I definitely made the right choice, since the 2080 can do DLSS and hardware ray tracing, etc. Performance in benchmarks also seemed to favor the 2080 more over time, perhaps because driver optimizations were focused on newer cards. Eventually, it will likely be supported with driver updates longer than the 1080 Ti as well, since there is a huge feature gap between the cards, like the split between DirectX 9 and DirectX 10 cards.

The 2500K vs 2600K is a good example also. When the two came out there was basically zero reason to go with the 2600K. Four cores were plenty, almost nothing used more, so the hyperthreading on the 2600K seemed pointless. But today, for anyone still using those CPUs, there is a decent gap between them in many cases. Any game that uses more than 4 cores will definitely run faster on the 2600K, and even if your game only uses 4 cores, there just tends to be a lot more stuff running in the background these days that makes the extra virtual cores more valuable now. It can make a huge difference when software encoding/decoding video, etc.

I bought two 1000w PSUs back in 2007 or so. One is still going strong in my backup computer, the other was still in-use in my main computer up until I replaced it with a more efficient 850w a year or two ago (it still works). So those were great investments.

But sometimes trying to "future-proof" too much just wastes money, when you make bets about the future that don't pay off.

For the last 20 years I tended to go with CPUs that had more cores than I needed, starting with my dual-Xeon system back in 2003 or so, before dual-core CPUs were even a thing, then jumping on the quad-core bandwagon early, then spending more on an HEDT system so I could get a 6-core 5820K instead of a mainstream quad-core, and then jumping on AM4 and going with a 12-core 3900X. In every case, by the time there was even one game out that actually made use of the number of cores I had, I had already upgraded again. The 5800X3D that I eventually upgraded to, being only 8 cores, is really the only semi-exception.

People have been obsessing over NVMe for years now, even though there is still almost no real-world difference between NVMe and SATA SSDs because technologies like DirectStorage aren't being utilized yet.

People spent $$$ on the 3090 Ti only to see it get totally upstaged by the lowly 4070 Ti a year later, and the 3090 Ti can't even support DLSS 3.

So it's great when you win the gamble, not so great when you lose. But if you feel confident about how things are going to play out, then go for it. In the end it's always better to have more hardware than you need than not enough.
 
I've been building PC's since the Celeron 300a & Moore's Law was a good basic rule RE: 2 years.

I started with 2nd tier / budget parts & added memory, upgraded CPU (mobo at times) & upgraded my vid card as prices dropped on tier 1.

The Slot A 450 / 600 was in there somewhere too...

During that time - I was building a new box every 1~2 years with affordable 2nd or 3rd tier from the top parts.

With 2 boys in the house, those PC's and upgrade parts basically "rolled downhill" every 1~2 years.

Eventually, all 3 PC's had STEAM + CS / TF / Left 4 Dead / Quake, etc. & went 3~4 years with just vid card upgrades.

Tech marched on & I bought a mish-mosh of used parts & made mediocre upgrades (mostly bigger hard drives).

I was making more money by that time & built a Q6600 w/ a tier 1 mobo, plenty of ram & All-In-Wonder 9800 PRO vid-card.

Suddenly - divorce & kids moved off to mom's house - gaming w/ the kids stagnated...

The lawyers made BANK - so that Q6600 ruled the roost for 4 years & only got some memory upgrades & HD's.

Then it got a Q9550 & R9-280 + 1080 monitor & it was GAME on again for 2 more years.

I also bought a mobo & cooler & repurposed an old case w/ Q6600 & AIW 9800 & gave that to the kids!

When I realized that I ran the Quad Cores for 5 years, I decided to buy tier one parts on a 10+ year plan.

Tier 1 mobo (CPU upgrade safeguard + USB galore), tier 1 CPU (i7-4790K), enough ram to run + upgradability, huge HD & reused R9-280 vid card.

^^^mobo & cpu died after 5 years - maybe PSU fried them? I was in an apartment building during a storm & power outage = POOF.

No power & no PC after that. OBVS, I did not have a UPS.

Lucky for me, I still had the Q9550 in a box & was back online & buying parts the next day!

Long story long....

I started building on a 2 year window / then a 3~4 year window / then a 5 year window & now maybe 7 years before I can afford to upgrade my RTX-3070.

I wasn't "future proofing" as much as I was budgeting for longevity via upgrades.
 
Does future-proofing a computer purchase by getting the best of the best at the time of purchase actually mean the computer will last longer without upgrades? For example, back in 2011/2012 when I built the PC in my sig, I was deciding between the 2500K, 2600K, and 3930K. I picked the 3930K, and it is still going these days; the only reason I got it over the others mentioned is that I thought it would last longer. Had I picked the 2500K or 2600K, would they still be doing as well as the 3930K these days? I did, however, continue to upgrade the GPU over time with every generation, but stopped with the GTX 1080 Ti.

So, is future-proofing real or is it just better to get something that aligns with your requirements regardless of how long you plan to keep it?
Generally speaking, the higher end you buy, the less you'll need in terms of upgrades. The higher-end parts are significantly faster than the bottom end of the product stack, or even the mid-range in most cases. If you bought an RTX 3090, you are probably doing fine. If you had bought the OG 3080, you wouldn't be doing as well, given the VRAM limitations. Anyone who bought a GTX 1060 isn't having as good an experience in games today as people who bought GTX 1080 Tis are.

Similarly, those of us who were on Core i7 5960X's got a lot of life out of our CPU's. I ran mine for about five and a half years. HEDT had really long legs with all its PCIe lanes and we had eight cores plus Hyperthreading. Intel only offered four core CPU's for a very long time and it wasn't until AMD released mainstream processors with eight cores that Intel finally had to get off its ass and release mainstream processors with higher core counts. But those first offerings weren't really all that enticing. I don't think anything short of the 9900K was even worth considering over the 5960X or 6950X.

Another example of this was SLI. Back in the day, if you had two high end cards in SLI you were experiencing next generation performance already. You could easily skip a generation on the high end if you wanted to.
 