Upgrading from 9700K to 12600K/12700K worth it?

RPGWiZaRD

[H]ard|Gawd
Joined
Jan 24, 2009
Messages
1,217
I'm going back and forth on this one. My biggest reason to upgrade now would be to keep using my current Samsung B-die 4x8GB (32GB) DDR4-3200 CL14 kit, which I currently run at DDR4-3400 CL14 (at a relatively safe 1.385V; some run these at 1.45V, at which point they'd probably do around DDR4-3600), until DDR5 becomes more affordable. I really want to max out the amazing deal I got on these kits: I bought them at their cheapest, around 122€ per 16GB kit in Europe, which should translate to roughly $125 in the US market.

That leaves me with an Alder Lake upgrade: there's no upcoming DDR4 support from AMD, and I don't see future Intel generations supporting DDR4 either.

I'm a little torn between the 12600K at 309€ / KF at 289€ and the 12700K at 449€ / KF at 429€. (I don't need the integrated GPU, so I'm leaning a bit towards the KF, unless there's a reason to keep the iGPU beyond its convenience for troubleshooting GPU issues, and I can always stick an old GPU in the machine for that anyway.)

The 12600K has 6 performance cores and the 12700K has 8, so I guess the logical choice would be the 12700K, even if the price jump is quite a bit. I went from an 8600K to a 9700K and thought that upgrade was noticeable / worth it: I render 3-5 minute videos daily in After Effects, and it cut rendering times from around 30-35 minutes down to 20-25 minutes, which felt very nice. Is the 12700K the only reasonable option, and should the 12600K not even be in consideration, given its 6 performance cores and its efficiency cores being that much slower, from a video-encoding point of view? Obviously I will use the computer for gaming as well, but cutting rendering times is the most attractive aspect to me.
 
From someone who uses the 12700K on a daily basis: that CPU gets my vote. I've used mine with DDR4-3600 and it does everything I need it to do, which is mostly gaming. I've used a Noctua NH-U12A and now a Kraken Z73, and in both cases it stays nice and cool as well.
 
Wait til next gen.
Will next gen support DDR4? Or are we talking about an Alder Lake "refresh" that, I suppose, would most likely also support the current Z690 boards?

The whole purpose of the thread / upgrade itch is to be able to keep the current DDR4-3200 CL14 32GB Samsung B-die kit that I got when DDR4 pricing was at its lowest, so I want to capitalize on that for a little longer. If I buy DDR5 now, it feels like I'd throw some $100 in the bin on the non-optimal pricing of this DDR5 generation, thanks to high demand, limited supply and the new-toy premium, if I buy something in the DDR5 space equivalent to what Samsung B-die was for DDR4. Without having checked pricing on the current DDR5 offerings, I'd guess a semi-decent 32GB kit quickly gets into the $350 range.

If I upgrade now, I will likely keep the system around 3 years, so I'd have to switch the motherboard and the rest when the time comes anyway, and by then the DDR5 market should have matured a lot too.
 
If keeping the DDR4 is the main goal, then look for a 10850K/10900K or 11900K. Those will give a big performance jump over a 9700K, but they're not the current gen.
 
I don't agree with that at all. The jump from 10th/11th gen to 12th gen is larger than the jump from 9th gen to 10th/11th gen performance-wise. It makes no sense to buy a 10th/11th gen part since you'll need a new motherboard anyway. Might as well jump to the 12th gen, which is a significant improvement.

I would get either a 12600KF or a 12700KF if you want to upgrade. You "might" be able to slide into a Raptor Lake CPU in the future as well, but at this point it is unclear whether Intel is planning to keep a DDR4 memory controller on 13th gen. At the very least you get a last hurrah for DDR4 with that upgrade.
 
I feel like upgrading to ADL strictly to be able to use your $250 of DDR4 is... dubious. You're looking at $350 for the CPU and $250+ for a good board; you might as well sell the DDR4 and eat the $100 to get DDR5, which will last you through your next decade's worth of builds. If you're going after the fastest renders, maybe AMD is the answer: the 5950X's 16 cores can match or beat the 12900K at workloads that scale to 16 cores, and are a lot easier to cool; stock 5950X benchmarks are run at 142W whereas stock 12900K benchmarks are run at 250W.
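To put rough numbers on the cooling/efficiency point, here's a minimal sketch assuming ballpark stock Cinebench R23 multi-core scores of ~25,500 for the 5950X and ~27,000 for the 12900K (figures from memory, not from this thread), alongside the power figures quoted above:

```python
# Rough perf-per-watt comparison. Both the R23 multi-core scores and the
# package-power figures are approximate/assumed ballpark numbers.
cpus = {
    "5950X":  {"r23_mc": 25_500, "watts": 142},
    "12900K": {"r23_mc": 27_000, "watts": 250},
}

for name, d in cpus.items():
    print(f"{name}: ~{d['r23_mc'] / d['watts']:.0f} R23 points per watt")
```

On those assumptions the 5950X lands around 180 points per watt versus roughly 110 for the 12900K, which is the "easier to cool" argument in numeric form.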
 
The 12600K basically doubles the Cinebench multi-core score of the 9700K.

Since you don't care about Intel's Xe graphics and Quick Sync, I would look at the 12700F.
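As a rough sanity check on what that could mean for the daily After Effects renders, here's a sketch that assumes render time scales inversely with multi-core throughput (it usually won't scale that cleanly) and uses ballpark Cinebench R23 multi-core scores, which are assumptions rather than measurements:

```python
# Back-of-the-envelope render-time projection under ideal multi-core scaling.
# The scores below are assumed ballpark Cinebench R23 multi-core values,
# not measurements from this thread.
approx_r23_mc = {"9700K": 9_800, "12600K": 17_500, "12700K": 22_500}
render_minutes_on_9700k = (30, 35)  # the 30-35 min figure from the OP

for cpu in ("12600K", "12700K"):
    speedup = approx_r23_mc[cpu] / approx_r23_mc["9700K"]
    low, high = (t / speedup for t in render_minutes_on_9700k)
    print(f"{cpu}: ~{speedup:.1f}x -> roughly {low:.0f}-{high:.0f} min per render")
```

In practice After Effects spends plenty of time in poorly threaded stages, so treat those as optimistic upper bounds rather than expected results.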
 
So you are saying going from an 8-core, 8-thread CPU to a 10-core, 20-thread CPU that turbos over 5 GHz is not an upgrade? Well shut my mouth and fill it full of corn pun........ I would never have thought that more cores and more threads wasn't better.
 
It's a bigger upgrade to move to a 12th gen CPU when you have to buy a new motherboard anyway. Eight Alder Lake cores are significantly better than 10 rehashed Coffee Lake cores, and 6 12th gen P-cores plus the 4 efficiency cores is still a better upgrade than a 10C/20T Comet Lake CPU.
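For a rough picture of where those options sit relative to each other, here's the same kind of ballpark comparison, again with assumed (approximate, from-memory) Cinebench R23 multi-core figures:

```python
# Assumed ballpark R23 multi-core scores, normalized to the 9700K,
# purely to illustrate the relative standing of the options discussed.
approx_r23_mc = {
    "9700K (8C/8T)":    9_800,
    "10900K (10C/20T)": 15_500,
    "12600K (6P+4E)":   17_500,
    "12700K (8P+4E)":   22_500,
}
base = approx_r23_mc["9700K (8C/8T)"]
for cpu, score in approx_r23_mc.items():
    print(f"{cpu}: ~{score / base:.1f}x the 9700K")
```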
 
You haven't mentioned whether you are suffering framerate/frametime issues.
If you aren't, stick with your current CPU.

I'm on an overclocked 10700K and have no plan to upgrade because everything is smooth as f. at near-4K resolution.
 
I went from a 9700K to a 12900K and it was 100% worth it, and this CPU should only get better as things are able to use more cores. Most games still don't use more than 4 or 6 threads right now.
 
Yeah I'm waiting on RL/Zen 4 before I upgrade from the 9900K.
 
What do you need the upgrade for? It's hard to give an answer without knowing the requirements. Probably not, though.

I remember when the Nvidia 6000 series was released, in like 2005. Shader Model 3.0 was THE feature advantage talked about on the forums, next to SLI. Gotta have it! Get the midrange card with SM3.0 rather than the higher-end ATI card! But in practice, the performance and image quality were hardly worth the higher price. It was mostly marketing pushing a feature that didn't really give much value.

Of course, this was back when pure raster performance would dramatically increase year to year, so everybody was rushing to upgrade. Since performance no longer increases nearly as much year over year, all the more reason to hold on to what you have.
 
WAT? Didn't give much value? Mad bro? o_O

The GeForce 6000 series were the most supported cards in the history of mankind, bar the original VGA cards.
Radeon Xxxx cards, in comparison, became obsolete extremely quickly. Yes, and you could not even run Crysis on them 🤣

The PS3/X360 console generation having SM 3.0-compatible GPUs was a big reason why this was the case. Console hardware has a big impact on PC.
The whole Vista/DX10 debacle didn't help either.
DX10 was harder to program for and performed worse, which made DX9.0c the API of choice for many smaller developers. It is still a well-supported API, which kept the GeForce 6xxx somewhat relevant.

And yeah, who would want to play games on a 6800GT a few years after its release... With the CRT that came with it, you could play console ports (and other games made on those engines) and definitely a lot of 'indie' games at 640x480/800x600 or even 1024x768 just fine, with reasonable framerates :)

So yeah, these GeForce cards were a much better buy long term. Short term, second-hand prices for SM 2.0 cards plummeted pretty quickly. I remember a lot of these cards on auctions, but at prices comparable to the GeForce cards nobody wanted them. Can it run Crysis?... No? Then GTFO :>
 
Yeah I'm waiting on RL/Zen 4 before I upgrade from the 9900K.
Makes sense. Best not to get the first gen of new stuff.
They will resolve bugs, improve efficiency, etc.

Also, the 9900K is kinda overpowered as it is.
I was thinking recently about upgrading, since I have the option to get a cheap Ryzen 5800X, but it's kinda pointless. I would gain absolutely nothing except even less readable CPU utilization charts in Task Manager and even less reason to upgrade in the future :p
 
Yeah, it's still a great CPU. I'm not really itching to upgrade. I'd buy a 40xx series card before a CPU/mobo.
 
I bought a 7800GT, and I distinctly remember that a year or two later it couldn't run Crysis at 800x600 over 15 fps. It slogged even in the menus. I was upset that I wasted my money rather than getting the cheaper but just-about-as-good X850 XT PE. Maybe it was cheaper on AGP? I can't remember. But on the forums everybody was praising its SM3.0 advantage over the X850 XT PE, which it traded blows with game by game. I drank the Kool-Aid and learned from my mistake.
https://www.anandtech.com/show/1755/6

I can't find price history, but I remember paying around $350-$400 for the 7800GT while the X850 XT PE was in the $200-$300 range at the time. I also remember the screenshots showing that the visual shader differences were negligible at best. It was better to have higher settings than to accept lower performance for the SM3.0 visual benefits.

Anyway, the point I was making is that it's best to buy hardware for your current usage. Even going back two decades, the debates haven't changed. "You gotta have SM3.0 or your future games won't run" became moot when the 6800 cards couldn't run games at playable frame rates on high settings/resolutions two years later, regardless of the SM3.0 visual benefits or the fear that no games would run on an ATI card.
 
At 800x600 you would have been CPU-bound rather than GPU-bound.
 
No GPU should be slow in the menus unless it has a hardware problem or is way below the system requirements for the game.
 
The 7800GT was a faster card than the X850 XT PE, and the benchmarks confirm that. It had 8 more shaders, 1 more vertex shader and 4 more TMUs. It had a slower clock but more OC potential than the X850 XT PE.

Also, the X850 was a previous-generation card, so obviously it was cheaper. It was quite fast and, for its performance, quite cheap. It was SM 2.0, so it had to be considerably cheaper to sell at all, especially by the time the 7800 was around.

ATI's card to compete with the 7800 series was the X1800, and those cards had SM3.0 support. The X1800 was a strange card with 16 shaders and looked like what the X800 should have been (though then it would have been on the 110nm node rather than 90nm, hence slower clocks and slower than what it actually was... but I am certain it could have competed with the 6800 Ultra quite well!). It was quickly superseded by the X1900, which had 48 shaders, and those cards were quite good. They even aged better than the 7900 series, which were refreshed cards Nvidia released in response to the X1900 by shrinking the 7800 core to a smaller node and upping the clocks.

As for SM 2.0 cards, there were other games you could not run at all, like BioShock, Rainbow Six Vegas, GTA IV, etc., all perfectly playable on a 7800GT but which would not run on the X850 XT PE. I mentioned Crysis because... it did not run Crysis, you know 😮‍💨😅

It might not be that relevant for anyone who sold their card before being affected by this, but these things also affect resale value. I remember that at first, second-hand prices for X800/X850 and 6800/7800 cards were similar, but later (somewhere around the time of the 8-series cards) those Radeons started falling in price to much cheaper than the 6800 cards, which held their value much better. Heck, 6800 cards are still more expensive 🙃

There is, however, one reason (or two, depending on how you look at it) why the X800/X850 might be a better option than the 6800/7800: Windows 98 and compatibility with older games.
By older games I mean games which do not support 32-bit rendering. Starting with the 6th generation, Nvidia dropped dithering for 16-bit modes, making them look terribly ugly. Radeons, on the other hand, render 16-bit just fine, and especially when MSAA is enabled the dithering looks absolutely amazing for 16-bit. Also, Nvidia's Windows 98 driver support only goes up to the 6800; 7th-gen cards are not supported anymore. So the X850 XT PE is hands down the ultimate Windows 98 GPU. Unsurprisingly, 6800 cards are more popular for Win98 builds, even though SM 3.0 is completely pointless for Win98 and they lack proper 16-bit support. Can't win against ignorance :)
 