Torn between 7700K and Ryzen 1700

letoaster · n00b · Joined: Mar 7, 2017 · Messages: 5

Hi guys,

Long time lurker, first time poster. Love the quality of the articles and reviews here! I've been torn since the release of AMD's new Ryzen processors.

I'm a Battlefield series player and recently I feel that my CPU is holding me back when playing (100% utilisation while playing). I play at 1440p resolution only, on a 27" UltraSharp Dell panel.

Presently my system is:

3570k @ 4.6GHz, running strong for 5 years.
16GB RAM
Z77 Extreme4 mobo
R9 280X 4GB card
XFX 850W PSU

I'm debating my next processor: the new Ryzen 1700 or a 7700k.

Planning to use these components:

Ryzen build:
Ryzen 1700 cpu
GA-AX370-GAMING 5
Corsair H115i Liquid CPU Cooler
Samsung 960 Evo 250GB M.2
Corsair Vengeance LPX 16GB (2 x 8GB) DDR4-3200 Memory

7700k build:
7700k cpu
GA-Z270X-Gaming 5
Corsair H115i Liquid CPU Cooler
Samsung 960 Evo 250GB M.2
Corsair Vengeance LPX 16GB (2 x 8GB) DDR4-3200 Memory

Obviously I am planning to overclock the CPUs to their maximum potential. I expect the Ryzen 1700 to run around 3.9 or 4.0GHz.

Should I go for the 7700k @ approx. 5GHz and call it a day, or buy a "future proof" processor that is slower in single thread but has more cores and threads? I play mostly BF1 multiplayer with 64 players.
 
For gaming I'd keep the i7-7700k.

If you need a workstation, the Ryzen will probably benefit you more.

BTW, with that CPU you are still fine!
 
Your GPU holds you back more than your CPU does. At 2560x1440 or greater resolutions there is virtually no difference between any of the CPUs from Sandy Bridge through Broadwell-E to Ryzen.
 
Your GPU holds you back more than your CPU does. At 2560x1440 or greater resolutions there is virtually no difference between any of the CPUs from Sandy Bridge through Broadwell-E to Ryzen.

Indeed, I was planning to upgrade this summer and wait to see what AMD Vega brings to the table. So my 3570k @ 4.6GHz is still good for the newest titles? I'm impressed.
 
Neither of them will really give you the feeling of upgrading from whatever you had before the 3570k, but it's easy to figure this out. If you spend more than 50% of your time at the keyboard gaming, go for the 7700k.

Your problem is the 280X though. 1440p is fairly GPU intensive. A GTX 1070 would smooth things out more than the CPU upgrade would.
 
Your GPU holds you back more than your CPU does. At 2560x1440 or greater resolutions there is virtually no difference between any of the CPUs from Sandy Bridge through Broadwell-E to Ryzen.

Will you just ignore the fact that his CPU is running at 100% just to keep up with the GPU?
Also, what you are saying is not entirely true: in Doom 2016's OpenGL mode I ran into a CPU bottleneck at 2560x1600. Changing to Vulkan solved it thanks to improved multicore distribution.


@OP
Without price tags it's hard to recommend the best upgrade.
 
Will you just ignore the fact that his CPU is running at 100% just to keep up with the GPU?
Also, what you are saying is not entirely true: in Doom 2016's OpenGL mode I ran into a CPU bottleneck at 2560x1600. Changing to Vulkan solved it thanks to improved multicore distribution.


@OP
Without price tags it's hard to recommend the best upgrade.

I'm not completely ignoring it, but from where he's at, the GPU is always a bigger upgrade for gaming than the CPU is. The only time it isn't is at CPU-limited resolutions. Admittedly, that's about where we are with 1080p gaming at this point.
 
I'm not completely ignoring it, but from where he's at, the GPU is always a bigger upgrade for gaming than the CPU is. The only time it isn't is at CPU-limited resolutions. Admittedly, that's about where we are with 1080p gaming at this point.

I would have to disagree in this case. A GPU upgrade will not do anything for you if you are constantly CPU bottlenecked, which he is with his CPU running at 100% constantly.
And no, you are not CPU limited only at low resolutions. Lower resolutions move more of the load onto the CPU, making a CPU limit more likely, but it's not a hard cut-off. Older games with fewer graphics effects can run at much higher resolutions and still be CPU limited, because the workload is distributed differently (i.e., a bigger share of the load sits on the CPU).

Resolution is an indirect, rule-of-thumb indicator. Looking at the actual load is what tells you the real deal.


What could confirm this is if the OP could use GPU-Z and see what his GPU load is while gaming.
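(To make that check concrete, here's a rough Python sketch for summarizing a GPU-Z sensor log. It assumes you enabled GPU-Z's "Log to file" option and that the log has a "GPU Load [%]" column; column names can vary by GPU-Z version, so adjust to your own log's header.)

[code]
# Rough sketch: average the GPU load out of a GPU-Z sensor log.
# Assumes GPU-Z's "Log to file" output with a "GPU Load [%]" column;
# check your own log's header, names can differ between versions.
import csv

def average_gpu_load(logfile, column="GPU Load [%]"):
    loads = []
    with open(logfile, newline="") as f:
        reader = csv.reader(f)
        header = [h.strip() for h in next(reader)]  # GPU-Z pads fields with spaces
        idx = header.index(column)
        for row in reader:
            if len(row) > idx and row[idx].strip():
                loads.append(float(row[idx].strip()))
    return sum(loads) / len(loads) if loads else None

# ~99% -> the GPU is the limiter; well below that with the CPU pegged -> CPU limit
print(average_gpu_load("GPU-Z Sensor Log.txt"))
[/code]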
 
Indeed, I was planning to upgrade this summer and wait to see what AMD Vega brings to the table. So my 3570k @ 4.6GHz is still good for the newest titles? I'm impressed.

The disappointment you would experience if you went from the 3570k @ 4.6GHz to a Ryzen could not possibly be put into words. Make no mistake, the Ryzen is a good CPU, but its main strength is not games. The Ryzen is a workstation CPU which can also double as a good, but not perfect, gaming CPU.

The 7700k is king for games and is moderate as a workstation CPU, but even the 7700k is not much faster for games than your 3570k @ 4.6GHz. We are talking 10-15% more fps at best, and only if you play at very low resolutions. The 3570k is still a beast of a CPU even after so many years. Intel hasn't really given us any meaningful upgrade options. Moore's law for desktop CPUs is LOOOONG dead.

Your best upgrade option is a faster GPU, as others mentioned already and you recognized yourself. Waiting for AMD Vega is a smart move. It will possibly lower prices for the GTX 1080 Ti, or if Vega turns out better, by all means go with it instead of an NVIDIA card.

The next upgrade which would make sense, if you like playing BF1 a lot, would be a 21:9 monitor, once there are 21:9 gaming monitors out with DisplayPort 1.4 support, which both AMD Vega and the GTX 1080 Ti offer. This would allow for 3440x1440 at more than 144Hz; currently those monitors seem to be limited to 100Hz. Other 21:9 monitors with more than 100Hz have too low a resolution for their size, which makes them totally crap for any desktop use.

The 21:9 monitor is debatable, however. Since you can set the FOV in Battlefield, I am not sure whether having a giant 16:9 monitor or TV screen with a high refresh rate would be better than having a 21:9 screen. A lot of factors come into play here, including how close you sit to your monitor, etc.



Finally, IMO, the right time for you to upgrade your motherboard and CPU would probably be when 10nm CPUs are out. Not only because of more performance, but also because of the much lower power consumption I expect them to have.
 
I would have to disagree in this case. A GPU upgrade will not do anything for you if you are constantly CPU bottlenecked, which he is with his CPU running at 100% constantly.
And no, you are not CPU limited only at low resolutions. Lower resolutions move more of the load onto the CPU, making a CPU limit more likely, but it's not a hard cut-off. Older games with fewer graphics effects can run at much higher resolutions and still be CPU limited, because the workload is distributed differently (i.e., a bigger share of the load sits on the CPU).

Resolution is an indirect, rule-of-thumb indicator. Looking at the actual load is what tells you the real deal.


What could confirm this is if the OP could use GPU-Z and see what his GPU load is while gaming.

Here it is! 100% on the CPU and 99% on the GPU. Also included CPU-Z for reference.
https://************/inRyLa
https://************/inRyLa
 
I would have to disagree in this case. A GPU upgrade will not do anything for you if you are constantly CPU bottlenecked, which he is with his CPU running at 100% constantly.
And no, you are not CPU limited only at low resolutions. Lower resolutions move more of the load onto the CPU, making a CPU limit more likely, but it's not a hard cut-off. Older games with fewer graphics effects can run at much higher resolutions and still be CPU limited, because the workload is distributed differently (i.e., a bigger share of the load sits on the CPU).

Resolution is an indirect, rule-of-thumb indicator. Looking at the actual load is what tells you the real deal.


What could confirm this is if the OP could use GPU-Z and see what his GPU load is while gaming.

Disagree all you like. People blow CPU bottlenecking all out of proportion. Why do you think people are still getting solid performance out of the Core i5-2500K even with modern GPUs? How many benchmarks do people need to see before they realize that the CPU only matters in some pretty specific circumstances? Yes, at ultra-high resolutions it does make a difference. You are talking to someone who actually went to a lower resolution by switching from NV Surround at 7680x1600 to a 48" 4K (3840x2160) display. I've been using multi-GPU systems at high resolution for years. I've seen how CPUs can make a difference on different ends of the spectrum. I would even agree with you if the OP had more than a mid-range GPU, but he doesn't. I'm not saying a new CPU won't benefit the OP. What I'm saying is that a new GPU will improve his gaming performance at 2560x1440 more than a new CPU will.
 
Here it is! 100% on the CPU and 99% on the GPU. Also included CPU-Z for reference.
https://************/inRyLa
https://************/inRyLa
Thank you. Is all that CPU load from the game process?


Disagree all you like. People blow CPU bottlenecking all out of proportion. Why do you think people are still getting solid performance out of the Core i5-2500K even with modern GPUs? How many benchmarks do people need to see before they realize that the CPU only matters in some pretty specific circumstances? Yes, at ultra-high resolutions it does make a difference. You are talking to someone who actually went to a lower resolution by switching from NV Surround at 7680x1600 to a 48" 4K (3840x2160) display. I've been using multi-GPU systems at high resolution for years. I've seen how CPUs can make a difference on different ends of the spectrum. I would even agree with you if the OP had more than a mid-range GPU, but he doesn't. I'm not saying a new CPU won't benefit the OP. What I'm saying is that a new GPU will improve his gaming performance at 2560x1440 more than a new CPU will.

Getting solid performance out of a CPU is not the same as not having a bottleneck.
Most of these reports here and on other forums have absolutely no evidential support for there not being a bottleneck, and are little more than "my system runs fine on a 2500k."
A system running fine or "solid" on a CPU is a subjective measure of being satisfied; being a bottleneck is an objective measure of technical limitations. Those two things should not be confused.

Let me again point to real testing of Doom 2016 running at 2560x1600, vsync on, no AA. In OpenGL mode it sat at 60fps most of the time, but sometimes it dropped down to 30fps (natural with vsync on).
Checking the thread utilization, it was clear that one of Doom's threads was hitting 100% utilization of a core, or 12.5% of the whole CPU (8 logical cores).
Changing the API to Vulkan resulted in no thread hitting 100% of a core any more, and the fps no longer dipped to 30 at the same spots as before.
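(A quick illustration of what that per-thread check looks like in practice; a minimal Python sketch using the psutil package, run alongside the game. The 95% threshold is my own arbitrary pick.)

[code]
# Minimal sketch: spot a single-core bottleneck that total CPU usage hides.
# On 8 logical cores, one core pegged at 100% is only 12.5% "total" usage.
import psutil  # third-party package: pip install psutil

while True:  # Ctrl+C to stop
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    total = sum(per_core) / len(per_core)   # the misleading "overall" number
    hottest = max(per_core)                 # the core the game is stuck on
    if hottest > 95:                        # threshold is an arbitrary pick
        print(f"one core at {hottest:.0f}% but total only {total:.0f}% "
              f"-> likely single-thread bottleneck")
[/code]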

By redistributing the load to avoid a per-core CPU bottleneck, the bottleneck was removed and the GPU could again deliver its 60fps.
That was on an i7-3770 with a GTX 970.
So yes, with the information given at the time, the OP's CPU could easily be a bottleneck for the system and limit any benefit from upgrading the GPU, when there are simply no resources left in the CPU to support the extra fps a new GPU would be able to render.


Now that he has posted something showing both 100% on the CPU and 99% on the GPU, the picture might change.
 

Full disclosure: I don't own BF1, so I'm unaware of any game-specific reason why it would show this weird behavior.

But let me sum up some of the things I've picked up here and there:

- BF1 has been known to bottleneck on a quad-core CPU for many people, and to fare much better on CPUs with 8 logical cores.
- Several people are reporting that their 100% CPU usage dropped when changing to DX11 instead of DX12. You might want to see if that helps.

Most typically for ordinary games: when your CPU is hitting close to 100% overall usage, or one of its threads hits 100% of a logical core, you are hitting the limit of what the CPU/core can provide, and GPU usage typically drops.
This would indicate a CPU bottleneck.
Conversely, if your GPU usage is constantly at 100% and your CPU/thread usage is low, you are GPU bottlenecked.
You will be in one state or the other most of the time (there is always a bottleneck, otherwise you would have infinite fps).
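(Here's that rule of thumb as a little Python sketch, with a second function showing why there is always some bottleneck: each frame costs whichever of the CPU or GPU takes longer. The 95% cut-offs are my own assumption, not a standard.)

[code]
# Sketch of the rule of thumb above (thresholds are assumptions, not standards).
def classify(total_cpu, hottest_core, gpu_load):
    cpu_maxed = total_cpu > 95 or hottest_core > 95
    gpu_maxed = gpu_load > 95
    if cpu_maxed and gpu_maxed:
        return "unclear -- both pegged, like the OP's 100% CPU / 99% GPU"
    if cpu_maxed:
        return "CPU bottleneck likely"
    if gpu_maxed:
        return "GPU bottleneck likely"
    return "neither maxed -- frame cap, vsync or engine limit?"

# Why there's always a bottleneck: a frame costs max(cpu_ms, gpu_ms),
# so fps is finite and set by the slower side.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(12.0, 10.0))  # CPU-bound: ~83 fps no matter how fast the GPU gets
[/code]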

Since you are hitting 100% total CPU as well as 99% GPU usage, I can't draw a clear conclusion about what is optimal to upgrade.
But I would suggest trying DX11 first, to see how the CPU usage fares, before starting to upgrade hardware.
If CPU usage suddenly drops a lot, you might be better off looking at a new GPU rather than a new CPU.
 
I'm running a stock 3570k, not even overclocked, and I had Doom maxed out on Ultra at a solid 200fps with Fast Sync on a GTX 1070 using Vulkan. Tomb Raider the same, about 140-160fps. In fact all the games I've run since I got my 1070 have been crushing upwards of 180fps at 900p. Even heavily modded Skyrim, in which I never saw a solid 60fps before and always heard was CPU bottlenecked. Wrong. Got a GTX 1070, and now if I don't cap frames in Skyrim I get up to 240fps with 4K textures, stupidly high foliage, and about 150 mods. CPU bottleneck my ass.

That's a 3.8GHz stock boost with all power saving features enabled and a straight stock Intel cooler. If I had kept my GTX 660 and upgraded my CPU, I would not have gotten anything comparable as a performance upgrade in those games :p The fact is he will see a huge improvement with just a new video card. And while BF multiplayer with 64 players obviously seems to benefit from 8 cores, his major bottleneck is his GPU; that is stone-cold fact.

I recommend you buy a new video card. It will afford you the luxury of biding your time while figuring out your best course of action for upgrading the rest of your system, and may even make you forget about it for a bit, unless the fresh stank of new gear has you by the nose hairs. It did for me. I wouldn't bother with Ryzen at this moment anyway. Way too much instability with the mobos, if you can even get one. Let it get sorted out first. Not sure 20+ years of PC gaming experience warrants being able to give that advice :|


Funny thing is, we used to play BF1942 on 64-player servers with single-core systems and like 512 or 768MB of system RAM. I think my HDD was 19GB. Athlon K7, GeForce 2 Ultra. Good times.
 
Option C would be to drop in a 3770k

I'd agree with this if one could be found, and can relate from first-hand experience that the older non-HT CPUs are both slower and less smooth in the newest games, provided you have the GPU to push them at your desired resolution and refresh rate. Definitely recommend HT for gaming as it addresses the minimum framerates/frametimes that you actually 'feel'.

But since they cannot be found (I wanted to drop one in to replace what I thought was a dying 2500k), either the 7700k or Ryzen would be good. The 7700k is currently turning out faster for higher framerate stuff if that's what you need, but otherwise the Ryzen CPUs are the real deal.
 
I'd agree with this if one could be found, and can relate from first-hand experience that the older non-HT CPUs are both slower and less smooth in the newest games, provided you have the GPU to push them at your desired resolution and refresh rate. Definitely recommend HT for gaming as it addresses the minimum framerates/frametimes that you actually 'feel'.

But since they cannot be found (I wanted to drop one in to replace what I thought was a dying 2500k), either the 7700k or Ryzen would be good. The 7700k is currently turning out faster for higher framerate stuff if that's what you need, but otherwise the Ryzen CPUs are the real deal.

I cannot confirm this.

I own a 5960x. Turning HT off in the BIOS, there is exactly zero difference in the fps I get in Battlefield 4 compared to HT on. I do get a higher fps count, however, when overclocking from 4GHz to 4.5GHz. And with HT off I can overclock higher without running into heat issues.

Another thing I experience with HT off is lower DPC latency, measured with LatencyMon. The greatest reduction in DPC latency comes from turning off C-states in the BIOS, but that is not really an option, because the CPU's idle power consumption rockets to over 120W when overclocked, whereas with C-states on I am around 40-50W at idle.
Maybe if I was using my PC for low-latency audio recording I would turn HT and C-states off in the BIOS, to make sure my cores are always processing the signals at minimum latency, vs. letting the stupid OS/motherboard decide when to enter the max power state.

The reason why the 3770k might give higher fps than a 3570k in some games seems to me to be the larger cache, and NOT HT itself. If it were HT, then my 5960x should show a difference in games between HT on and HT off, but it doesn't. HT off is possibly even better, because of the lower DPC latency.
Or maybe I have not yet come across a game which makes sufficient use of HT to show a difference. So I am not going to be absolute on this statement, but I am willing to put it to the test with my 5960x if anyone can point me to a demo benchmark I could run with HT both on and off at the same clock rates.

edit: In either case, the OP is using only a 280X GPU, which is equivalent to an AMD 7970. A 3570k @ 4.6GHz is overkill for this GPU. He is definitely GPU limited in most scenarios, except at very low resolutions used for benchmarking only, and in the very few games which make heavy use of the CPU.
In Battlefield 1, going from a 280X to a GTX 1080 Ti would more than double the average fps. Going from a 3570k @ 4.6GHz to a 7700k @ 5GHz, you might end up with maybe 10-15fps more at best.
And if you end up with a lemon 7700k which does not overclock well, your fps might be even worse.


edit2: I found a video on YouTube where someone tests his 5960x with HT on and HT off. With HT on it is overclocked to 4.5GHz, whereas with HT off he is at 4.6GHz, 100MHz higher.
That is realistic, because with HT off you can always go a bit higher; 100MHz is actually quite conservative.



As you can see in the video, HT on or off makes almost no difference at all in either the minimum fps or the average fps.
In fact, in his last test, GTA V, HT off seems to beat HT on by a LARGE percentage, which is quite surprising as the gap is just too big. Why HT off performs so much better on a 5960x in GTA V in this video, I cannot explain.

Maybe with HT on his CPU was overheating and downclocking, but that is only a guess.


edit3: In the video you can also see the CPU temps while he is gaming. The CPU temps with HT on at 4.5GHz are higher than with HT off at 4.6GHz. Yet the fps is still better with HT off.
However, my initial suspicion that the CPU was throttling because of too-high temps does not hold, as those CPUs throttle at around 90°C (I don't remember the exact value).
Maybe if he turned off C-states while doing this test he would get different results in GTA V, as it might be the C-states profile of his OS/motherboard which prevents HT on from at least getting close to the HT off results.
A 100MHz difference in clock rate certainly doesn't justify around 30% higher fps with HT off.
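(The arithmetic behind that last sentence, spelled out:)

[code]
# 100MHz on top of 4.5GHz is about a 2.2% clock advantage -- far short of
# the ~30% fps gap seen in that GTA V test, so the clocks can't explain it.
base_mhz, bump_mhz = 4500, 100
print(f"clock advantage: {bump_mhz / base_mhz:.1%}")  # -> 2.2%
[/code]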
 
Hyperthreading has always been (and continues to be) detrimental to gaming average framerates. Period. I really don't know why people just don't get this. Do your own tests; you'll see it immediately. Windows even boots faster!

However, ironically, in my own testing with GTA V in particular, whilst performance was superior with HT off, there was definitely less hitching/stuttering with HT on.

So I guess it depends on how you qualify your gaming experience...
 
Hyperthreading has always been (and continues to be) detrimental to gaming average framerates. Period. I really don't know why people just don't get this. Do your own tests; you'll see it immediately. Windows even boots faster!

However, ironically, in my own testing with GTA V in particular, whilst performance was superior with HT off, there was definitely less hitching/stuttering with HT on.

So I guess it depends on how you qualify your gaming experience...

HT off, and especially C-states off, both reduce the DPC latency I measured with LatencyMon, not only when the CPU is idle, but also when you switch to a high-performance power profile and run some apps. I believe I experienced smoother/snappier gameplay in BF4 with both turned off, but I am not really willing to have my system, which runs 24/7, consume an extra ~100W at idle for that.
Plus, I am not sensitive enough to make a confident statement on whether it really was smoother/stutter free. Just like with input lag, however: not being able to tell the difference between, say, 20ms and 10ms of input lag does NOT mean it does not affect your game performance. Given two equally good professional gamers, the one with 10ms will on average get more headshots/kills than the one playing with 10ms of extra input lag.

I can see, however, that someone doing professional audio recording would want to turn both of those off for the least amount of latency.
 
To the OP:
I have a similar system to yours, but with a 390X, and my 3570k is at 4.6GHz. I mostly play BF1 and plan on going Ryzen shortly. I thought about the 7700k, but no one is guaranteed 5.0GHz, and to me 4 cores just isn't going to be enough, especially for BF1. With AM4 you are getting a new platform that will be upgradeable for several years.
 
To the OP:
I have a similar system to yours, but with a 390X, and my 3570k is at 4.6GHz. I mostly play BF1 and plan on going Ryzen shortly. I thought about the 7700k, but no one is guaranteed 5.0GHz, and to me 4 cores just isn't going to be enough, especially for BF1. With AM4 you are getting a new platform that will be upgradeable for several years.

Your fps in BF1 will most likely drop going from a 3570k @ 4.6GHz to a Ryzen @ 4GHz. At best, I expect your fps to remain the same. So unless you are planning to use the Ryzen for stuff like working with Blender, encoding videos, etc., you will be wasting your money.
 
Your fps in BF1 will most likely drop going from a 3570k @ 4.6GHz to a Ryzen @ 4GHz. At best, I expect your fps to remain the same. So unless you are planning to use the Ryzen for stuff like working with Blender, encoding videos, etc., you will be wasting your money.
I get it, but I'm not worried about FPS as much as I am about smooth gameplay. And right now the 3570k chokes a dick on Conquest.
 
Here is a video that shows the 3570k @ 4GHz, the 1700X @ 4GHz, and the 6700k @ 4GHz side by side.



Note that GTA V is one of the few games which makes use of more than 4 cores and is quite CPU intensive. Even here, the Ryzen is only slightly faster than the 3570k @ 4GHz, and the 3570k @ 4.6GHz would be about on par. The Ryzens do not really clock much higher than 4GHz, from what I have seen in various reviews.

In most games the 3570k @ 4GHz beats the Ryzen, let alone at 4.6GHz.

see here http://www.pcworld.com/article/3176...-or-why-you-should-never-preorder.html?page=2

and here

BF1 is part of that video. The 3570k @ 4GHz is almost on par with the Ryzen here. At 4.6GHz it would be equal to or surpass a Ryzen clocked at 4GHz.

In the future, when games are better optimized for Ryzen, the gap might widen, but by then you will have better upgrade options anyway.
 
I get it, but I'm not worried about FPS as much as I am about smooth gameplay. And right now the 3570k chokes a dick on Conquest.

If a 3570k "chokes a dick" on Conquest, I doubt the Ryzen won't too. Maybe you need to check the temps on your 3570k and de-dust/replace your fans if you haven't done so in a long time.
 
If a 3570k "chokes a dick" on Conquest, I doubt the Ryzen won't too. Maybe you need to check the temps on your 3570k and de-dust/replace your fans if you haven't done so in a long time.

Temps are fine (60°C). Case is clean. I use the RivaTuner OSD, so when I say choking a dick, I mean the cores are above 90% utilization. Thanks for your concern. I can't take any review seriously if they are playing BF1 in single player. I have a 144Hz 1440p FreeSync monitor. I cap everything at 142 anyway and crank up the settings.
 
Temps are fine (60°C). Case is clean. I use the RivaTuner OSD, so when I say choking a dick, I mean the cores are above 90% utilization. Thanks for your concern. I can't take any review seriously if they are playing BF1 in single player. I have a 144Hz 1440p FreeSync monitor. I cap everything at 142 anyway and crank up the settings.

That has nothing to do with the 3570k struggling in BF1, if the engine is similar to BF4's. With HT off on my 5960x playing BF4, it will max out 4 cores while the rest of the cores are almost idle. With HT on, it shows less usage but actually gets hotter. So whenever you see reviews with the Ryzen showing low CPU usage because of the 8 additional virtual (not real) cores, I would take that with a grain of salt.

Sure, if I were building a new system, I would rather go with a Ryzen than a 3570k, but only because the Ryzen can also double as a high-performance workstation. For gaming alone, the 3570k is on par with or just a little below the Ryzen in games which make intensive use of 4 or more cores, while in most other games the 3570k @ 4.6GHz leaves the Ryzen behind. Certainly not worth spending $1k on a motherboard+CPU+RAM upgrade if you are mostly gaming with your rig.
 
1k? $600 max. All I need is the CPU and mobo.

I might be mistaken here, but isn't Ryzen using DDR4 RAM, whereas the 3570k uses DDR3? Also, will you be going for the 1800X or the 1700X? Isn't the 1800X $500 alone? A good motherboard is usually around $200, plus the DDR4 RAM at $200 or so. So maybe not $1k, but around $800-900, or $700-800 if you go for the 1700X instead.
 
You're assuming I don't have any DDR4 RAM already that I didn't pay for. I won some G.Skill at a LAN party.
 
You're assuming I don't have any DDR4 RAM already that I didn't pay for. I won some G.Skill at a LAN party.

Indeed, when I calculated around $1k, I did not account for DDR4 sticks possibly won at a LAN party :D

Either way, it will be $600 wasted if gaming is the main task for this CPU, in my opinion. Personally I would wait for the next 10nm CPU generation and then strike.

edit: Intel's 10nm Cannon Lake CPUs are supposed to come out before the end of 2017

http://www.kitguru.net/components/c...-confirms-10nm-cannon-lake-still-coming-2017/
 
Zombie, just let them buy it. You aren't going to talk them out of it. I would be interested in seeing some benchmarks, though, because I have several games that appear to use all of my CPU for some reason (playing a game leads to all cores maxing out), yet I can still decode and stream 5 to 7 streams from the Plex server while encoding videos to add to the Plex library. I'm not saying anyone is wrong, but let us see where a Ryzen and a 280X or 390X gets them, and hopefully it is better than where they are currently at.
 
You might want to watch this video.

Skip to 13:03 if you are only interested in the BF1 comparison between Ryzen and the i7-7700K.

But watch the whole thing for context.

 
You might want to watch this video.

Skip to 13:03 if you are only interested in the BF1 comparison between Ryzen and the i7-7700K.

But watch the whole thing for context.



Then someone is lying about the benchmarks, or so it seems.

Here http://www.gamersnexus.net/hwreview...review-premiere-blender-fps-benchmarks/page-7

Both the 6900k and the 7700k beat the Ryzen 1800X, even at stock speeds.

However, the benchmarks DO look a bit fishy, considering the 6900k overclocked to 4.4GHz gives worse results for minimum fps (given I am interpreting the 0.1% lows properly) than at stock speeds. It's also quite obvious that both the 6900k and 7700k are GPU limited when it comes to average fps.

At the bottom he has the results for 1440p as well. Again, the 6900k and 7700k beat the 1800X, though not by much. What we need is a video of a side-by-side comparison showing a 7700k @ 5GHz vs. an 1800X at 3.9GHz, both with a GTX 1080 Ti. That would be a realistic scenario. Resolution and details should be set such that the game becomes CPU limited.
 
Anyway, back to the point and the original question the OP was asking. If you own a 3570k which is capable of 4.6GHz, then it's not worth switching to either a 7700k or a Ryzen 1700, or even an 1800X, when it comes to playing Battlefield 1. You will at best see a 15% increase in fps at low resolutions, whereas at high resolutions you will most likely be GPU limited anyway.

If you are building a new system, a Ryzen seems to make more sense than a 7700k, because the gaming performance is almost equal, yet it doubles as a workstation CPU as well.

As a 3570k owner, waiting till the end of 2017 for 10nm Cannon Lake CPUs would make more sense. Not sure if AMD will have anything new out by then as well.

This is even more true for the OP, who is still using a 280X and is heavily GPU limited. A 1080 Ti or Vega GPU would be the best choice here.
 
As for those claiming that the 7700k is maxing out in BF1 in various side-by-side comparisons with a Ryzen, and who assume a Ryzen would deliver much more because they misinterpret the CPU utilization: I found a review, which however is in German. You will have to use a translator.

It compares the Ryzen, 6900k and 7700k at low resolutions, 1280x720 and 1920x1080, where the games become CPU limited. It shows that the Ryzen is behind the 7700k by quite a lot in almost all games. In games which seemingly make use of more than 4 cores, the Ryzen can win over the 7700k, but it never wins against the 6900k.

see here https://www.computerbase.de/2017-03.../#diagramm-battlefield-1-dx12-multiplayer-fps

Specifically in BF1 DX12 at 1280x720, the 7700k is far out in front.
 
Will you just ignore the fact that his CPU is running at 100% just to keep up with the GPU?
Also, what you are saying is not entirely true: in Doom 2016's OpenGL mode I ran into a CPU bottleneck at 2560x1600. Changing to Vulkan solved it thanks to improved multicore distribution.


@OP
Without price tags it's hard to recommend the best upgrade.
CPU usage is an inadequate gauge of bottlenecking. If it were, we would see the same frame rate from two entirely different CPUs, just with lower usage on the more efficient one (output over input). What we actually see is similar CPU usage, with the only result we care about, the frame rate, being higher. Testing for a CPU bottleneck is as simple as it was in the 90s: if the frame rate does not budge when you drop to a lower resolution, the CPU is the bottleneck.
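(As a sketch, that test reads like this; run_benchmark here is a hypothetical stand-in for however you measure average fps at a given resolution, and the 5% tolerance is an arbitrary choice.)

[code]
# Pseudo-ish Python sketch of the 90s-style test. run_benchmark is a
# hypothetical stand-in for measuring average fps at a given resolution.
def cpu_bottlenecked(run_benchmark, tolerance=0.05):
    fps_high = run_benchmark(2560, 1440)
    fps_low = run_benchmark(1280, 720)
    # If shrinking the GPU's workload barely moves the frame rate,
    # the CPU is what's holding it back.
    return (fps_low - fps_high) / fps_high < tolerance
[/code]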
 
I skimmed the thread. I can't directly comment on the bottleneck issue, but experience says the GPU is the proper upgrade.
I built a similar system a few years back for a friend who is a die-hard Battlefield player. It's an i5-4690K @ 4.4GHz, so just barely faster than yours. He went from a GTX 760 to a 980 Ti, which is now pushing 1440p on ultra settings in BF1 at 60-80fps (100+ fps in BF4).
So a GTX 1060 (6GB) or a 1070 will do you wonders.
 
I am also torn between a new build involving the Ryzen 1700 or a high-end Intel chip. This would be a mobo+RAM+CPU combo build; everything else is taken care of. I'm currently running a Frankenstein setup (a virtual machine in ESXi with PCI passthrough for video and sound, on an Ivy Bridge Xeon) and I can play recent games at medium settings, but of course I'd like more performance. I do more than just game, but nothing that would saturate 8c/16t, so either chip would work fine.
 