Got an i5-2500? Don't bother upgrading to a 4790

So he disables one of the most important differences between the i5 and the i7, namely 4 threads vs. 8 threads, and then showcases how little change there is? Mostly in 10+ year-old games?
 
@M76 - Maybe he meant 4690? ;)

But he's a bit off. The biggest differences between the sockets are native support for faster memory, PCIe 3.0, and of course a slightly higher native clock rate. You won't see these differences just by looking at FPS.
 
The biggest differences are power consumption and PCIe 3.0. Higher memory clocks don't matter much, and overclocking headroom on Haswell is lower than on Sandy Bridge. Emulators like Dolphin greatly benefit from the Haswell architecture; AVX2 and FMA3 might matter to some people. IPC is higher, which usually negates the overclock advantage.
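For anyone wondering what FMA3 actually buys: here's a minimal sketch in plain C intrinsics (not Dolphin's code, just an illustration with made-up function names) of the same multiply-add loop written for Haswell and for Sandy Bridge.

/* Build with something like: gcc -O2 -march=haswell fma_demo.c
   Both loops compute y[i] = a[i] * b[i] + c[i], 8 floats at a time,
   and assume n is a multiple of 8. */
#include <immintrin.h>

void madd_fma(const float *a, const float *b, const float *c, float *y, int n)
{
    for (int i = 0; i < n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        __m256 vc = _mm256_loadu_ps(c + i);
        /* FMA3 (Haswell and newer): a*b+c in a single instruction */
        _mm256_storeu_ps(y + i, _mm256_fmadd_ps(va, vb, vc));
    }
}

void madd_avx(const float *a, const float *b, const float *c, float *y, int n)
{
    for (int i = 0; i < n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        __m256 vc = _mm256_loadu_ps(c + i);
        /* AVX only (Sandy Bridge): separate multiply, then dependent add */
        _mm256_storeu_ps(y + i, _mm256_add_ps(_mm256_mul_ps(va, vb), vc));
    }
}

On Haswell the whole a*b+c collapses into one instruction, on Sandy Bridge it's a multiply followed by a dependent add, which is presumably where a chunk of the Haswell-only gains in float-heavy inner loops come from.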

But I would argue that for 90+% of people with 1366 systems there is very little reason to upgrade, let alone for those on a more modern 1155 system.
 
How does Haswell specifically help Dolphin?

(I just found Dolphin, and it's really awesome. I'm able to crank up AA and the internal rendering resolution, output at 1080p, and still run 60 fps in almost all games. I couldn't believe how nice everything looked after playing at 480p for so long. It's sad they did not put better hardware into the Wii; the games obviously would have benefited.)
 
Only Serious Sam 3 would show any benefit for anyone not using a 120 Hz monitor.

Very interesting. I would like to know what clock speeds he ran the two processors at. I would also like to see some newer games, but the inclusion of Crysis 3, Tomb Raider (2013), BioShock Infinite, and Far Cry 3 is great. I couldn't imagine the results being very different for many other AAA games, with the exception of Battlefield.

Was starting to get the itch to upgrade my 2500K, but instead pushed my OC from 4.4 to 4.6 GHz.

The other interesting thing is that these results are all at 2560x1600. Good for me, since I game at 1440p.

EDIT: For all the lazy people:


[Attached charts: Chart.png, 2600K vs 4770K.png, 3970X vs 4770K.png]
 
More than the lack of reasons to upgrade, I am curious how Serious Sam 3 and, to a lesser degree, Stalker managed better results with Haswell. :confused:
 
I'd like to see a comparison done on a CrossFire system with a 2600K and a 4790K at the same clocks, but with the 4790K running 2400 MHz memory; it seems like CrossFire/SLI benefits quite a bit from more memory bandwidth.
 
Wow..... this is pretty awesome. I have had my 2500K @ 4.8 GHz for 3 years now. I actually was ready to build an entirely new rig. Guess I don't have to anymore; just a case upgrade for me. I will be encoding video, but most of the time I'll just queue a few up to encode and go to bed.
 
I think the thread title should be changed to "Why not to upgrade your CPU when you primarily run software that is not CPU bound".
 
I wonder if things will be different when using 2 GPUs.

I want to get a new 3440x1440 monitor, and I wasn't happy with the benefit of adding another 970 to my 2500K system, so I wondered if my CPU is holding me back from getting better performance out of the SLI setup.

my post from the 3440x1440 thread:
Tried the SLI 970s today and wasn't happy with the results at 1920x1200.
I still have my first MSI 970 (65% ASIC), which I need to return tomorrow - I wasn't happy with it and got another one with 75% ASIC, which clocks a lot better, so I am keeping that one.

Anyway, the results I got were really disappointing, so it doesn't make sense to have two with my CPU. I didn't think the bottleneck would be that big with an i5-2500K @ 4.4 GHz (clocked down a bit due to some stability issues lately).

Unigine Valley:
stock (1306/1750): single 3100, SLI 4500
OCed (1400/1800): single 3200, SLI 4550
OCed (1500/1900): single 3400

In SLI the upper card runs about 10°C hotter and hits 75°C at 1600 RPM (only 1 free slot between them).

In the Metro LL benchmark I got 75 single, 89 SLI.
In Watch Dogs, SLI is consistently about 10 fps worse, with much bigger fps drops compared to single.

OCing doesn't seem to make much difference in SLI, since I am already CPU bottlenecked at stock clocks in SLI.

Couldn't test 3DMark; I get a DX error after the first test (IDXGISwapChain::SetFullscreenState failed) on both 344.48 and 344.60.
So much for the NVIDIA "better drivers" - I've been an ATI guy for the last couple of years (2900 XT, 4870, 6950/6970, R9 290 to name the most recent); my last NVIDIA card was a 16 MB Riva TNT ;o]
(Actually I have a passive 7300 GT in my non-gaming rig and also a passive Asus DirectCU 640 GT that I bought as a backup/hybrid PhysX card, which is awesome since it has 95% ASIC and will not go over 70 degrees in benchmarks while passive and with stock TIM.)

I don't want to spend another 2k on a CPU upgrade; the monitor with stand and the second card are already almost 2k, and I want to upgrade my audio setup, which will be another 3k, so it seems I am sticking with 1920x1200 for a while ;o[

A few months ago I got a "used", never-out-of-the-box 24" IPS HP ZR24w with 0 hours of use according to the OSD for 150 USD to replace my old 1920x1200 24" TN, so I don't really need a new LCD, but it was finally a panel I was interested in and I really wanted it. I just don't want to cough up 4k for the LCD upgrade, as it would require a major system upgrade.


Or is it normal to only get that little of a performance increase from a second GPU? (I've never had an SLI/CF setup before.)
I only got 15-45% better framerates compared to a single GPU, and in some games it was even worse than with a single GPU.
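Just to put numbers on that, here's a throwaway check in C of the scaling from the scores above (assuming the Valley score scales roughly like framerate; the gain() helper is just for the example):

#include <stdio.h>

/* Percent improvement of the SLI result over the single-card result. */
static double gain(double single_score, double sli_score)
{
    return (sli_score / single_score - 1.0) * 100.0;
}

int main(void)
{
    printf("Valley stock: %.0f%%\n", gain(3100, 4500)); /* ~45% */
    printf("Valley OCed:  %.0f%%\n", gain(3200, 4550)); /* ~42% */
    printf("Metro LL:     %.0f%%\n", gain(75, 89));     /* ~19% */
    return 0;
}

So the best case I'm seeing is roughly 45%.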


 
Sorry, but I don't buy it... Sure for older games, you may not need to upgrade, but there is definitely a noticeable difference on newer titles and even regular old office type software on a newer system.
 
I wonder if things will be different when using 2 GPUs ... Or is it normal to only get that little of a performance increase from a second GPU? I only got 15-45% better framerates compared to a single GPU, and in some games it was even worse than with a single GPU.

It all comes down to the game and nVidia's drivers. SLI has relatively poor scaling compared to Crossfire, at least XDMA Crossfire, and the small dies never really scaled well in the first place (Gxxx4 dies). That said, if the games have proper SLI profiles, you should be seeing about 60-80% gains. Are you sure the games aren't automatically increasing your quality?

Sorry, but I don't buy it... Sure for older games, you may not need to upgrade, but there is definitely a noticeable difference on newer titles and even regular old office type software on a newer system.

It depends on the game, the resolution, and how the computer is used. Many new games do not benefit; then again, many do. Most are GPU bound until tri-SLI/CrossFire, at which point they become CPU bound.
 