Titanfall 2 CPU rankings.

cageymaru
Fully [H] · Joined Apr 10, 2003 · Messages: 22,132
Another new release; another reason not to upgrade. Insert game name here: Titanfall 2. The game will not load on a dual-core if it doesn't have 4 threads.
http://www.techspot.com/review/1271-titanfall-2-pc-benchmarks/page3.html

Anyway, from what we found, pretty much any CPU supporting four threads will play Titanfall 2 without an issue. For those interested, we found the same surprisingly low CPU utilization in the multiplayer portion of the game as well.


[Image: CPU_01.png]



Intel performance at various clock speeds.

[Image: CPU_02.png]



AMD performance at various clock speeds.

[Image: CPU_03.png]
 
A 2.5GHz (or maybe lower) 6700K matches a 4.5GHz FX 9590. Talk about a difference.

But remember, it's a multiplayer game, and like BF, it's a different world.
 
A 2.5GHz (or maybe lower) 6700K matches a 4.5GHz FX 9590. Talk about a difference.

But remember, it's a multiplayer game, and like BF, it's a different world.
And what isn't different is your inability to read the quote in the OP. They had the same results with MP as with SP. And leave it to you to make sure you point out any Intel WIN in any skewed way you can. Fact is, at stock NO ONE can tell the difference. I would say no one is gonna downclock their CPU to play games, but I'm sure there is someone somewhere doing it.
 
I just posted the downclocked stats to show that even people with locked Xeon, i5, and i3 processors can run it at full speed. :)
 
And leave it to you to make sure you point out any Intel WIN in any skewed way you can.

Intel has one very important win - Their CPUs don't guzzle 250-300W of power (FX9590) to achieve that performance level. Unless I live at the north pole, I don't need a fusion reactor on my desk cranking out that much heat.

I would say no one is gonna downclock their CPU to play games, but I'm sure there is someone somewhere doing it.

Quite a few people underclock the FX9590 because of the excessive TDP overwhelming their cooling systems.
 
Intel has one very important win - Their CPUs don't guzzle 250-300W of power (FX9590) to achieve that performance level. Unless I live at the north pole, I don't need a fusion reactor on my desk cranking out that much heat.



Quite a few people underclock the FX9590 because of the excessive TDP overwhelming their cooling systems.

I thought the FX-9590 was a 220w CPU?
https://www.google.com/search?q=fx-...e..69i57j0.10068j0j8&sourceid=chrome&ie=UTF-8

How does 220w become 300w at stock? I'm not understanding? Are those processors that your friends are running defective? Who would underclock a FX-9590? I can undervolt my FX-9370 and still hit FX-9590 speeds. Is that what you mean?
 
Intel has one very important win - Their CPUs don't guzzle 250-300W of power (FX9590) to achieve that performance level. Unless I live at the north pole, I don't need a fusion reactor on my desk cranking out that much heat.



Quite a few people underclock the FX9590 because of the excessive TDP overwhelming their cooling systems.

:ROFLMAO: not! We can handle the power, take care of the heat and want more from it :rage:
 
Intel has one very important win - Their CPUs don't guzzle 250-300W of power (FX9590) to achieve that performance level. Unless I live at the north pole, I don't need a fusion reactor on my desk cranking out that much heat.



Quite a few people underclock the FX9590 because of the excessive TDP overwhelming their cooling systems.
Who downclocks a 9590 when they could have saved a ton of money and bought the 8350 or 8320e? You're just making up stuff to make yourself seem relevant.
 
And the Core i3 4360, as well as the i5 2500K, are also nearly as good as the 6700K. A couple of frames off; no one would be able to tell the difference with those either. So the game is GPU-limited, it seems.

I would expect an FX 9590 to achieve that at 1080p. Hell, an FX 4320 is right up there as well.

So what's the big deal about getting the same fps @ 1080p? I have ZERO problems running any game I own at 1080p with my Phenom II 1055T. The GTX 680 is the bottleneck.
 
I get tired of all this division and argument over a hobby that is supposed to be fun.

Most of you people need counseling.

People wear their corporations on their sleeves. I can see fighting for them if you got paid but come on.. most don't.
 
Is it a bad thing when 4 year old processors can still max out a brand new AAA game?:confused:
Not really.

What is bad is when a 4-year-old flagship CPU with ridiculous motherboard and cooling requirements is only keeping up with an entry-level CPU that can work in a literal potato.
 
The game runs well is the only take I see in all of this; the rest is just hyperbole. These forums have gotten more toxic over the years, which is mind-boggling. Anything positive and someone has to rush in and be negative, and it's even worse in the GPU section. No wonder forum posting at this site seems way down anymore; a shame, it was always a good place to have a discussion on tech.
 
Not really.

What is bad is when a 4-year-old flagship CPU with ridiculous motherboard and cooling requirements is only keeping up with an entry-level CPU that can work in a literal potato.
Yet when "keeping up" is literally maxing out the GPU power available I really can't see how this is a negative. People with a 2500k (almost 5 years old!) are seeing this benchmark and realizing that in certain usage scenarios they have absolutely no reason to upgrade.
 
All I see is that it takes a 9590 to match an i3-6100.
Not quite, and I love how so many of you are quick to jump on the single-graph train. There is always more to the story than these graphs, like frame times and how stable those frame times are. Way too busy looking at avg/min/max rather than considering the quality of the gameplay. Like so many of the DX12 haters who look at a graph and claim DX11 is better when in reality the gameplay on DX12 is far more consistent (not making any claim about all games, just a simple point).
 
Not quite, and I love how so many of you are quick to jump on the single-graph train. There is always more to the story than these graphs, like frame times and how stable those frame times are. Way too busy looking at avg/min/max rather than considering the quality of the gameplay. Like so many of the DX12 haters who look at a graph and claim DX11 is better when in reality the gameplay on DX12 is far more consistent (not making any claim about all games, just a simple point).
1. Frame times are irrelevant when the averages are inferior, as they are with every non-FX-9590 AMD CPU in this test. Let alone absolute minimums.
2. You mean BF1, right? Because we all know what a disaster it is *on stability of frame rates alone*.

Yet when "keeping up" is literally maxing out the GPU power available I really can't see how this is a negative. People with a 2500k (almost 5 years old!) are seeing this benchmark and realizing that in certain usage scenarios they have absolutely no reason to upgrade.
Because anything slower fails to do so. Sure, Sandy owners are in their usual mood, but this ain't no Intel forums.
 
Intel has one very important win - Their CPUs don't guzzle 250-300W of power (FX9590) to achieve that performance level.
This is such a moot point: no gamer will ever choose a CPU based on power efficiency. And honestly, between 80 FPS and 140 FPS you can't tell the difference with a 60Hz TV as your display. I am grateful that the new crop of games favors quad cores; we'll just slap in a cheap AMD quad-core and call it a day!
 
1. Frame times are irrelevant when the averages are inferior, as they are with every non-FX-9590 AMD CPU in this test. Let alone absolute minimums.
2. You mean BF1, right? Because we all know what a disaster it is *on stability of frame rates alone*.


Because anything slower fails to do so. Sure, Sandy owners are in their usual mood, but this ain't no Intel forums.
See, this is why we can't take you seriously.

LOOK at the graph in this thread again.

NO LOOK.

Now look at what you posted...

See the issue here? The non-9590s have minimums just 3 fps shy of the top and just 4 fps shy of the avg... So you're sure you're talking about the graph on this page?

At any rate, what you said, no matter the thread, is asinine. There's more than enough user feedback on 2-core CPUs, even the HT ones, to ascertain that the results, even on this page, make it a win over AMD's 5-year-old CPUs.

Not sure what exactly you are trying to peddle here.
 
See, this is why we can't take you seriously.
Just like I can't take some of you seriously when you start talking about AMD.

See the issue here? The non-9590s have minimums just 3 fps shy of the top and just 4 fps shy of the avg... So you're sure you're talking about the graph on this page?
Yes, it is less than the i3-6100. Not in an experience-affecting way, true. But less than the i3-6100 nonetheless.

Not sure what exactly you are trying to peddle here.
Just stating what is written on the graph, nothing less.
 
lol at this thread. Basically this game runs well on virtually any CPU from the last 5 years. Looks like the game is also thread-limited or maybe just GPU-limited. Anyone with a 60Hz monitor will see no (zero) benefit from virtually any faster CPU in this game; that's the bottom line. Fast Sync might be usable in this game as an afterthought, which could reduce latency some for those 60Hz monitor users.

Another note: the FX 9590 is rated at 220W. You might pull that at 100% CPU usage, but I doubt this game will remotely keep the FX 9590 at 100% usage, meaning it will be much more like 125W, if that. Some here really show a lack of understanding.
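To put a rough number on that claim, here's a minimal sketch. It assumes a hypothetical ~30W idle draw and simple linear scaling of power with utilization, which real CPUs don't follow exactly (voltage/frequency scaling changes the curve), but it shows how a partial game load lands near 125W:

```python
# Back-of-envelope only: real CPU power draw is not linear in
# utilization, but a simple linear model is enough to illustrate why
# a game that only partially loads an FX-9590 won't come near its
# 220W rating.
IDLE_W = 30.0        # assumed idle package power (illustrative figure)
FULL_LOAD_W = 220.0  # AMD's rated figure for the FX-9590

def estimated_draw(utilization: float) -> float:
    """Linearly interpolate package power between idle and full load."""
    return IDLE_W + (FULL_LOAD_W - IDLE_W) * utilization

print(round(estimated_draw(0.5)))  # 125 -> ~125W at 50% utilization
```

Treat this as an illustration of the argument, not a measurement.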
 
lol at this thread. Basically this game runs well on virtually any CPU from the last 5 years. Looks like the game is also thread-limited or maybe just GPU-limited. Anyone with a 60Hz monitor will see no (zero) benefit from virtually any faster CPU in this game; that's the bottom line. Fast Sync might be usable in this game as an afterthought, which could reduce latency some for those 60Hz monitor users.

Another note: the FX 9590 is rated at 220W. You might pull that at 100% CPU usage, but I doubt this game will remotely keep the FX 9590 at 100% usage, meaning it will be much more like 125W, if that. Some here really show a lack of understanding.
Trust me, I know. My [email protected] and 290@1100 pull 250-350W together at the wall during gaming. Unfortunately, most of the anti-AMD brigade has never even touched an FX and has no idea how they operate on any given day.
 
Anyway, from what we found, pretty much any CPU supporting four threads will play Titanfall 2 without an issue. For those interested, we found the same surprisingly low CPU utilization in the multiplayer portion of the game as well.

Found this out with another game. Trashed a tri-core from a PC someone gave me and grabbed a used quad-core from eBay on the cheap, lol. So lame that game devs do this. There was a patch to remove the check, but it was bundled with removing the DRM check as well, so it got taken down everywhere.
 
Found this out with another game. Trashed a tri-core from a PC someone gave me and grabbed a used quad-core from eBay on the cheap, lol. So lame that game devs do this. There was a patch to remove the check, but it was bundled with removing the DRM check as well, so it got taken down everywhere.
Kinda two-fold. On one hand, it's great having everyone able to play. On the other, with >4-core CPUs already out there, 2 cores are only holding us back. I am not one to balk at someone trying to save money, or at those who actually lack it, so I hate pushing for 4-core minimums, but I think we are at a point where we need to take that next step.
 
I thought the FX-9590 was a 220w CPU?
https://www.google.com/search?q=fx-...e..69i57j0.10068j0j8&sourceid=chrome&ie=UTF-8

How does 220w become 300w at stock? I'm not understanding? Are those processors that your friends are running defective? Who would underclock a FX-9590? I can undervolt my FX-9370 and still hit FX-9590 speeds. Is that what you mean?

Thermal Design Power is not the maximum the processor can draw; it's measured under a nominal load. CPUs often exceed their TDP ratings under certain types of loads, especially balls-to-the-wall loads. The 220W figure listed by AMD is rather conservative; benchmarks have shown the chip pulling up to 300W even at stock clocks.

Another example would be an i5-2400, which has a 95W TDP. Under peak loads it can pull 150W, or ~58% more power than the TDP specifies.
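As a quick sanity check of those percentages, a one-liner over the figures quoted above (150W peak vs. a 95W TDP for the i5-2400, and the benchmarked 300W vs. the 220W rating for the FX-9590):

```python
def pct_over_tdp(measured_w: float, tdp_w: float) -> float:
    """Percentage by which a measured draw exceeds the rated TDP."""
    return (measured_w / tdp_w - 1) * 100

# Figures from the post above.
print(round(pct_over_tdp(150, 95)))   # 58 -> ~58% over TDP (i5-2400)
print(round(pct_over_tdp(300, 220)))  # 36 -> ~36% over TDP (FX-9590)
```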

This is such a moot point: no gamer will ever choose a CPU based on power efficiency.

And you'd be wrong there. If you live in a hot climate where electricity is expensive, you have the choice of sweating your ass off in a 90-100F room or having the A/C cranking away all day, giving you $350 power bills. Alternatively, you can pick components that won't crank out tons of heat and game in a nice cool room without the $350 power bill.
 
I had one FX build with an HD 7950 and a 500W Corsair power supply. According to you, it should have spontaneously combusted.

I had an FX-9370 build with (2) HD 7950s, a GTX 670 for PhysX, (4) drives in RAID, (3) SSDs, (2) water pumps, (10) fans because they were free from Tiger Direct, an 850W power supply, and (2) monitors. Everything OC'd.

I live in a hot environment where the temps exceed 100F in the summer. We run the A/C 24/7. The A/C is still on here in North Carolina because it's still hitting 80F in the past week. My A/C is set to 70F and sometimes 67F. Probably won't turn it off until January. Every light in my house is on 24/7 because I am a klutz and my mom is legally blind.

I have never seen a $350 electricity bill in my 40+ years of existence on this planet. I've never seen a $300 electricity bill. How much more stuff do I need to add to my PC to get a $350 electricity bill?
 
I live in a hot environment where the temps exceed 100F in the summer. We run the A/C 24/7. The A/C is still on here in North Carolina because it's still hitting 80F in the past week. My A/C is set to 70F and sometimes 67F. Probably won't turn it off until January. Every light in my house is on 24/7 because I am a klutz and my mom is legally blind.

I have never seen a $350 electricity bill in my 40+ years of existence on this planet. I've never seen a $300 electricity bill. How much more stuff do I need to add to my PC to get a $350 electricity bill?

And I'm going to call bullshit on that. Either you've never paid an electrical bill in your life or you're grossly exaggerating your A/C usage.

Let's do some real math and calculate what your power bill should be if everything you said is true. The average whole-house HVAC system pulls 3-5 kW. We'll split the difference and go with 4 kW; running that 24/7 for an entire month and factoring in the average kWh cost for NC (12 cents per kWh), the HVAC alone would be $350.44. And just for kicks, let's add in your gaming rig, assuming it's running balls to the wall 24/7 (you know, since you may be mining coins or running SETI@home while you're away); that's an additional $74.47, for a grand total of $424.91. And this doesn't factor in your lights, stove, water heater, etc. You should easily be seeing $500+ bills, assuming the latter two aren't gas.
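For what it's worth, that arithmetic can be reproduced like this. The 4 kW HVAC draw, 730-hour month, and 12¢/kWh NC rate are the post's own figures; the 850W rig is an assumed number chosen to land near the quoted $74, so treat all of these as illustrative:

```python
# Reproducing the bill math above with the post's figures plus one
# assumed number (an 850W gaming rig running flat out 24/7).
RATE = 0.12            # $/kWh (NC average per the post)
HOURS_PER_MONTH = 730  # ~24/7 for a month

hvac_cost = 4.0 * HOURS_PER_MONTH * RATE   # 4 kW HVAC
rig_cost = 0.85 * HOURS_PER_MONTH * RATE   # 850W rig

print(f"HVAC:  ${hvac_cost:.2f}")             # ~$350
print(f"Rig:   ${rig_cost:.2f}")              # ~$74
print(f"Total: ${hvac_cost + rig_cost:.2f}")  # ~$425
```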

As for your claim that temperatures exceed 100F in the summer, let's look at historical data for NC:

https://weatherspark.com/history/29589/2016/Wadesboro-North-Carolina-United-States

After a random sampling of 10 cities, this is the only one I could find that exceeded 100F, for a grand total of one day in 2016. The other cities sampled topped out between 85-95F.
 
And I'm going to call bullshit on that. Either you've never paid an electrical bill in your life or you're grossly exaggerating your A/C usage.

Let's do some real math and calculate what your power bill should be if everything you said is true. The average whole-house HVAC system pulls 3-5 kW. We'll split the difference and go with 4 kW; running that 24/7 for an entire month and factoring in the average kWh cost for NC (12 cents per kWh), the HVAC alone would be $350.44. And just for kicks, let's add in your gaming rig, assuming it's running balls to the wall 24/7 (you know, since you may be mining coins or running SETI@home while you're away); that's an additional $74.47, for a grand total of $424.91. And this doesn't factor in your lights, stove, water heater, etc. You should easily be seeing $500+ bills, assuming the latter two aren't gas.

As for your claim that temperatures exceed 100F in the summer, let's look at historical data for NC:

https://weatherspark.com/history/29589/2016/Wadesboro-North-Carolina-United-States

After a random sampling of 10 cities, this is the only one I could find that exceeded 100F, for a grand total of one day in 2016. The other cities sampled topped out between 85-95F.

I am going to claim someone is lying because it does not fit my preconceived notions of what I think I know. Here, because math... :D Good luck with your high electric bills; mine usually averages about $42 a month over a 12-month stretch. Want to call BS on that as well? Oh, and I run an FX 8300 at 4.5 GHz, and now I run 2 x Sapphire Furys.
 
I am going to claim someone is lying because it does not fit my preconceived notions of what I think I know. Here, because math... :D Good luck with your high electric bills; mine usually averages about $42 a month over a 12-month stretch. Want to call BS on that as well? Oh, and I run an FX 8300 at 4.5 GHz, and now I run 2 x Sapphire Furys.

~5,000 kWh a year at 10 cents? The US average is 10,800 kWh. You would be quite the "eco-friendly" American if that's actually true.

https://www.eia.gov/tools/faqs/faq.cfm?id=97&t=3
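That ~5,000 kWh figure is simple arithmetic, assuming a flat ~10¢/kWh rate (actual co-op billing will differ):

```python
# Back-calculating annual usage from the $42/month average bill,
# assuming a flat ~$0.10/kWh rate.
monthly_bill = 42.0  # dollars
rate = 0.10          # $/kWh

annual_kwh = monthly_bill / rate * 12
print(round(annual_kwh))  # 5040 -> ~5,000 kWh/year vs the ~10,800 kWh US average
```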
 
This is such a moot point: no gamer will ever choose a CPU based on power efficiency. And honestly, between 80 FPS and 140 FPS you can't tell the difference with a 60Hz TV as your display. I am grateful that the new crop of games favors quad cores; we'll just slap in a cheap AMD quad-core and call it a day!

Yeah, except if you're trying to build an SFF machine, where you're really restricted in the type of CPU and the cooling solution you can use with it.
 
This is such a moot point: no gamer will ever choose a CPU based on power efficiency. And honestly, between 80 FPS and 140 FPS you can't tell the difference with a 60Hz TV as your display. I am grateful that the new crop of games favors quad cores; we'll just slap in a cheap AMD quad-core and call it a day!

Not true. I loved what I was seeing from Haswell as far as efficiency and heat output were concerned. It was one of the bigger selling points in my getting it, and I've not been disappointed at all. And the most strenuous task I do on my PC is gaming. I game all the time.
 
And I'm going to call bullshit on that. Either you've never paid an electrical bill in your life or you're grossly exaggerating your A/C usage.

Let's do some real math and calculate what your power bill should be if everything you said is true. The average whole-house HVAC system pulls 3-5 kW. We'll split the difference and go with 4 kW; running that 24/7 for an entire month and factoring in the average kWh cost for NC (12 cents per kWh), the HVAC alone would be $350.44. And just for kicks, let's add in your gaming rig, assuming it's running balls to the wall 24/7 (you know, since you may be mining coins or running SETI@home while you're away); that's an additional $74.47, for a grand total of $424.91. And this doesn't factor in your lights, stove, water heater, etc. You should easily be seeing $500+ bills, assuming the latter two aren't gas.

As for your claim that temperatures exceed 100F in the summer, let's look at historical data for NC:

https://weatherspark.com/history/29589/2016/Wadesboro-North-Carolina-United-States

After a random sampling of 10 cities, this is the only one I could find that exceeded 100F, for a grand total of one day in 2016. The other cities sampled topped out between 85-95F.

Don't know where you live. Maybe you should Google a little better. It seems my area has hit 105F during summers multiple times. If it has hit 105F five or more times historically, guess what? IT CAN HIT 100F TOO! Northampton County, NC
http://climate.ncsu.edu/climate/nc_extremes.php

I once pulled 1,750 kWh for a $240 bill. All lights, PC, A/C; everything on 24/7. Maybe you should invest in LED bulb technology where you live?

What kind of electricity do you buy? Gold bullion electricity? Maybe you should move to my area! We have an electric CO-OP if that matters. :)
 
The guy with the A/C on all the time: it probably averages maybe 8 hours of runtime a day when it's hitting 80F outside, not 24 hours a day, lol.
 