Another 'is Crossfire worth it?' thread - 280X

atom

Gawd
Joined
May 3, 2005
Messages
858
Rise of the Tomb Raider runs really badly no matter the settings whenever I'm outside of a tomb or other indoor area. The Witcher 3 could use some work even on medium/low. Fallout 4 has a few settings turned down, which thankfully just means less dead shrubbery. It's only getting worse from here. I'm not throwing down 500 bucks for a new card in 2016; I just bought a new house and it is not in the budget.

Okay, so here is the deal. I can get a second 280X or 7970 for less than 150 bucks. I am looking at the few benchmarks I can find and the numbers aren't too impressive, but there is a lack of data. Do any of you guys have experience doing Crossfire with these GCN 1.0 cards? Is it worth it to get me through 2016? What do you think about 3GB? To make matters worse, due to my crappy motherboard, Crossfire would be running in 4x mode. I am embarrassed to admit this to you guys. Will accept hardware donations, lol.

1080p
 
Sounds like staying put until finances warrant a new card, not CF. A 290/X would fit the bill, but they run ~$250. I was in the same boat, bought a house, but I made my 290 purchase knowing it would be my last for a while. Of course I was in worse shape: I had 2x 7770 in CF, which was still under your single 280X. Getting the 290 was so very worth it.
 
I paid $150 for a second HD 7950 just before the 290 launch. It worked quite well until Watch Dogs came out. Quite well meaning only in major releases, of course. It was a deadbeat in my PC for all the games that didn't support Crossfire. But in general those were all games that didn't need it from the get-go.

Watch Dogs was a game that I played until I beat it 100%, DLC included. Towards the end of the game I bought an R9 290. It cured all of my Crossfire stuttering, and the 4GB of VRAM made my Watch Dogs experience so much better. Of course Watch Dogs' coding was the culprit and not AMD, but in the end it's all about the end-user experience. I'd say go for it as long as the second card is dirt cheap. If the second card is priced anywhere close to the R9 290 / R9 290X series or better, I would grab a used one of those instead.
 
I wouldn't suggest CF either, due to the limitations of your board and possibly your PSU (needs ~800W). Is it good enough? You didn't mention any system specs...
What about OC'ing to get a little more life out of everything? My CPU and GPU (280X) are OC'd pretty good and I have had no issue with anything at 1080p on high or better. I haven't tried FO4 or ROTR, but TW3 ran on high at 45-50 FPS. Playable, but I couldn't get into the game... GTA5 gets 45-60 on a mix of high/very high, depending on location. CryEngine-based games get 50-60 on high. UE4 games get 50-60... Yes, some games may be limited by the 3GB VRAM, but it's pretty easy to mix settings for good results, and at 1080p I have never noticed a game having VRAM issues on my system.
 
If you can find benchmarks for your card with a close-enough hardware setup, you can verify whether or not it's your computer. If it's not, just clean the crap off your computer and try again.
If you have to shell out money, check out Polaris :) in a few months ;)
 
Sell your card and buy a used aftermarket-cooled 290X; you'll be better off.
 
I've been running 2x 280X cards for a few years now, and at 1080p it's smooth sailing. I usually get between 85% and 100% scaling (yes, 100% scaling on games like Armored Warfare).

At 4x you'll probably take a hit. I haven't hit the 3GB limit often at 1080p (1440p is another story). So... it's a viable option; I plan on using this setup until Polaris becomes available. Given the option, though, I'd sell it and go for a 290X/380X, IMHO.
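For reference, here's what those scaling percentages mean in frame-rate terms (a minimal sketch; the 40 FPS baseline is just an illustrative number, not a measured result):

```python
# What "scaling" means in FPS terms: a second card at scaling fraction s
# multiplies single-card FPS by (1 + s).
def crossfire_fps(single_fps, scaling):
    """Effective FPS for two cards at the given scaling fraction."""
    return single_fps * (1 + scaling)

print(crossfire_fps(40, 0.85))  # 74.0 FPS at 85% scaling
print(crossfire_fps(40, 1.00))  # 80.0 FPS at perfect scaling
```

At 0% scaling (no Crossfire profile), the second card does nothing and you're back to single-card FPS.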
 
Thanks for the replies, guys. My power supply should be good; I originally built the PC with a future Crossfire setup in mind. Unfortunately, I built it after a 10-year break from PC gaming and thought that a mobo advertised as "Crossfire Ready" was as good as it gets for that purpose.
I've been running 2x 280X cards for a few years now, and at 1080p it's smooth sailing. I usually get between 85% and 100% scaling (yes, 100% scaling on games like Armored Warfare).

At 4x you'll probably take a hit. I haven't hit the 3GB limit often at 1080p (1440p is another story). So... it's a viable option; I plan on using this setup until Polaris becomes available. Given the option, though, I'd sell it and go for a 290X/380X, IMHO.
Do you play Rise of the Tomb Raider?
 
Do you play Rise of the Tomb Raider?

Was planning to; the same setup plays the last TR game perfectly. But I got into AOTS & Offworld Trading Company and they are taking up all of my free time.
 
Thanks to everyone who replied. I looked at some performance details through MSI Afterburner while playing Tomb Raider today, and I actually think my problem has more to do with the CPU than anything. I have an FX-8350 and apparently, even with all 8 cores going, it's struggling with this game. Also, I am a Steam In-Home Streaming user, so when CPU usage goes high, the encoding drops out. I will try some overclocking and still consider the Crossfire option.
 
In the Steam beta, at least, there is an option to force the video card encoder for Steam In-Home Streaming. Try that. Also try getting the CPU to at least 4.0GHz; they don't come alive until then. If you have a proper motherboard you can take it much higher, but after 4.7GHz there are diminishing returns.
 
In the Steam beta, at least, there is an option to force the video card encoder for Steam In-Home Streaming. Try that. Also try getting the CPU to at least 4.0GHz; they don't come alive until then. If you have a proper motherboard you can take it much higher, but after 4.7GHz there are diminishing returns.
Yup, that's what I use. The fact that Steam streaming is a separate process is why it drops out when the game pegs the CPU. That's my theory, anyway. 'SLOW ENCODE' is my enemy. I usually hit pause and wait for it to go away and for the FPS to slowly climb back to 59 before I continue. Open areas like the Soviet Installation are pretty bad, but it's really the enemy AI that makes me drop the stream.

I will try to push my CPU to 4.0. My mobo doesn't have VRM heatsinks so anything above that is doomed.
 
Can you try freeing up 2 cores for your streaming program and see if TR still works fine on 6 cores? I think you can manually assign the program to use the freed cores as well (it might even work with just one core).
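On Windows you can do this per-launch with the built-in `start /affinity <hexmask>` command (Task Manager's "Set affinity" works too). A small sketch of how the hex mask maps to cores — the executable names in the comments are just placeholders:

```python
# Build the hex affinity mask that Windows' `start /affinity` expects.
# Bit i set in the mask = the process may run on logical core i.
def affinity_mask(cores):
    """Hex mask string for a list of logical core indices."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return format(mask, "X")

# Pin the game to cores 0-5 and leave 6-7 free for the stream encoder:
print(affinity_mask(range(6)))  # 3F  -> start /affinity 3F game.exe
print(affinity_mask([6, 7]))    # C0  -> start /affinity C0 Steam.exe
```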
 
That is a good idea, and I had already considered it, but I dropped it when I realized I'd be taking 20% of my CPU away from a game that is using 75%-100%. What the heck, I'll try it tonight and see what happens.
 
On the side of my case where the motherboard is mounted, there is a spot for a 120mm fan. I stuck a cheap $0.50 fan on it that blows directly onto the backside of the CPU socket. Makes cooling essential things very easy. ;)
 
But doesn't the 8350 start out at 4.0GHz with a 4.2GHz turbo? Mine did. I found that >=4.7GHz makes a world of difference in gaming.

And a fan on the back of the socket is a must for sure.
 
But doesn't the 8350 start out at 4.0GHz with a 4.2GHz turbo? Mine did. I found that >=4.7GHz makes a world of difference in gaming.

And a fan on the back of the socket is a must for sure.
That is right; excuse me for posting in the wee hours of the morning.
 
On the side of my case where the motherboard is mounted, there is a spot for a 120mm fan. I stuck a cheap $0.50 fan on it that blows directly onto the backside of the CPU socket. Makes cooling essential things very easy. ;)
A fan on the back side of the motherboard? I haven't seen that done before... let me see a picture. Not only that, does the backside really have any heat? lol, if I had a fan on the back of mine it'd be blowing on a bunch of wires and that's it.
 
That is a good idea, and I had already considered it, but I dropped it when I realized I'd be taking 20% of my CPU away from a game that is using 75%-100%. What the heck, I'll try it tonight and see what happens.
I honestly can't remember ever playing a game on this rig that used more than 50%... usually it's more around 10-30%. Is that game really using that much on an 8-core?
 
I honestly can't remember ever playing a game on this rig that used more than 50%... usually it's more around 10-30%. Is that game really using that much on an 8-core?
Yup, all 8 cores are being utilized quite well in this game. Like I said, it's pretty good until I get into a battle with 5 or 6 dudes. The Division beta uses all 8 cores at 100%. I didn't get around to experimenting today, guys.
 
A fan on the back side of the motherboard? I haven't seen that done before... let me see a picture. Not only that, does the backside really have any heat? lol, if I had a fan on the back of mine it'd be blowing on a bunch of wires and that's it.
These 8350s can tax the socket quite a bit. A fan on the back helps a lot with stability at higher clocks, and with some mobos that throttle at the drop of a hat. A lot of cases today have a spot for one, or you can do like I did and make a hole.
 
Yup, all 8 cores are being utilized quite well in this game. Like I said, it's pretty good until I get into a battle with 5 or 6 dudes. The Division beta uses all 8 cores at 100%. I didn't get around to experimenting today, guys.
Take your time OCing and be careful. Voltage isn't going to be your concern; TEMPERATURE WILL BE. Max temp for the 8350 is 72C, but most will tell you 62C because that is the safe zone (minimal chance of frying the chip at that point). If you are air cooling, 4.4GHz to 4.6GHz will be your max. Water can get you from 4.6 to 5.4GHz, depending on how big the rad is and how good the loop is. 240 rads seem to allow 5.0GHz most of the time.
 
I can't even do 4.2 stable on cores 7 and 8. Everything else will go pretty high. Not going to bother with it. Thanks for the advice though. My 8350 is water cooled.

I had some good success today by simply setting Steam.exe to "High Priority" in the task manager. It really was as simple as that. I still hit 'Slow Encode' a few times, but the frequency was greatly decreased. I need to do more testing though. I may make another thread to highlight this experiment in the future.
 
I can't even do 4.2 stable on cores 7 and 8. Everything else will go pretty high. Not going to bother with it. Thanks for the advice though. My 8350 is water cooled.

I had some good success today by simply setting Steam.exe to "High Priority" in the task manager. It really was as simple as that. I still hit 'Slow Encode' a few times, but the frequency was greatly decreased. I need to do more testing though. I may make another thread to highlight this experiment in the future.
Yeah, good idea on another thread since it is CPU-related. Just give your mobo and RAM specs too. I'll help in any way I can.
 
I have Crossfire 280Xs in my setup, and it has been treating me well at 2560x1440. The most recent release I have is COD: Advanced Warfare, and I easily get 100+ FPS average. I'd be willing to run benchmarks, but I don't own Fallout 4, Tomb Raider, The Division, or many other recent big-title releases. Also, the difference in our processors probably wouldn't give much of a comparison, especially given the recent issues you were looking into.

Edit: Mix of medium / high gfx settings on COD.
 
No way, the biggest hardware hassle I've ever encountered was using my HD 6950s in CFX. Not only did many/most of my games at the time not even scale with CFX, but I had frustrating driver issues. Basically my cards would flip to 2D core and memory speeds when I loaded 3D applications. So I had to manually create profiles to set the clock/mem and fan speeds, then used shortcut keys to trigger the various profiles depending on my task (gaming, browsing, etc.). Never again.
 
I'll just add that 2x 280X cards pass Valve's VR test as VR Ready (in the green).
 
No way, the biggest hardware hassle I've ever encountered was using my HD 6950s in CFX. Not only did many/most of my games at the time not even scale with CFX, but I had frustrating driver issues. Basically my cards would flip to 2D core and memory speeds when I loaded 3D applications. So I had to manually create profiles to set the clock/mem and fan speeds, then used shortcut keys to trigger the various profiles depending on my task (gaming, browsing, etc.). Never again.

Crossfire/SLI was terribad back in the day. My only experience with Crossfire has been my watercooled 290Xs and a 290X/290 combo for a few months. It's worked fine for me; you just have to download the latest beta drivers whenever new games come out.

The real issue is that unless you're pushing GPU-heavy AAA games on a 4K/Eyefinity/VR setup, you really don't need it. Your inefficient desktop ends up feeling like a really expensive space heater most of the time.
 
Crossfire is not worth the money, because it doesn't work with all games out of the box like it should, and sometimes when it does work there is not enough gain to justify the cost.
Also, the RAM is effectively halved: even though you have 8GB total across your cards, you can only use 4GB effectively. What a waste.
 
Crossfire is not worth the money, because it doesn't work with all games out of the box like it should, and sometimes when it does work there is not enough gain to justify the cost.
Also, the RAM is effectively halved: even though you have 8GB total across your cards, you can only use 4GB effectively. What a waste.

At least with DirectX 12 your VRAM gets turned into a total pool and isn't divided, so you will see gains as more DirectX 12 games come out. I think Crossfire is a good feature for cheaper builds, but AMD is slow at releasing Crossfire profiles; Nvidia is ahead of them in that regard.
 
At least with DirectX 12 your VRAM gets turned into a total pool and isn't divided, so you will see gains as more DirectX 12 games come out.

I'm pretty sure this is a falsehood.

The reason RAM is 'duplicated' across cards is simply that both cards are rendering the same thing, so naturally they need the same assets. Transferring memory across the PCI-E bus is much slower than the GDDR5/HBM bus, so running with a 'pooled' RAM resource across all cards would actually be slower than each card holding its own duplicated copy. If future cards introduce a new CFX adapter cable that directly connects the RAM pools, that would be different; but DX12 is NOT going to make your 2x 3GB cards run as 1x 6GB card, and even if it did, it would be SLOWER than running with duplicated assets.
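A back-of-the-envelope comparison supports this; the bandwidth figures below are approximate spec-sheet numbers, not measurements:

```python
# Why pooling VRAM over PCIe would crawl: a remote read crosses the PCIe bus,
# which is far narrower than a card's local GDDR5 bus.
GDDR5_280X = 288.0   # GB/s, R9 280X local memory bandwidth (384-bit @ 6 Gbps)
PCIE3_X16 = 15.75    # GB/s, PCIe 3.0 x16, one direction
PCIE2_X4 = 2.0       # GB/s, roughly a PCIe 2.0 x4 slot like the OP's

print(f"Remote reads over x16: ~{GDDR5_280X / PCIE3_X16:.0f}x slower than local VRAM")
print(f"Remote reads over x4:  ~{GDDR5_280X / PCIE2_X4:.0f}x slower than local VRAM")
```

Even in the best case the remote half of a pooled address space is an order of magnitude slower than local VRAM, which is why AFR duplicates assets instead.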
 
I would not do CF, especially at 4x.

The only thing I might do is sell the card, maybe find something else to sell and get a 290x/390x/970. Or wait for the new cards/when you have more cash.

I had 3x 7970s and went to a single 780 Ti (basically a 970/290X/390X) and gamed at the same settings.
 