Best Deal for 1920x1200 Gamers: 6850 or 6950 (flashed)?

SEALTeamSix

The consensus seems to be that these are the 2 best deals in the AMD camp at this time. Yet, none of the reviews I've seen feature them both (which I realize is due to them occupying different market segments). How much better is a 6950 (that has been flashed into a 6970) compared to a 6850? Is the extra performance worth the extra cost to a 1920x1200 (or 1920x1080) gamer?
 
Then you are not looking at the right reviews; plenty of review sites have compared them. Read the TechPowerUp and computerbase.de reviews. The 6950 at stock is ~40% faster than the 6850 and the 6970 is >50% faster, so an unlocked 6950 will fall between those percentages depending on what clocks you are able to achieve.
 
The consensus seems to be that these are the 2 best deals in the AMD camp at this time. Yet, none of the reviews I've seen feature them both (which I realize is due to them occupying different market segments). How much better is a 6950 (that has been flashed into a 6970) compared to a 6850? Is the extra performance worth the extra cost to a 1920x1200 (or 1920x1080) gamer?
Assuming $189.99 for the 6850, this places the 6970's value (per unit of performance) at about $275.
That means even a 6950 ($299) flashed and clocked to 6970 speeds still doesn't match that value; it's overpriced by roughly $24.

To answer your question directly, the extra performance is NOT worth the cost.
But the real question is, how badly do you want the extra performance? :p
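For what it's worth, here's that math as a quick sketch; the ~45% figure is just back-calculated from the $275 above, and the prices are the ones assumed in this thread, not quotes from any store:

Code:
# Price-per-performance sketch using this thread's assumed numbers.
hd6850_price = 189.99
hd6970_perf_vs_6850 = 1.45   # back-calculated from the ~$275 figure above

# Price at which the 6970 would match the 6850's cost per unit of performance
fair_6970_price = hd6850_price * hd6970_perf_vs_6850
print(f"Fair 6970 price: ${fair_6970_price:.2f}")                          # ~$275

flashed_6950_price = 299.00
print(f"Premium over that: ${flashed_6950_price - fair_6970_price:.2f}")   # ~$24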
 
The 6850 is $179.99 though, so that makes the 6950 worth more like $230. It's way overpriced, but we knew that; the faster cards get, the more you pay.
On the basis of a GTX460 1GB being $189.99, a GTX570 is only worth $275 and a GTX580 only $325, yet plenty of people still buy them.
 
The 6850 is $179.99 though, so that makes the 6950 worth more like $230. It's way overpriced, but we knew that; the faster cards get, the more you pay.
On the basis of a GTX460 1GB being $189.99, a GTX570 is only worth $275 and a GTX580 only $325, yet plenty of people still buy them.
More details:
Assume $179.99 for the 6850.
Assume $299.99 for the 6950.

This means the 6950 (flashed OR not flashed) must be 66.670% faster than the 6850 to justify the price.
Does that ever occur?
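Same idea in one line, using the prices assumed above:

Code:
# Break-even speedup implied by the assumed prices
hd6850_price, hd6950_price = 179.99, 299.99
breakeven = hd6950_price / hd6850_price - 1
print(f"The 6950 has to be {breakeven:.1%} faster to match cost per frame")   # ~66.7%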
 
Assuming $189.99 for the 6850, this places the 6970's value (per unit of performance) at about $275.
That means even a 6950 ($299) flashed and clocked to 6970 speeds still doesn't match that value; it's overpriced by roughly $24.

To answer your question directly, the extra performance is NOT worth the cost.
But the real question is, how badly do you want the extra performance? :p

I believe the answer to this question is another question: what card will you be upgrading from? Depending on that, the extra cost might be worth it; otherwise you won't notice much of an improvement.
 
Nah, I'd say more like 30% faster.

That's if you don't use AA/AF, but you don't buy a $300 ($280 after rebate) video card with 2GB of memory to run it without AA/AF.
[attached image: benchmark chart]



More details:
Assume $179.99 for the 6850.
Assume $299.99 for the 6950.

This means the 6950 (flashed OR not flashed) must be 66.670% faster than the 6850 to justify the price.
Does that ever occur?

Everyone should already know that high end parts never have and never will justify the price premium. The choice must be made based on the prices of what is available in the market in the same performance bracket. A $280 (after rebate) card with 2GB of memory that, after being unlocked, performs as well as a $380 card is not a bad upgrade.
 
Lorien: I'd like to see the rest of those tests just in case. I find it surprising that AA would make that much of a difference, but perhaps it does.
 
I believe the answer to this question is another question: what card will you be upgrading from? Depending on that, the extra cost might be worth it; otherwise you won't notice much of an improvement.
I'll be coming from an HD4850, and I am also upgrading to an OC'ed Sandy Bridge system (2500K + UD3) from an OC'ed C2Q system (Q6600 + P6N), so the CPU won't be a potential bottleneck.

UtopiA said:
But the real question is, how badly do you want the extra performance? :p
Now that is a good question! :D
 
Lorien: I'd like to see the rest of those tests just in case. I find it surprising that AA would make that much of a difference, but perhaps it does.

http://www.computerbase.de/artikel/grafikkarten/2010/test-amd-radeon-hd-6970-und-hd-6950/

At 1920x1200 with AA and AF it does make a difference. I love their reviews because their charts recalculate based on the card you set as the baseline. Just mouse over the name of the card and the rest of the chart is adjusted on the fly.
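To be clear about what their chart is doing: it just re-expresses every card relative to whichever one you pick as the baseline. Something like this, where the scores are made-up placeholders rather than computerbase's actual numbers:

Code:
# Illustration of re-basing a relative-performance chart; placeholder scores only.
scores = {"HD 6850": 100, "HD 6950": 140, "HD 6970": 152, "GTX 570": 150}

def rebase(scores, baseline):
    """Express every card as a percentage of the chosen baseline card."""
    ref = scores[baseline]
    return {card: round(100 * s / ref, 1) for card, s in scores.items()}

print(rebase(scores, "HD 6850"))   # 6950 shows as 140%, 6970 as 152%
print(rebase(scores, "HD 6950"))   # 6850 drops to ~71%, 6970 to ~109%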
 
They're only measuring average frame rates...
Minimum frame rates are far more important to me. What's the point in having an average of 120 fps if there's a section that only runs at 30, when the card averaging 100 can manage 50 in that section? You'd never know without some form of minimum frame rate indicator.
 
They're only measuring average frame rates...
Minimum frame rates are far more important to me. What's the point in having an average of 120 fps if there's a section that only runs at 30, when the card averaging 100 can manage 50 in that section? You'd never know without some form of minimum frame rate indicator.
And what exactly does a "minimum" tell you?
It doesn't tell you how long the game ran at "30 fps", when, where, etc. It could have been a 20 minute demo where a small hard disk read and/or game stutter (unrelated to the graphics card) caused a 30 fps dip for a fraction of a second.

In order to actually use minimum fps, you'd need to look at the complete charted data (HardOCP does this).
At best, it seems like you're trying to form an analysis of a graphics card by examining outliers.

What's the point in having an average of 120 fps if there's a section that only runs at 30, when the card averaging 100 can manage 50 in that section?
Does this happen? Ever?
 
This is why the best review sites use graphs, so you can see where these dips occurred, and how often.
Take a look at the graphs for Mafia II as an example. You'll notice there's a section in the test where you get a crazy frame rate, around 300. If one particular card happens to have a massive lead here, say 300 versus 200, where they're otherwise equal, this will have a considerable effect on the average frame rate, and you'd never know.

Likewise, in cases where memory is insufficient (unlikely at 1920x1200, but far more prevalent at higher resolutions) you get very poor minimums, often in several places. Again, you'd never really notice that with average frame rates.

I'm not calling for average frame rates to be gotten rid of entirely, but on their own they do not tell enough of the story to be calling superiority on differences of 10% or less. If it's an epic landslide and a card is getting double the frame rate, sure, that's probably all you need to know, but for debating the finer details of cards which irrefutably perform very alike, it just isn't adequate.
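A made-up toy example of the skew (the frame-rate traces are invented purely for illustration, not taken from any review):

Code:
# Two invented 60-second traces, one fps sample per second.
trace_a = [60] * 57 + [300] * 3    # steady card with a 3-second 300 fps spike
trace_b = [60] * 55 + [30] * 5     # card that dips to 30 fps for 5 seconds

for name, fps in (("A", trace_a), ("B", trace_b)):
    print(f"card {name}: avg {sum(fps) / len(fps):.1f} fps, min {min(fps)} fps")
# Card A's spike inflates its average (72 vs 57.5) while card B's dip
# only shows up in the minimum (or in a full per-second graph).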
 
You'll notice there's a section in the test where you get a crazy frame rate, around 300. If one particular card happens to have a massive lead here, say 300 versus 200, where they're otherwise equal, this will have a considerable effect on the average frame rate, and you'd never know.
Actually it has a minimal, if not completely unnoticeable, effect.

Let's assume it's a 7 minute benchmark demo, and the huge spike occurs for 3 seconds.
3 seconds out of 420 seconds = .714% of total time.

One card gets 200 fps during that time, another gets 300 fps.
This is a 50% increase.

If both cards average 30 fps at all other times, the baseline contributes 30 × (417/420) ≈ 29.79 fps, and the spike contributes 300 × (3/420) ≈ 2.14 fps to one card and 200 × (3/420) ≈ 1.43 fps to the slower card, giving the final results:
31.93 fps
31.21 fps

Assuming these tests have a margin of error of 3%, those results are still valid --> the cards are identical (a ~2.3% gap).
Ideally, these things should never occur; the fact that they did (or do) means the benchmark needs to be fixed.
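The same time-weighted average, worked out as a quick sketch with the numbers above:

Code:
# 7-minute demo, 3-second spike, 30 fps the rest of the time
demo_len, spike_len, base_fps = 420.0, 3.0, 30.0

def avg_fps(spike_fps):
    frames = base_fps * (demo_len - spike_len) + spike_fps * spike_len
    return frames / demo_len

fast, slow = avg_fps(300), avg_fps(200)
print(f"{fast:.2f} vs {slow:.2f} fps -> {100 * (fast - slow) / slow:.1f}% apart")
# ~31.93 vs ~31.21 fps, roughly 2.3% apart -- inside a 3% margin of error.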
 
Suppose it's more than a 3 second spike? Again, without a graph you wouldn't know. Even with just this small spike, 2% is the difference between the HD6950 and GTX570 at 2560x1600.
 
Suppose it's more than a 3 second spike?
The spikes shouldn't exist at all. If the spikes get worse, then you have an increasingly invalid benchmark result, which means we should all start questioning the reviewer's testing abilities. :p
I was just demonstrating that there is some tolerance. Some.

The spikes can easily be sliced out of the sample data, assuming they're not part of the test itself, for example a loading screen, cinematic, or fade-to-black (200+ fps) situation. If it is a problem like that, then minimum/average/maximum data are all subject to those kinds of errors, so no one kind of data is more valid than another once we start talking about bad benchmarking technique.

If it is part of the test (the game itself), then where do we draw the line? 200 fps is too much for a game? 300 fps? 60 fps? 80 fps? Who gets to choose? And since we're reviewing raw processing power for these cards, why aren't 200+ fps results valid? It's still being rendered by the card (power).
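For illustration, "slicing the spike out" just means dropping samples above some cutoff before averaging; picking that cutoff is exactly the arbitrary choice I'm talking about (made-up sample data again):

Code:
# Drop fps samples above an arbitrary cutoff before averaging.
def trimmed_average(fps_samples, cutoff=200):
    kept = [f for f in fps_samples if f <= cutoff]
    return sum(kept) / len(kept) if kept else 0.0

samples = [30] * 417 + [300] * 3    # the 7-minute demo from earlier
print(trimmed_average(samples))     # 30.0  -- spike sliced out
print(sum(samples) / len(samples))  # ~31.9 -- spike left in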
 
Get the 6850 or a 6870. The 6950 isn't really suited for anything under 2560x1600.
 
Well, it is, but two HD6850s would be better if you need more processing power at that resolution. For those who don't want to go crossfire for whatever reason, an HD6950 is still a perfectly valid purchase. Until the GTX560 comes out it's completely unparalleled.
 
Well, what I meant was that the 6950... or rather the 69xx series, under-performs at anything below 25x16. They don't stretch their legs until you raise the resolution.
 
In a fair few games I suppose that's true, but anything that actually needs that much GPU power, they seem to do alright. They just don't push the absurd frame rates that aren't necessary in older titles.
 
Personally, I'd look at one of the GTX 460 1GB cards that are going for around $155-170 USD AR. Example:

http://www.newegg.com/Product/Produ...2055&cm_re=GTX_460_1gb-_-14-162-055-_-Product

Most do 810-850 on the core with little trouble.

If you're an AMD fan, there's always the Asus HD 6870 too.

http://www.hardwareheaven.com/revie...graphics-card-review-power-temp-noise-oc.html

http://www.newegg.com/Product/Produ...418&cm_re=asus_hd_6870-_-14-121-418-_-Product

They're like $220 USD AR.

An even better idea might be to wait and see what the GTX 560 can do. It's due soon.
 
How about the 6950 1GB? Although IDK when it's being released, or whether AMD has put an end to the unlocking of shaders.
 