Radeon HD 3870 on QX9650 sneak peek

Wait and see. The 2900 XT had bad performance. I ordered the GTS to start with, but the web shop was doing some expanding and my order got delayed; meanwhile ATI's Linux driver was released with great support, so I bought ATI. Then the 7.10 CCC was released and, omfg, I gained so much more. ATI sucks at drivers, but their cards rock.

Thumbs up for the cards, except the 2900 XT's power usage.
Thumbs down for their drivers.

Thumbs up for nvidia's timing of releases.
Thumbs up for nvidia's Drivers.
 
You're joking, right...? I mean, I have an X2 3600+ in this box, but let's see...Socket 754, Socket 939, Socket 940, AM2 (which has 940 pins if I recall but a different pin layout, so it's incompatible), AM2+, AM3, FX1207...and the list goes on. Whereas Pentium 4, Pentium D, Core Duo, Core 2 Duo...were all on LGA 775. Intel is NOT the worst offender when it comes to changing sockets every 6 months.

I'm looking towards the future, not the past. I'm not sure I want to buy the expensive DDR3, so I may go with an AM2+ chipset instead. Keep it for 2 or so years and go with what's best after that. The X38 will not allow me that option.
 
You're joking, right...? I mean, I have an X2 3600+ in this box, but let's see...Socket 754, Socket 939, Socket 940, AM2 (which has 940 pins if I recall but a different pin layout, so it's incompatible), AM2+, AM3, FX1207...and the list goes on. Whereas Pentium 4, Pentium D, Core Duo, Core 2 Duo...were all on LGA 775. Intel is NOT the worst offender when it comes to changing sockets every 6 months.

Reason: Integrated memory controller. Once Intel goes that route you'll see sockets changing every 6 months as the RAM specs change.

And, 1207 is their server socket. AM3 chips are backwards compatible with AM2 motherboards (I believe). And second, to the poster a couple of posts above: Intel may not change physical sockets, but I'll be damned if you didn't STILL have to buy a new motherboard every 6 months!
 
Uhm, I'd say ATI and Nvidia are in close combat. For the users who say Nvidia still has the best card, the Ultra: personally I'd say the Ultra is a piece of **** to be honest; better to do CF or SLI with the 8800 GT or 3850 instead, for the same price.
The Ultra isn't a card I enjoy seeing benchmarks of, because they cost too much.

And I thank HardOCP for just having some small performance tests of these cards and keeping the focus on the main consumer cards; of course it's nice to see benches of all the cards out there.



Well, I hope AMD goes 45 nm for AM2+. I hope, I hope.
 
Where do you see that they did not fix the AA performance? :p

[Image: 20071109_4eab719d5ee22d6cff852x0Ohb9r0kTL.jpg]
 
Bandwidth limited? It has 76 GB/sec, possibly. Then again, even if it were bandwidth limited, that would only support the idea that AA is not fixed, since the 2900 XT has 30 GB/sec more bandwidth but is only about 1 FPS faster.
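The 76 GB/sec figure can be sanity-checked from bus width and effective memory transfer rate. A minimal sketch, assuming the commonly cited reference specs (256-bit GDDR4 at ~2.4 GT/s for the HD 3870, 512-bit at ~1.65 GT/s for the 2900 XT; the clocks are assumptions, not taken from this thread):

```python
def bandwidth_gbs(bus_bits, transfer_gtps):
    """Peak memory bandwidth in GB/s: bytes moved per transfer times transfer rate."""
    return bus_bits / 8 * transfer_gtps

# HD 3870: 256-bit bus, ~2.4 GT/s effective GDDR4 (assumed reference clock)
print(bandwidth_gbs(256, 2.4))   # ~76.8 GB/s, in line with the 76 GB/sec above

# 2900 XT: 512-bit bus, ~1.65 GT/s effective (assumed reference clock)
print(bandwidth_gbs(512, 1.65))  # ~105.6 GB/s, roughly 30 GB/s more
```

On these assumed clocks the gap comes out at roughly 30 GB/s, matching the post.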
 
Bandwidth limited? It has 76 GB/sec, possibly. Then again, even if it were bandwidth limited, that would only support the idea that AA is not fixed, since the 2900 XT has 30 GB/sec more bandwidth but is only about 1 FPS faster.

Yeah, but you can see that the 3870 does perform faster in most other scenarios, so something was changed for it to have even a 4 FPS lead (from 20 up to 24 FPS), and I don't think a less-than-5% increase in core speed is going to account for a 15% increase in FPS.
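The percentages argued above can be checked quickly. A rough sketch, assuming the commonly cited reference core clocks (740 MHz for the 2900 XT, 775 MHz for the HD 3870; the clocks are an assumption, not stated in the slide, while the FPS pair is the one quoted above):

```python
# Assumed reference core clocks in MHz; the FPS numbers come from the slide above.
xt_clock, hd_clock = 740, 775
fps_before, fps_after = 20, 24

clock_gain = (hd_clock - xt_clock) / xt_clock    # ~4.7% core clock increase
fps_gain = (fps_after - fps_before) / fps_before # 20% FPS increase

print(f"core clock: +{clock_gain:.1%}, fps: +{fps_gain:.1%}")
```

The FPS gain is several times the clock gain, which is the poster's point: clock speed alone can't explain it.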
 
Yeah, but you can see that the 3870 does perform faster in most other scenarios, so something was changed for it to have even a 4 FPS lead (from 20 up to 24 FPS), and I don't think a less-than-5% increase in core speed is going to account for a 15% increase in FPS.

Yes, those other scenarios, as you put it, are withOUT any AA. And yes, the core speed difference cannot account for these non-AA gains. My guess is that texture performance has been tweaked in the 3800 series, but I can't be sure. We'll know soon enough. :)
 
Yes, those other scenarios, as you put it, are withOUT any AA. And yes, the core speed difference cannot account for these non-AA gains. My guess is that texture performance has been tweaked in the 3800 series, but I can't be sure. We'll know soon enough. :)

My point was that if this slide is right, then they definitely did make some changes to the GPU to make it more efficient, so they probably did some work on the AA.
 
I'm not impressed at all. Look how long the 8800 cards have been out, and ATI's newest, baddest card still can't compete? Overall, that is.

AMD has been slacking. Intel and Nvidia have both been kicking their butts all over the place. They need something quick in both camps to stay viable. The Phenom and the 3870 aren't going to do it.

I have always been an AMD fan (ever since the 586 days), until they slacked off and the Core 2 Duo kicked so much butt. I have always hated ATI, though, not because of their hardware but because of their crappy drivers. Now their hardware and software are both lacking.

AMD/ATI has a long way to go to catch up, and further still to pass, before people will be flocking to them again.
 
Considering the 3870 will be compared to the 8800 GT, which came out like a week and a half ago, I don't consider that a long time. This is not supposed to be a flagship card at all; it is supposed to be mainstream. The newer flagship cards probably won't be released till next year.
 
There's no denying the far superior-looking cooler on the ATI part, though. I really wish NV had done something like that with the GT. :( Keeping GPU heat in the case is just wrong.
 
hahaha, look at the scale on that graph.... 24 is about 4" longer than 20 on the last set of bars. Aha.
 
My point was that if this slide is right, then they definitely did make some changes to the GPU to make it more efficient, so they probably did some work on the AA.

Well, they might have, but everything on that slide points to no. So don't hold your breath; they more than likely did nothing for AA.
 
Well, they might have, but everything on that slide points to no. So don't hold your breath; they more than likely did nothing for AA.

The only logical answer is that it's bandwidth limited.

Look at the slides: it beats the 2900 XT considerably, by 10-20%, in every game without AA at 1920x1200. With AA, it starts falling behind.

Now, think about this: even if the 3870 had NO revisions to the core (which of course it does, since it uses fewer transistors, etc., so changes *were* made), it would beat the 2900 XT consistently, since you're saying it's the same core, only with higher clocks.

Now look at 8800 GT benchmarks and compare its 1920x1200 performance with AA and its 2560x1600 performance with and without AA. You'll notice that the higher-bandwidth GTS 640s can actually match and sometimes beat the 8800 GT in those games. Then take it up to the 8800 GTX and Ultra and you'll see that the extra bandwidth makes an even bigger difference.

Both the 8800 GT and the 3870 have 256-bit memory buses, so it looks like a case of bandwidth limitation, not core performance.
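The bandwidth comparison above can be sketched numerically. The bus widths and transfer rates below are the commonly cited reference memory specs for each card, and are assumptions on my part, not figures given in this thread:

```python
# (bus width in bits, effective memory transfer rate in GT/s) per card,
# using commonly cited reference specs (assumed, not from the thread).
cards = {
    "8800 GT":      (256, 1.80),
    "8800 GTS 640": (320, 1.60),
    "8800 GTX":     (384, 1.80),
    "HD 3870":      (256, 2.40),
}

for name, (bus_bits, rate_gtps) in cards.items():
    gbs = bus_bits / 8 * rate_gtps  # peak bandwidth in GB/s
    print(f"{name:12s} {gbs:6.1f} GB/s")
```

On these numbers the GTS 640 and the GTX both out-bandwidth the 8800 GT despite its stronger core, which is consistent with the argument that the high-resolution AA cases are bandwidth limited.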
 
ehm...

ATI CAN COMPETE; let's look up, or back. This is a very active thread.


ATI cards are VERY fast; ATI drivers are the worst there are. :p

I don't remember what game it was, but I got like 30 FPS more after a driver update; these cards have more up their sleeves. ;)

I would agree on bandwidth, but the 3870 has a 512-bit bus, doesn't it? And the 3850 a 256-bit one?

But well, ATI has like 100-watt graphics cards now, WOHOO, and ATI has CrossFire capability with two chipset makers.

It's just wait and see. :p BTW, the ATI cards make less noise and run cooler. =)

But I think both the 8800 GT and the 3850/3870 have something more in them; ATI cards show less of their real performance, I guess, because ATI just sucks at drivers.
 
The 3870 is 256-bit, as is the 3850.

The R680 / 3870 X2 sounds interesting, though...
 
I think we should thank AMD for the current situation. R600 was WELL into its development when AMD bought ATI; there was no way AMD could have turned things around within a year of a video card's release. We should probably thank AMD for pushing the process tech down to 55 nm as fast as they possibly could, with great results, thus the early release of RV670.

Next up, R680!
 
I think we should thank AMD for the current situation. R600 was WELL into its development when AMD bought ATI; there was no way AMD could have turned things around within a year of a video card's release. We should probably thank AMD for pushing the process tech down to 55 nm as fast as they possibly could, with great results, thus the early release of RV670.

Next up, R680!

It's not early! It's supposed to launch now; R600 was late. :p
 
Sexy pic on the card. :) Hope the performance will be as good as it looks; I'm torn between the 8800 GT and this card.
 
I desire that card. Now.

I really wonder about Crysis DX10 performance with two 3870s on an X38 board.
 
It's not early! It's supposed to launch now; R600 was late. :p

Haha, good one. It was reported that even AMD themselves were surprised when the first silicon came back bug-free. Anyway, a cool- and decent-running card is an acceptable release to me.
 
Maybe the HD 3870 couldn't compete with the 8800 GT, but Nvidia won't have any chance against the HD 3850 with their 8600 GTS. :D
 
That is what the 256 MB 8800 GT is for. It'll probably outdo the 3870 by a long shot, lol.
 
I'd like to highlight that the picture was not taken by me. I merely found it somewhere (that thread) on the internet.
 
The only logical answer is that it's bandwidth limited.

Look at the slides: it beats the 2900 XT considerably, by 10-20%, in every game without AA at 1920x1200. With AA, it starts falling behind.

Now, think about this: even if the 3870 had NO revisions to the core (which of course it does, since it uses fewer transistors, etc., so changes *were* made), it would beat the 2900 XT consistently, since you're saying it's the same core, only with higher clocks.

Now look at 8800 GT benchmarks and compare its 1920x1200 performance with AA and its 2560x1600 performance with and without AA. You'll notice that the higher-bandwidth GTS 640s can actually match and sometimes beat the 8800 GT in those games. Then take it up to the 8800 GTX and Ultra and you'll see that the extra bandwidth makes an even bigger difference.

Both the 8800 GT and the 3870 have 256-bit memory buses, so it looks like a case of bandwidth limitation, not core performance.

First, the Nvidia cards are a completely different architecture, so they have no relevance to the comparison between the R600 and RV670. That's that. I already stated bandwidth limitation is a possibility, and I never said anything about core performance affecting anything; the only thing I said is that the extra ~30 MHz cannot account for the difference. But the more important thing to notice is that with AA, while it does "fall behind" the 2900 XT, it's only one FPS or less slower, even though it has about 30% less bandwidth. So it's obvious that if it had the same bandwidth as a 2900 XT, it would be about 1 FPS faster.

What I want to know is what's causing it to take a lead WITHOUT any AA. My guess, as I already stated, is that ATI improved texture performance. Though I don't really care, because I normally run 8x AA, with 4x being the absolute minimum.
 
The AA was fixed, hence the 10.1. The HD 3870 is slightly slower than the 8800 GT and will be obliterated by the new GTS from NV. Now, this is the kicker: neither the GT nor the GTS can do tri-SLI, so if money is an issue (the GTX and Ultra being at the prices they are), then quad-fire is a safe bet for the future. How many people are still using cards from 3 to 4 generations ago? When 10.1 becomes the norm, the HD 3870 will still be moving along, while everything that's Nvidia now will not be giving you the visuals of DirectX 10.1 when those become 3rd- or 4th-gen cards.
 
I'm disappointed; a good card from ATI would have caused GT prices to come down, but this is not that card. :(

My GT on a dual-core E6600 @ 3.2 GHz gets a 3DMark06 score of 13,063.
With...
SM 2.0: 6220
HDR / SM 3.0: 6020

And you're telling me that this is only getting 12,961 with a Yorkfield at 3 GHz? http://resources.vr-zone.com/newspics/Nov07/09/12961.jpg
As said before in 3DMark06 of all places... :rolleyes:
 
Until we see benchmarks of the same system running the 8800 GT vs. the 3870, 3DMark scores don't mean very much. Things like how the RAM is optimized, how Windows is set up, and what mobo is being used can make a lot of difference to the final score.
 
When 10.1 becomes the norm, the HD 3870 will still be moving along, while everything that's Nvidia now will not be giving you the visuals of DirectX 10.1 when those become 3rd- or 4th-gen cards.
When that occurs, the HD 3870 will be too low-performance a card to play on except at the lowest settings.
 
When that occurs, the HD 3870 will be too low-performance a card to play on except at the lowest settings.

Yeah, I 100% agree; we won't even see 10.1 in games for probably 2+ years (and even then, a tiny improvement). In fact, we're not even seeing from-the-ground-up DX10 games yet, e.g. the Crysis DX9 ini tweaks that turn on 99% of the DX10 features.

I hope people aren't buying into the 10.1 slides I have seen so far; they are a tad laughable.

Buy a 3870 based on price/performance and forget 10.1, as it is nothing more than a marketing tool.
 
ZerazaX said:
Now, think about this: even if the 3870 had NO revisions to the core (which of course it does, since it uses fewer transistors, etc., so changes *were* made), it would beat the 2900 XT consistently, since you're saying it's the same core, only with higher clocks.

The only real change to the core, transistor-wise, is the ring bus going from 1024-bit back to 512-bit; that accounts for the chunk of missing transistors.

LawGiver said:
What I want to know is what's causing it to take a lead WITHOUT any AA. My guess, as I already stated, is that ATI improved texture performance. Though I don't really care, because I normally run 8x AA, with 4x being the absolute minimum.

The only real improvement came from the increased core clocks; it's essentially the same core, only clocked a lot faster. This card should be a lot faster than the R600s without AA, and the halved memory bandwidth should be the #1 reason for slower AA performance. Everything about this card so far points to the exact same core, a smaller process, and halved memory bandwidth.

ro3dog said:
When 10.1 becomes the norm, the HD 3870 will still be moving along, while everything that's Nvidia now will not be giving you the visuals of DirectX 10.1 when those become 3rd- or 4th-gen cards.

There's no visual or speed improvement in DX10.1; it carries a new audio spec and stricter guidelines for what the video card must support before it can be deemed "DX10.1." From the looks of it, Nvidia probably didn't want to confuse customers with new terminology, or their lack of an on-GPU audio controller is holding them back (but I'm sure that's not part of a DX10.1 requirement).
 