BFGTech GeForce 6800 Ultra OC Review @ HardOCP

burningrave101 said:
I don't discredit it because it's the [H]; I discredit it the same way anyone supporting the X800XT PE would try to discredit a site that showed the 6800U to be the obvious winner.

If Tom's Hardware, Anandtech, Tech Report, X-bit Labs, Guru3D, HotHardware, or any other site reviewed the X800XT PE in the same way and showed it performing the same way against the 6800U, I would discredit them the same way.

It's not because I have something against the site; it's just that I don't agree with their findings or the way they arrived at them. Most of you would feel the same way if [H] were showing the 6800U as the obvious winner and you favored ATI. But since it isn't, you don't have a problem with it.

I'll challenge any site out there that I don't think is showing the same thing as the others, especially if it is using a new, unproven review method that TONS of people on other forums disagree with.

If I didn't like the [H]ard Forums and think they had a lot to offer as far as knowledgeable resources go, I wouldn't waste my time here. That doesn't mean I have to agree with everything they think, though.

But what do you gain from this? You do realize that, as subjective as this review is, your opinion is also subjective. You can say you don't like it all day long, but you haven't given an inch of proof that what they are doing is wrong. The [H] has a huge reader base, and do you know why? Because they have a good reputation. How do you get a good reputation, you ask? By not letting people down.

You think the 6800U can do 1600x1200 4xAA 16xAF and remain playable throughout games. The [H] thinks differently. YOU DO NOT HAVE A 6800U TO ARGUE WITH! And if you did, it would be YOUR OPINION. And as much as you lean to one side, I wouldn't trust your opinion, just as I don't trust Guru3D's and a few others'.

The reason I ask why you do what you do is this: you are wasting your time. Everywhere you go you seem to make it some big objective to change the world of HardOCP ATI-loving idiots into little Burningraves. It's not going to happen. We choose to believe who we want to believe. By being so blatantly biased, you have made no one want to believe you. Just think about it; it's starting to get silly. No flames to you. I think you're a great person. I just hate for you to go through all this trouble without a prize at the end.
 
Brent, I have a question for you. When you're talking about gameplay, you're actually in the game playing, right? When you say the ATI card plays better at this setting and the Nvidia card plays better at that setting, you're talking about how the game "feels" while playing it, not just looking at the numbers as a performance indicator, correct? Sorry if this has already been asked, but I'm curious (but not enough to re-read all 11 pages of this thread)... :D
 
wtburnette said:
Brent, I have a question for you. When you're talking about gameplay, you're actually in the game playing, right? When you say the ATI card plays better at this setting and the Nvidia card plays better at that setting, you're talking about how the game "feels" while playing it, not just looking at the numbers as a performance indicator, correct? Sorry if this has already been asked, but I'm curious (but not enough to re-read all 11 pages of this thread)... :D

Yep.

gameplay = everything = image quality + performance

I play through the entire game to find the settings that play best throughout.

Our graphs represent some of this gameplay.
 
The 6800 is overclocked... With the Cat 4.7s (betas are out), what would the X800XT do overclocked? They are both nice and both have strong points. A few FPS here or there won't be seen in GAMEPLAY.

A Ford or a Chevy?
 
Maybe you could spell this out more in future video card reviews. I think it was implied, but not explicit in this review. I totally agree that gameplay is why we buy these expensive toys in the first place. Who cares about numbers and graphs when you have someone actually saying "here is how it played"... ;)
 
agar said:
Burning, you want benchmarks with reproducible results?

Here is a nice little summary of the current performance of the X800XT/PE vs. the 6800U, plus some additional 6800GT vs. X800 Pro benchmarks with Far Cry v1.2 (SM3.0) at high resolutions with high AA/AF.



Links:

http://www.rage3d.com/board/showthread.php?t=33768218
http://www.anandtech.com/video/showdoc.aspx?i=2102&p=1
http://techreport.com/etc/2004q3/farcry/index.x?pg=3
http://www.firingsquad.com/hardware/far_cry_sm30/default.asp
http://www.xbitlabs.com/articles/video/display/farcry30.html
http://www.hothardware.com/viewarticle.cfm?articleid=550

Yeah, I've read them all, and I have a lot more than that stored away.

There is something you should remember about the X800s, though. 16x AF is the only thing saving them in performance against the 6800GT and 6800U. The 6800GT beats the X800 Pro every single time without high AF enabled, and it beats it every time when just AA is enabled. The GT was ahead of the X800 Pro by 5-20 fps in every bench at 1600x1200 without AF.

The AF performance the X800s are seeing could be nothing more than brilinear filtering tricks.

Brilinear - Simply Filter Less

Brilinear filtering represented the next optimization step. It's a mixed mode between bilinear and trilinear filtering, or in other words, it involves less filtering. The area in which neighboring mipmaps are blended through the trilinear filter is simply reduced.

It allows some savings in computing time. In real games, the differences are usually very small and can only be detected when compared with images using proper trilinear filtering.

This type of texture filtering was introduced by NVIDIA, whose complete GeForce FX 5xxx series filters in this manner. Even NVIDIA's new GeForce 6xxx series filters like this by default, but NVIDIA has reacted to past criticism and now offers the user an option to switch this optimization off.

In the past ATi had the reputation of offering better image quality, thanks to proper trilinear filtering. This was also true, until the Radeon 9600, alias RV360, was introduced. Things changed with RV360. The chip now filters brilinearly too, although ATi stresses that this is an adaptive algorithm. The driver determines the degree of optimization using the variability of the mipmaps. ATi's new R420, alias Radeon X800, also uses this optimization.

What is annoying is that ATI did not bother explaining this filtering procedure. Reviewers did not notice this new ATI filtering technique because standard filter quality tests using colored mipmaps don't show this behavior. The driver switches to full trilinear whenever colored mipmaps are used.

http://www.hothardware.com/viewarticle.cfm?page=6&articleid=550&cid=2
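
If it helps to picture what "the area in which neighboring mipmaps are blended ... is simply reduced" actually looks like, here is a rough sketch in C of the idea. The names and the 0.25 band width are made up purely for illustration; no actual driver is written like this:

/* frac = fractional LOD between two neighboring mip levels (0..1).
   The returned weight is how much of the coarser mip gets blended in. */
float trilinear_weight(float frac)
{
    return frac;                      /* blend across the whole interval */
}

float brilinear_weight(float frac, float band)   /* band e.g. 0.25 */
{
    float lo = 0.5f - band * 0.5f;
    float hi = 0.5f + band * 0.5f;
    if (frac <= lo) return 0.0f;      /* finer mip only  -> plain bilinear */
    if (frac >= hi) return 1.0f;      /* coarser mip only -> plain bilinear */
    return (frac - lo) / (hi - lo);   /* blend only inside the narrow band */
}

The saving comes from the fact that outside the narrow band only one mip level has to be sampled (bilinear) instead of two (full trilinear), which is where the extra performance comes from.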

There has been a lot of talk lately on the forums about the optimizations ATI is using in its X800s.

ATI is cheating on trilinear texture filtering with the new X800, according to posts in Internet forums. Others ardently defend ATI. The heated discussions are fatally reminiscent of the cries of "cheater" that rang out last year against NVIDIA.

An article on the German Website Computerbase triggered the discussion. The article showed how ATI uses an optimized trilinear texture filtering, often dubbed "brilinear" - i.e. a mixture of bilinear and trilinear - in the Radeon 9600 graphics processor and in the X800 graphics chip based on this architecture. This was news, since ATI always claimed to provide true trilinear filtering.

Conclusion

All filter optimizations discussed here aim to increase the performance of the graphics cards without materially reducing image quality. The word "materially" is, however, subjective - depending on the optimization used, a loss in quality is perceptible when taking a closer look. Even if the quality in screenshots is OK, a running game is often a different story. Annoying effects (moiré, flickering) can crop up that were not noticeable in screenshots.

In the case of graphics cards in the medium and lower price segment, the customer will certainly get added value in the filter optimizations, because "correct" filtering would slow the chips down too much. The user can play in higher resolutions or add filter effects that without the optimizations would be unplayable. The bottom line is that the customer ends up with better image quality.

It's a different story with the new enthusiast cards, such as the Radeon X800 Pro/XT and the GeForce 6800 Ultra/GT. With those cards the optimizations do not provide the customer with new added value - on the contrary. He gets a reduced image quality, although the card would actually be fast enough to deliver maximum quality at what would surely still be an excellent frame rate. We cannot escape the impression that the filter optimizations in the new top models will no longer be used ultimately to offer the customer added value, but rather solely in order to beat the competition in the benchmark tables, which are so important in the prestige category. Whether or not the customer will be ready to spend $400-$500 for this is quite another matter. NVIDIA has obviously realized this and allows true trilinear filtering as an option in its newest models. Well, it did not work in the latest v61.11 beta driver because of a bug... let's hope it indeed is a bug and will work again in the final driver release.

To be continued? One would assume so, because the discussion about the recently discovered brilinear filtering in the Radeon 9600 and X800 is still going full steam ahead. ATi deserves credit for the fact that the image quality of the cards is not visibly compromised by this filtering; at least no example has yet been seen of this. So far, the brilinear areas have only shown up in laborious tests. However, ATi is currently not offering true trilinear filtering with the cards mentioned above, whether adaptive or not. Because of the new filtering, the performance values of the benchmarks do not show the true potential of the X800, because the FPS values only occur due to an optimization whose details are unknown. Even the word adaptive has a bitter aftertaste. ATi has not provided information about the way the driver works and has declared numerous times that it is offering true trilinear filtering. Only since the discovery was made has ATI admitted that the filtering is optimized. Hopefully this type of adaptivity is not being used in other places in the driver.

However, slowly but surely manufacturers are moving to the point where tolerable limits are being exceeded. "Adaptivity" or application detection prevent test applications from showing the real behavior of the card in games. The image quality in games can differ depending on the driver used or on the user. The manufacturers can therefore fiddle with the driver, depending on what performance marketing needs at a given moment. The customer's right to know what he is actually buying therefore falls by the wayside. All that is left for the media is to limp along with their educational mission. The filter tricks discussed in this article are only the well-known cases. How large the unknown quantity is cannot even be guessed.

Every manufacturer decides for itself what kind of image quality it will provide as a standard. It should, however, document the optimizations used, especially when they do not come to light in established tests, as lately seen with ATi. The solution is obvious: make it possible to switch off the optimizations. Then the customer can decide for himself where his added value lies - more FPS or maximum image quality. There is no real hope that Microsoft will act to police optimization. The WHQL tests fail to cover most of them and also can be easily evaded, read: adaptivity.

Still, the ongoing discussion also has its benefits - the buyer, and perhaps, ultimately, OEMs are being sensitized to this issue. Because the irrepressible optimization mania will surely continue. However, there are also bright spots in the picture, as demonstrated by NVIDIA's trilinear optimization. We hope to see more of the same!

http://graphics.tomshardware.com/graphic/20040603/ati_optimized-12.html

Just a little food for thought for those that actually care.

And for those of you that haven't seen it already, you should check out the last post in this thread, which is now stickied at the top of the forum.

http://www.hardforum.com/showthread.php?t=773746&page=5

And BTW, what is the deal with our choice of stickies? We just had the "Official Catalyst 4.6" sticky, and now we have the new "Official Catalyst 4.7" sticky in both the main forum and the ATI forum. The ATI forum also has the "VisionTek Radeon X800 Pro Review" stickied, but the BFG 6800U OC Review wasn't stickied in the nVidia section, and neither is anything about any of the drivers.

We don't like nVidia-related stickies at HardOCP or something?
 
Well, I've had both an X800 Pro VIVO (modded to an XT via a BIOS flash from Gigabyte... this is real, guys, save yourselves the money, get an X800 Pro VIVO and just flash it... just go to xtremesystems.org and you'll find the thread started by Codeman, everyone has been successful) and a 6800 GT. I can't offer any benchmarks, I'm too lazy for that, but aside from some anomalies with the GT that I'm sure future drivers will fix, the GT feels much smoother, even though my overclocked VIVO hits over 13K in 3DMark03 and the GT only hits 12K.

For example, I always used to get a slight stutter with NBA Live 2004 on the X800 (after about 5 minutes of play), but with the 6800 GT it never stuttered. I think Nvidia's design this time around is much better, and it's only a matter of time before they realize its true potential.

I remember when I first bought the 9700 Pro it was buggy at first. But over time, as the drivers got better, the 9700 Pro became THE card to get.
 
Whoa, quicksilverXP, you're saying your 6800GT feels smoother in some games than an X800 Pro @ XT speeds?

Interesting. Do you have any game numbers/benches?
 
Here is a post I just read over at nV News concerning Far Cry.

Actually, to do it correctly on ATI hardware, both the AA and AF are set in-game, not through the control panel. Setting AF through the control panel gets you additional performance but lower IQ, as trilinear filtering is only done on the first texture stage. When possible, always set your AF in-game on ATI hardware. I don't think it should matter for AA, though. If everyone sets their AA and AF in-game, nothing is left to speculation, as both are being applied the way the game designer intended, not the way Nvidia or ATI decided.
It doesn't really take long to set up your custom configs, and it's quite easy to use them in benchemall; you just point it to where they are.

http://www.nvnews.net/vbulletin/showthread.php?t=31421&page=4
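
To make the point about the control panel concrete, here's a toy sketch in C of what "trilinear filtering is only done on the first stage" would mean. The structure and names are my own guess at the behavior being described, not actual driver code:

enum filter { BILINEAR, TRILINEAR_AF };

enum filter stage_filter(int stage, int forced_from_control_panel)
{
    if (forced_from_control_panel)
        return (stage == 0) ? TRILINEAR_AF : BILINEAR;  /* faster, lower IQ on later stages */
    return TRILINEAR_AF;   /* requested in-game: full filtering on every stage */
}

If that's what is really happening, control-panel numbers and in-game numbers aren't directly comparable.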
 
Brent_Justice said:
You can't go higher than 4x AF in the in-game settings, though.

If you want 8x or 16x, you have to use the control panel for each card.

Brent, did you guys run R300 or NV30 mode in Far Cry for the 6800?
 
quicksilverXP said:
Well, I've had both an X800 Pro VIVO (modded to an XT via a BIOS flash from Gigabyte... this is real, guys, save yourselves the money, get an X800 Pro VIVO and just flash it... just go to xtremesystems.org and you'll find the thread started by Codeman, everyone has been successful) and a 6800 GT. I can't offer any benchmarks, I'm too lazy for that, but aside from some anomalies with the GT that I'm sure future drivers will fix, the GT feels much smoother, even though my overclocked VIVO hits over 13K in 3DMark03 and the GT only hits 12K.

For example, I always used to get a slight stutter with NBA Live 2004 on the X800 (after about 5 minutes of play), but with the 6800 GT it never stuttered. I think Nvidia's design this time around is much better, and it's only a matter of time before they realize its true potential.

I remember when I first bought the 9700 Pro it was buggy at first. But over time, as the drivers got better, the 9700 Pro became THE card to get.

See, now this is the kind of feedback I like to see about the cards. Who cares what the benchmark numbers are; I want to know how the cards play real games. What other games did you test with, quicksilverXP? Did you notice a gameplay difference between the cards in anything else?
 
creedAMD said:
Hold on, Brent, he's getting his rebuttal. I don't know why he doesn't just get Schuey's people to call your people. lol!


http://www.nvnews.net/vbulletin/showpost.php?p=362551&postcount=48

CIWS said:
What I see is a lot of talk and insults about each other and very little actual discussion on the tech and hardware. I also note SEVERAL of the same names in these threads. I am tired of the complaints and I'm tired of seeing people here with nothing better to do than post in threads "chasing" the same people around to make more comments about them in different threads. So here's what I'm going to do.

If you see a thread that is posted and you think it is trolling, then use this link and report it to be handled by a Mod or Admin.

If you see another user making flaming remarks or insulting other users, then use this link and report it to be handled by a Mod or Admin.

If you read a thread and do not agree with information posted in it, then either post a reply with information to counter the argument, without flames or insults, or simply close the thread and do not post anything. There is NO REQUIREMENT that says you have to reply to a post; that choice is yours. If you cannot make a calm, sensible reply, I suggest you do not reply.

I do not care how long you have been a member here on this forum. If you choose to ignore this warning and continue to make posts that cause problems here I will remove your posting privileges. This forum does not need a 24 / 7 Mod, it needs the problem posters to correct their actions or be removed.

http://www.hardforum.com/showthread.php?t=773746&page=5

Try to keep the discussion on a technical level. If you can't do that, then take your worthless feedback elsewhere.
 
burningrave101 said:
No he didn't what??? :confused:

Oh, no, he didn't run the NV30 in R300 mode. Even though that would fix all of the graphical abnormalities that the 1.2 patch fixes, running it in R300 mode takes away other optimizations and hurts performance.
 
Brent_Justice said:
You can't go higher than 4x AF in the in-game settings, though.

If you want 8x or 16x, you have to use the control panel for each card.

schuey74 said:
Yes, you actually can set AF to 8 or 16 via the config file. You will see 1, 2, 4, and Custom when you enter the usual "configure Far Cry" program or in the game. Custom is whatever you set, 8 or 16.

I just installed Far Cry, so I'm going to go try it out.

EDIT: I just tested it and it works just like he said. You CAN enable 8x and 16x AF through the Far Cry system.cfg file.

And NO, I'm not posting my benchmark results for my nVidia 5900XT, lol. I'll wait till my XFX 6800GT arrives this Friday :p.
 
burningrave101 said:
I just installed Far Cry, so I'm going to go try it out.

EDIT: I just tested it and it works just like he said. You CAN enable 8x and 16x AF through the Far Cry system.cfg file.

And NO, I'm not posting my benchmark results for my nVidia 5900XT, lol. I'll wait till my XFX 6800GT arrives this Friday :p.

Please keep us posted on the numbers when you get your GT.

Maybe then, when you see it with your own eyes, you will realise that Brent did a quality review and stop trying to discredit him. :)
 
FlyinBrian said:
Please keep us posted on the numbers when you get your GT.

Maybe then, when you see it with your own eyes, you will realise that Brent did a quality review and stop trying to discredit him. :)

http://www.hardforum.com/showthread.php?t=774411&page=2

There are some people right there who have tested Far Cry at 1600x1200 w/ "very high" settings.

Hey Brent! Did you see my post about how to enable 8xAF and 16xAF for Far Cry in-game?
 
He also had the in-game AA set to 2x; the 4xAA 8xAF in the control panel doesn't mean anything, as everyone knows by now... YOU HAVE TO USE THE IN-GAME CONTROLS TO SET AA LEVELS :)
 
FlyinBrian said:
He also had the in-game AA set to 2x; the 4xAA 8xAF in the control panel doesn't mean anything, as everyone knows by now... YOU HAVE TO USE THE IN-GAME CONTROLS TO SET AA LEVELS :)

AA is the only one you HAVE to enable in-game, and that's just for nVidia cards. You can enable both through the CP on ATI cards.

You can enable AF in-game as well, for better IQ at the cost of some performance.
 
FlyinBrian said:
He also had the in-game AA set to 2x; the 4xAA 8xAF in the control panel doesn't mean anything, as everyone knows by now... YOU HAVE TO USE THE IN-GAME CONTROLS TO SET AA LEVELS :)

1600x1200 2xAA 8xAF... found to be perfectly playable on a GT.

Now you're telling me I should expect to play a full resolution lower with my Ultra. Denial is not just a river in Africa.
 
Vagrant Zero said:
1600x1200 2xAA 8xAF... found to be perfectly playable on a GT.

Now you're telling me I should expect to play a full resolution lower with my Ultra. Denial is not just a river in Africa.

This thread was done a while back, when he didn't have the right settings yet; he proceeded to fix his settings and then took screenshots... Anyway, we've been on this all day. Frankly, I'm just sick of discussing it and I'm sick of your flaming...
 
Brent_Justice said:
23 if you have your forum properties set to 10 posts per page like me :D

Brent, you still haven't acknowledged the fact that I told you how to enable 8xAF and 16xAF in-game for Far Cry.

You just go into the Far Cry folder to the system.cfg file and alter the following:

r_FSAA = "1"
r_FSAA_samples = "4"
r_Texture_Anisotropic_Level = "8"

NOW you have 4xAA + 8xAF enabled in-game. Now you're not getting the performance boost from enabling it in the CP. Enabling it in the CP sacrifices some IQ, at least on ATI hardware. That's just one of the many ways ATI gets its AF performance.
 
FlyinBrian said:
This thread was done a while back, when he didn't have the right settings yet; he proceeded to fix his settings and then took screenshots... Anyway, we've been on this all day. Frankly, I'm just sick of discussing it and I'm sick of your flaming...

Yes, with the control settings he was at 1600x1200 2xAA 8xAF [AF works in display properties]. If I bother you so much, might I suggest you put me on ignore? Heaven forbid you burst a blood vessel on account of little ol' me. Why, that would simply be... really funny.
 
I think I'll just pour a beer on it and I won't need a new card. No, but seriously, my cat tipped over a glass of water onto my tower and my FPS have gone way up... Don't try it, but for real, way up.
 
burningrave101 said:
Brent, you still haven't acknowledged the fact that I told you how to enable 8xAF and 16xAF in-game for Far Cry.

You just go into the Far Cry folder to the system.cfg file and alter the following:

r_FSAA = "1"
r_FSAA_samples = "4"
r_Texture_Anisotropic_Level = "8"

NOW you have 4xAA + 8xAF enabled in-game. Now you're not getting the performance boost from enabling it in the CP. Enabling it in the CP sacrifices some IQ, at least on ATI hardware. That's just one of the many ways ATI gets its AF performance.

I knew of this long ago, back when the demo came out.

We set the game settings via the in-game menus or the driver control panels.

No tweaking of the .ini or .cfg files, unless we are testing something very specific, and then we would separate it from the regular highest-playable gameplay evaluation section.
 
Brent_Justice said:
I knew of this long ago, back when the demo came out.

We set the game settings via the in-game menus or the driver control panels.

No tweaking of the .ini or .cfg files, unless we are testing something very specific, and then we would separate it from the regular highest-playable gameplay evaluation section.

Your not "tweaking" the .cfg file, your just enabling the same settings that you enable using the CP except when you do it this way ATI doesn't get their little performance cheat by lowering the IQ.

schuey74 said:
Actually, to do it correctly on ATI hardware, both the AA and AF are set in-game, not through the control panel. Setting AF through the control panel gets you additional performance but lower IQ, as trilinear filtering is only done on the first texture stage. When possible, always set your AF in-game on ATI hardware. I don't think it should matter for AA, though. If everyone sets their AA and AF in-game, nothing is left to speculation, as both are being applied the way the game designer intended, not the way Nvidia or ATI decided.

I think, whenever it's humanly possible, AA and AF should be enabled in the game on both nVidia and ATI hardware instead of in the Control Panel.

Brent_Justice said:
You can't go higher than 4x AF in the in-game settings, though.

If you want 8x or 16x, you have to use the control panel for each card.

And if you already knew about it, why didn't you tell me about it earlier instead of just saying you have to use the CP to enable it?
 
burningrave101 said:
Your not "tweaking" the .cfg file, your just enabling the same settings that you enable using the CP except when you do it this way ATI doesn't get their little performance cheat by lowering the IQ.

It's comments like this that get to me. If you remember, both use AF texture stage optimizations. They both use adaptive AF and a reduced trilinear filter, so comments that ATI is the only one using AF optimizations are nonsense. BTW, there is a discussion on this at Beyond3D about ATi's AF efficiency and why it produces less of a hit when AF is enabled; one theory, among others, is that it's due to their separate texture address processor. I think no one really knows yet why they are getting better performance, but most everyone agrees that ATi's filtering optimizations do not reduce image quality. There must be something else factoring in.


http://www.beyond3d.com/forum/viewtopic.php?t=13857&postdays=0&postorder=asc&start=0
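
Since "adaptive AF" keeps coming up, here's my rough understanding of what the word means, sketched in C with invented names, purely as an illustration and not either vendor's actual logic: the level you pick is only a cap, and the hardware decides per pixel how many samples it actually needs based on how stretched the texture footprint is. The argument is over whatever extra, undocumented heuristics sit on top of that.

/* footprint_major/minor describe how stretched the pixel's texture
   footprint is; max_level is the 2x/4x/8x/16x setting. Assumes
   footprint_major >= footprint_minor > 0. */
int af_samples(float footprint_major, float footprint_minor, int max_level)
{
    float ratio = footprint_major / footprint_minor;   /* degree of anisotropy */
    int needed = 1;
    while (needed < max_level && (float)needed < ratio)
        needed *= 2;                                    /* 1 -> 2 -> 4 -> ...   */
    return needed;  /* nearly isotropic pixels take far fewer samples than the cap */
}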
 
Hawg-dawg said:

What's with the penises?
Isn't my vid card an extension of...

Why sure!! ;) Isn't everyone always arguing about whose is faster (bigger)??? It's a guy thing. :D
 
Spank said:

The topic just goes back and forth, back and forth, back and forth. No one really has a solid answer.

I'm going to keep looking into this on different forums and see if someday we can't come up with an answer to the equation, lol.

What we really need is a PR-spewing rep from nVidia to come out and tell us what's going on and what ATI is doing, just like Humus does for ATI. lol ;)

Mintmaster said:
As for AF, I don't know. NVidia is accusing ATI of cheating, and some people have noticed more aliasing with R420's AF versus R300's. It makes sense that clockspeed is part of the equation, though. I'm waiting for ATI to carry these optimizations over to the OpenGL driver.

http://www.beyond3d.com/forum/viewtopic.php?t=13857&postdays=0&postorder=asc&start=0

That's my story and I'm stickin' to it. :p

ChrisRay said:
On a minor note, ATI has used texture stage optimizations in the past with control panel AF. Are they still employing these? On Nvidia these are disabled by default; however, you can still turn on the AF optimizations for Nvidia.

Yes they are still employing these. :D

That's why I am saying: STOP enabling AF in the CP on ATI hardware when you can enable it in-game.
 