Would ATi go as far as to influence the crippling of 6800 HL2 performance to save face?

onetwo
Limp Gawd, joined Jan 18, 2004, 440 messages
In a neutral environment, ATi ate the big one in a game that we all knew would naturally play to the hardware and driver strengths of the 6800 series. With HL2, there aren't any 'natural' advantages in favor of the X800, as both cards perform similarly in the majority of DX9 titles -- do you think ATi would "strongly suggest" that Valve manufacture advantages allowing the X800 to claim victory, in a fashion similar to the 6800's in DOOM3?

With so much at stake, I wouldn't put it past ANY company to respond in that manner.
 
onetwo said:
In a neutral environment, ATi ate the big one in a game that we all knew would naturally play to the hardware and driver strengths of the 6800 series. With HL2, there aren't any 'natural' advantages in favor of the X800, as both cards perform similarly in the majority of DX9 titles -- do you think ATi would "strongly suggest" that Valve manufacture advantages allowing the X800 to claim victory, in a fashion similar to the 6800's in DOOM3?

With so much at stake, I wouldn't put it past ANY company to respond in that manner.

ATi would probably win until the first leaked nVidia driver.

This is where timing is nVidia's best weapon. With Doom 3 coming out and the 6800 beating the shit out of the X800, everyone is going to buy 6800s now.

When HL2 comes out, most gamers are going to already be armed with 6800s, so HL2 has to cater to the majority. They may give an edge, but they aren't going to shun the majority of the market. ;)
 
Looks like it's time to get out the tinfoil hats.

Look, first. . . you can't say it's a "neutral environment" and then say that D3 was skewed towards NVIDIA. D3 is an OpenGL game. ATI's OGL driver sucks. That's hardly a deep, dark conspiracy. I'd like to see some. . . any proof that NVIDIA had undue influence on id or D3 development.

I'd say the odds of Valve mangling their code to suit ATI are likewise slim.

And before I'm lumped in with the fanboys, I had a 9800XT until a week ago. And I was looking for an X800 XT until I gave up and bought a 6800GT. Now I'm happy that I happened to stumble into the right decision.

H
 
onetwo said:
In a neutral environment, ATi ate the big one in a game that we all knew would naturally play to the hardware and driver strengths of the 6800 series. With HL2, there aren't any 'natural' advantages in favor of the X800, as both cards perform similarly in the majority of DX9 titles -- do you think ATi would "strongly suggest" that Valve manufacture advantages allowing the X800 to claim victory, in a fashion similar to the 6800's in DOOM3?

With so much at stake, I wouldn't put it past ANY company to respond in that manner.
Would I put it past ATI (or NV even) to try it? No.
Would I think that Valve would actually do it? No.

Sure, ATI will make 'suggestions' to a game publisher as to how best to 'optimize' the game for their cards. But for a game company to go out of its way to make a game less playable on one card versus another is insane, especially to do so to the card company with the dominant market share/install base...

I don't buy much into such paranoid conspiracy theories.

The 'cheating' will come from ATI/NVidia driver releases not the game company.
 
Hurin said:
Looks like it's time to get out the tinfoil hats.

Look, first. . . you can't say it's a "neutral environment" and then say that D3 was skewed towards NVIDIA. D3 is an OpenGL game. ATI's OGL driver sucks. That's hardly a deep, dark conspiracy. I'd like to see some. . . any proof that NVIDIA had undue influence on id or D3 development.

I'd say the odds of Valve mangling their code to suit ATI are likewise slim.

And before I'm lumped in with the fanboys, I had a 9800XT until a week ago. And I was looking for an X800 XT until I gave up and bought a 6800GT. Now I'm happy that I happened to stumble into the right decision.

H

Where did he say that D3 was skewed towards NVidia?
 
Actually, I don't think they would have performance crippled.

They will probably make sure all their new technologies are exploited so that it looks better on the R3x0/R420 cards than on NVIDIA cards. I don't think performance will be very different on R420 vs NV40 otherwise.

NV3x performance of course is a lost cause, but might be better than expected with driver detection and "optimization." ;)
 
Even if they did try this (unlikely this late in HL2 development) I can't think of a big weakness with the 6800's they could exploit. My feelings are that no matter which you choose (Ati or Nvidia) the differences aren't going to be so huge that it will really affect gameplay in either HL2 or Doom 3.
 
The only way you're going to see a significant difference between the X800 and the 6800 in HL2 is if Valve runs the 6800 in FP32 all the time, IMO. While the 6800 can still run FP32 fast, since it is an SM3.0 card it gets its optimal performance from using FP16 where FP32 is not needed.

However, Valve has already coded a mixed FP16/FP32 path for HL2, so this shouldn't be an issue.
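
To illustrate the FP16 vs. FP32 point (this is just a standalone toy in C++, not anything from Valve's or NVIDIA's code), here's a quick float-to-half round trip showing why half precision is fine for colour math but starts dropping bits on big values like texture coordinates or world positions:

Code:
#include <cstdint>
#include <cstdio>
#include <cstring>

// Toy float -> half conversion (truncating, no denormals, no NaN handling).
static uint16_t float_to_half(float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    uint16_t sign = (bits >> 16) & 0x8000u;
    int32_t  e    = int32_t((bits >> 23) & 0xFFu) - 127 + 15;   // re-bias exponent
    uint16_t mant = uint16_t((bits >> 13) & 0x3FFu);            // keep top 10 mantissa bits
    if (e <= 0)  return sign;                                   // underflow -> zero
    if (e >= 31) return uint16_t(sign | 0x7C00u);               // overflow  -> infinity
    return uint16_t(sign | (uint16_t(e) << 10) | mant);
}

// Toy half -> float conversion.
static float half_to_float(uint16_t h) {
    uint32_t sign = (uint32_t(h) & 0x8000u) << 16;
    uint32_t e    = (h >> 10) & 0x1Fu;
    uint32_t mant = h & 0x3FFu;
    uint32_t bits;
    if (e == 0)       bits = sign;                               // zero (denormals flushed)
    else if (e == 31) bits = sign | 0x7F800000u | (mant << 13);  // infinity
    else              bits = sign | ((e - 15 + 127) << 23) | (mant << 13);
    float f;
    std::memcpy(&f, &bits, sizeof f);
    return f;
}

int main() {
    float color    = 0.7372549f;   // a typical 8-bit colour value -- survives FP16 almost untouched
    float texcoord = 2049.13f;     // a large coordinate -- FP16 has far fewer bits to spend up here
    std::printf("color    %.7f -> %.7f as FP16\n", color,    half_to_float(float_to_half(color)));
    std::printf("texcoord %.2f -> %.2f as FP16\n", texcoord, half_to_float(float_to_half(texcoord)));
    return 0;
}

The colour value comes back essentially unchanged, while the big coordinate snaps to the nearest representable FP16 value, which is exactly why partial precision is used only where the extra bits don't matter.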
 
Maybe if ATI was making HL2... But no, Valve is making HL2, and I'm sure they wouldn't do something like that, just like I'm sure id didn't cripple ATI cards. Despite the conspiracy theories, game developers care more about making the game work well across all hardware (within reason).

I'm pretty sure that Valve just lies about release dates to try to take the press and hype away from other games that end up coming out first anyways. :)
 
creedAMD said:
ATi would probably win until the first leaked nVidia driver.

This is where timing is nVidia's best weapon. With Doom 3 coming out and the 6800 beating the shit out of the X800, everyone is going to buy 6800s now.

When HL2 comes out, most gamers are going to already be armed with 6800s, so HL2 has to cater to the majority. They may give an edge, but they aren't going to shun the majority of the market. ;)
The funny thing is they'll be upgrading their 9800 Pros and XTs while getting HL2 with the vouchers from those purchases. :D
 
joemama said:
Even if they did try this (unlikely this late in HL2 development) I can't think of a big weakness with the 6800's they could exploit. My feelings are that no matter which you choose (Ati or Nvidia) the differences aren't going to be so huge that it will really affect gameplay in either HL2 or Doom 3.

There is going to be almost no difference in performance for HL2 between the 6800s and the X800s. There is nothing they can exploit on the 6800s to make them run worse, because the 6800s perform as well as the X800s in D3D. Half-Life 2 will also support SM 3.0 in a patch after the game is released, so that will give the 6800s even more performance.

DOOM 3, on the other hand, is different. It's OpenGL and supports UltraShadow II technology. UltraShadow II is supposed to be 4x faster than the original. There is a pretty big gap in performance between the 6800GT and the X800 Pro in DOOM 3. nVidia's GT is beating ATI's top-of-the-line PE. Of course, that's understandable considering that at times the 6800nu beats the PE in OpenGL.

I've seen the 6800nu beat the PE at 1600x1200 w/ 4xAA + 8xAF in Call of Duty and in Neverwinter Nights. If review sites started using more OpenGL games, I'm sure we would see it even more.

OpenGL is a lot better than Microsoft's DirectX. OpenGL is capable of running better on older, slower machines. With DirectX, programmers can be lazy and let DirectX do a lot of the work for them. With OpenGL you have to do it all yourself. OpenGL has fewer commands and is more efficient, but takes a lot more work. OpenGL is also portable to Linux and Mac, whereas we know Microsoft will never support something like that.
 
I still think ATI can fix this problem in a driver release.

I mean, the cards perform equally on all other tests. I think this will actually get ATI's ass in gear to fix the OpenGL problems that have plagued their drivers.
 
Filter said:
I still think ATI can fix this problem in a driver release.

I mean, the cards perform equally on all other tests. I think this will actually get ATI's ass in gear to fix the OpenGL problems that have plagued their drivers.
It's not just a driver problem. The way Doom 3 renders (stencil pass, shadows, etc) favors the NV40. In some cases the NV40 performs the step faster (stencil pass) and in other cases the NV40 simply needs to do less work (UltraShadow II simplifies the scene in a way).

A driver update/rewrite will of course help somewhat, but it's not going to be a miracle.
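
To give an idea of what that saved work looks like at the API level: NVIDIA exposes the UltraShadow depth-bounds feature through the GL_EXT_depth_bounds_test OpenGL extension. Here's a rough C++ sketch (my own placeholder names, assumes a live GL context and an extension loader like GLEW) of how an engine can clamp shadow-volume rasterization to the depth range a light can actually reach:

Code:
#include <GL/glew.h>   // any extension loader that exposes EXT_depth_bounds_test works

// Restrict stencil shadow-volume rasterization to the window-space depth
// range [zmin, zmax] that the light can actually reach. Fragments outside
// that range are rejected before they touch the stencil buffer, which is
// the work the depth-bounds test saves.
// draw_shadow_volume is a placeholder for the engine's own stencil passes.
void draw_shadow_volume_with_depth_bounds(float zmin, float zmax,
                                          void (*draw_shadow_volume)())
{
    bool have_ext = GLEW_EXT_depth_bounds_test != 0;

    if (have_ext) {
        glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
        glDepthBoundsEXT(zmin, zmax);          // clamp to the light's depth extent
    }

    draw_shadow_volume();                      // the usual z-pass / z-fail stencil work

    if (have_ext) {
        glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
    }
}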
 
Also the drivers used for the Doom3 benchmark are the latest and greatest 4.8 beta drivers supplied by ATI.
 
I wouldn't say cripple, but I would not be surprised if HL2 has all of ATi's goodies enabled first, with support for NVidia's extras patched in later.

It will probably have 3Dc and better SmartShader support right away, but not UltraShadow II or Nvidia's HDR. Sort of like the reverse of Doom 3.

ATi does have an excellent renderer for thousands/millions of shadows; they simply call it "ambient occlusion" http://www.ati.com/developer/demos/rx800.html . The crowd demo is the example for it. It appears to be even quicker than Nvidia's tech; it's just that they don't hype it up with a name. Same thing with "natural lighting," which is ATi's HDR.
 
Yeah, ATI is probably going to run HL2 better than NVIDIA, but it's for the same reason that Doom 3 runs better on NVIDIA: each manufacturer's strongest features were coded into the game. With Doom it was NVIDIA getting the special treatment; with HL2 it will be ATI.

Either way it doesn't matter; each game will end up running about equal, IMHO, just like with Far Cry... patches will update the game to work better on each other's cards. So either way it doesn't make much of a difference.
 
creedAMD said:
Where did he say that D3 was skewed towards NVidia?

Here:

. . .would naturally play to the hardware and driver strengths of the 6800 series.

That's like saying: "Well, yes. Carl Lewis won the 100-yard sprint. But that's just because he's faster than everyone else." :)

But, yes. My original remarks were a bit overstated. There is less implication that NVIDIA and id colluded than I originally interpreted. My bad.

But I still think that there's too much conspiracy theory peddling around here.

H
 
hmmyah said:
Yeah, ATI is probably going to run HL2 better than NVIDIA, but it's for the same reason that Doom 3 runs better on NVIDIA: each manufacturer's strongest features were coded into the game. With Doom it was NVIDIA getting the special treatment; with HL2 it will be ATI.

Either way it doesn't matter; each game will end up running about equal, IMHO, just like with Far Cry... patches will update the game to work better on each other's cards. So either way it doesn't make much of a difference.

What kind of special treatment is ATI going to get in Half-Life 2? 3Dc? Wow. There will be a DXT5 fallback, and there isn't much of a performance/IQ difference between the two. HL2 will also use SM 3.0 for the 6800s. I expect performance to be dead even between the two.
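
For what it's worth, the reason 3Dc and the DXT5 fallback look so close is that both only store the X and Y of a tangent-space normal and rebuild Z from them. A tiny standalone C++ illustration of that reconstruction (my own toy code, not HL2's shader):

Code:
#include <cmath>
#include <cstdio>

// Rebuild a unit tangent-space normal from the two channels that 3Dc (or the
// DXT5-alpha fallback) actually stores. x and y arrive as 0..1 texture values.
void reconstruct_normal(float x, float y, float out[3]) {
    float nx  = x * 2.0f - 1.0f;               // expand to -1..1
    float ny  = y * 2.0f - 1.0f;
    float nz2 = 1.0f - nx * nx - ny * ny;      // z follows from unit length
    out[0] = nx;
    out[1] = ny;
    out[2] = nz2 > 0.0f ? std::sqrt(nz2) : 0.0f;
}

int main() {
    float n[3];
    reconstruct_normal(0.75f, 0.5f, n);        // an arbitrary sample texel
    std::printf("normal = (%.3f, %.3f, %.3f)\n", n[0], n[1], n[2]);
    return 0;
}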
 
No, ATI wouldn't. That would be insanely lame. I, along with any other gamer, would never buy their product again.
 
The X800 series will be stronger in HL2 for the same reason they are stronger in Far Cry. It's just that simple. I also doubt SM3 support is going to help nVidia much with respect to HL2, as people are just finding out that the Far Cry 1.2 patch also has another new path which bumps performance on the X800 series by about the same amount the SM3 path does for the 6800. It's also been revealed that there are more things in that pathway which haven't been fully enabled yet (supposedly they require new drivers, and it's probably what ATI's Terry was being so happy about earlier), so the performance jump could be even bigger than what has been witnessed. So if even Crytek can find ways to improve performance significantly on the X800 series, you darn sure know Valve will, since they probably worked a lot more with ATI than Crytek did.
 
Hurin said:
Here:

. . .would naturally play to the hardware and driver strengths of the 6800 series.

That's like saying: "Well, yes. Carl Lewis won the 100-yard sprint. But that's just because he's faster than everyone else." :)

But, yes. My original remarks were a bit overstated. There is less implication that NVIDIA and id colluded than I originally interpreted. My bad.

But I still think that there's too much conspiracy theory peddling around here.

H

He edited his post as well. I see what you mean.
 
gordon151 said:
The X800 series will be stronger in HL2 for the same reason they are stronger in Far Cry. It's just that simple. I also doubt SM3 support is going to help nVidia much with respect to HL2, as people are just finding out that the Far Cry 1.2 patch also has another new path which bumps performance on the X800 series by about the same amount the SM3 path does for the 6800. It's also been revealed that there are more things in that pathway which haven't been fully enabled yet (supposedly they require new drivers, and it's probably what ATI's Terry was being so happy about earlier), so the performance jump could be even bigger than what has been witnessed. So if even Crytek can find ways to improve performance significantly on the X800 series, you darn sure know Valve will, since they probably worked a lot more with ATI than Crytek did.


But the X800 and the 6800 are pretty even in Far Cry...
 
ZenOps said:
ATi does have an excellent renderer for thousands/millions of shadows; they simply call it "ambient occlusion" http://www.ati.com/developer/demos/rx800.html.
No. Results for "ambient occlusion" on ATI's site:
http://www.ati.com/developer/sdk/RADEONSDK/Html/Info/Extensions/GL_HP_occlusion_test.html
Overview

This extension defines a mechanism whereby an application can determine the
non-visibility of some set of geometry based on whether an encompassing set
of geometry is non-visible. In general this feature does not guarantee that
the target geometry is visible when the test fails, but is accurate with
regard to non-visibility.
...
http://www.ati.com/developer/sdk/RADEONSDK/Html/Info/Extensions/GL_ARB_shadow_ambient.html
Overview

This is based on the GL_SGIX_shadow_ambient extension and is layered
upon the GL_ARB_shadow extension.

Basically, this extension allows the user to specify the texture
value to use when the texture compare function fails. Normally
this value is zero. By allowing an arbitrary value we can get
functionality which otherwise requires an advanced texture
combine extension (such as GL_NV_register_combiners) and multiple
texture units.
nvidia also supports GL_ARB_shadow and GL_HP_occlusion_test.

compared to:
In fact, the new technology in UltraShadow II allows for a 4× performance increase (compared to the previous generation) for passes involving shadow volumes— without the developer having to do any work.
...
In the following figures (Figures 3, 4, and 5), notice how using UltraShadow II
substantially reduces the amount of shadow area that needs to be examined.
UltraShadow II increases performance by actually culling shadow pixels—the
hardware ignores shadow pixels that do not contribute to the final image.
See pages 4-6 of the technical brief to see how US II reduces work "for free." http://www.nvidia.com/object/feature_ultrashadow2.html
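
And just to show what GL_HP_occlusion_test actually is (a visibility query, not a shadow renderer), here's a rough C++ sketch. It assumes a live GL context, an extension loader such as GLEW, and hardware that exposes the extension; the bounding-box draw is a placeholder:

Code:
#include <GL/glew.h>   // assumes a live GL context and an extension loader

// GL_HP_occlusion_test in a nutshell: draw a cheap bounding box with all
// writes masked off and ask whether any fragment would have passed the
// depth test. draw_bounding_box is a placeholder for the caller's geometry.
bool is_possibly_visible(void (*draw_bounding_box)())
{
    GLboolean visible = GL_FALSE;

    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);   // no color writes
    glDepthMask(GL_FALSE);                                  // no depth writes

    glEnable(GL_OCCLUSION_TEST_HP);
    draw_bounding_box();                                     // test geometry only
    glDisable(GL_OCCLUSION_TEST_HP);
    glGetBooleanv(GL_OCCLUSION_TEST_RESULT_HP, &visible);

    glDepthMask(GL_TRUE);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    return visible == GL_TRUE;
}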
 
onetwo said:
In a neutral environment, ATi ate the big one in a game that we all knew would naturally play to the hardware and driver strengths of the 6800 series. With HL2, there aren't any 'natural' advantages in favor of the X800, as both cards perform similarly in the majority of DX9 titles -- do you think ATi would "strongly suggest" that Valve manufacture advantages allowing the X800 to claim victory, in a fashion similar to the 6800's in DOOM3?

With so much at stake, I wouldn't put it past ANY company to respond in that manner.


Just because nVidia did?

That's low to suggest that.
 
WHAT THE HELL has this thread turned into? Nvidia beats ATI in Doom 3 benchies. I don't see what else there is to it. The END.
 
OpenGL games = nVidia.
DX = used to be ATI, now seems even.
I got nVidia for 2 reasons...
1) Better OpenGL performance.
2) Can do a BIOS mod instead of soldering for a vcore increase.
 
creedAMD said:
But the X800 and the 6800 are pretty even in Far Cry...

Not really; the X800s are still generally quite a bit stronger in Far Cry, even with the SM3 pathway for the 6800. And that's ignoring the discovery that the 1.2 patch also has another SM2 pathway built into the game that seems to increase performance on the X800s by a good deal, as shown here.
 
Hurin said:
....
But I still think that there's too much conspiracy theory peddling around here.

H

Amen.

Anytime one flavor of vid card beats another flavor, the theories of collusion and deception are brought up. Benchmarks are just that - "under these controlled conditions, here are the results". The new gen of GPUs are both great - it's more a matter of who gets your $400...

Peace,
Tim
 
It's really funny: before the official Doom 3 benchmarks were released, people had their theories about this and that. But none of it made any sense because people were just guessing. HELLO?!?! Now... it seems ATi fanboys are doing the same thing here. Who is anyone to say how a company or card will perform in a game until official benches have been released?

I get a strong feeling that people who bought an X800 series card and are butt-hurt about the D3 benches are looking to jump on the "HL2 will own Nvidia" bandwagon... when:

1. When is HL2 coming out?
2. Have you seen the game, or any benchmarks?

Guessing is just that... guessing. Speculating is just that... speculating. Wait until HL2 comes out, then we'll see.

I don't hate ATi... I used several Radeon series cards, including a 9500 Pro, for years and loved them. Nvidia choked on the FX series but seems to have focused on this one and came out the 'winner' so far.

Strangely enough, though... I see people in this thread (ATi fanboys) barking about small performance advantages, fps, etc., yet... in the other threads about Nvidia owning ATi at Doom 3, ATi fanboys state: PFFT! What difference does 20fps make? Small performance this and that.

Looking at the situation from a buyer's point of view... anyone smart would buy a 6800GT right now.
 
gordon151 said:
Not really; the X800s are still generally quite a bit stronger in Far Cry, even with the SM3 pathway for the 6800. And that's ignoring the discovery that the 1.2 patch also has another SM2 pathway built into the game that seems to increase performance on the X800s by a good deal, as shown here.

That's one German article; there are 5 or 6 review sites showing that the X800 and 6800 are pretty even with SM3.0 enabled on the 6800. IF there is a difference, it's only a handful of frames tops either way. That does not equate to "generally quite a bit stronger."
 
creedAMD said:
That's one German article; there are 5 or 6 review sites showing that the X800 and 6800 are pretty even with SM3.0 enabled on the 6800. IF there is a difference, it's only a handful of frames tops either way. That does not equate to "generally quite a bit stronger."

OMG, damn you, why won't you read what I say (strangles creed)! The article isn't about comparing the performance of the X800 and 6800; it's about the improvements they saw from the new SM2.0b pathway in Far Cry (something introduced with the new 1.2 patch). That's what I'm talking about! They enabled on the X800 the same additions they did for the 6800 (instancing and single-pass lighting, as well as some rumored 3Dc support). That's why I was basically saying that they might not be at parity anymore with the new patch.
 
gordon151 said:
Not really; the X800s are still generally quite a bit stronger in Far Cry, even with the SM3 pathway for the 6800. And that's ignoring the discovery that the 1.2 patch also has another SM2 pathway built into the game that seems to increase performance on the X800s by a good deal, as shown here.

No, they're not, lol. You need to get out and read some more reviews. There were around 15 or more on just Far Cry SM 3.0 performance. And in case you haven't noticed already, the performance results were different in each review because a lot of them made their own demos and tested different parts of the level, which gives mixed results. The X800s in no way, shape, or form whip the 6800s in Far Cry.

In Far Cry the 6800U beats the PE almost every single time in the different levels at 1600x1200. The PE doesn't start winning till high AF is enabled, and then it's just a matter of what level you test and how you test it as to which is faster. They both win an equal amount of the time in Far Cry.

The 6800GT beats the X800Pro the majority of the time in Far Cry.

And the 1.2 patch sucks for ATI users lol.

FarCry 1.2 Patch Issues:
Since posting the FarCry 1.2 patch yesterday I have received a ton of e-mail from ATi video card owners saying that FarCry, once patched, is booting them back to the desktop in both multi-player and single-player games. I installed the patch myself on both an ATi 9800XT box and a GFFX 5950Ultra box and sure enough my ATi box crashes to the desktop randomly. I called tech support and was told this:

We have received an alarming number of complaints from ATi card owners that are experiencing crashes after installing the 1.2 patch. The issue is being looked into. At this time we suggest re-installing your game without the 1.2 patch.

ATI users are also getting IQ issues with the 1.2 patch like the reviewers found back when they were testing SM 3.0 performance.

As to the Shader Model 2.0b results they're showing in that link, we'll have to wait till we get official word from some other sites on it.
 
gordon151 said:
Not really, the x800s are still generally quite a bit stronger in Farcry even with the SM3 pathway for the 6800. This is though ignoring the discovery that the 1.2 patch also has another SM2 pathway built into the game that seems to increase performance on the x800s by a good deal, as shown here.
Problems with this review:

a) It's in Russian, which I doubt any of us speak, and I don't trust Babelfish enough to get a clear understanding of their methodology.

b) The first graph seems to show an increase for the R420 while running SM3.0, which it isn't technically capable of. Mistake one.

c) The second graph has the NV40 SM3.0 path labeled as "Enabled all features" - it looks like they're comparing the NV40 with all details on vs. the R420 with details turned down. They ought to keep all settings the same between cards (this is the one aspect of [H]'s review system that I don't agree with). Mistake two.

d) They also don't mention which NV40 card is used - GT or Ultra. Mistake three.

I'm not an nVidia fanboy at all; I think both cards are fabulous technology, but I'm inclined to throw any results from this site out the window. If someone links to an English website benching with a more scientific approach, I'll listen.

EDIT: Why aren't all these magical rendering improvements mentioned in the patch notes? SM3.0 support is.
 
burningrave101 said:
In Far Cry the 6800U beats the PE almost every single time in the different levels at 1600x1200. The PE doesn't start winning till high AF is enabled, and then it's just a matter of what level you test and how you test it as to which is faster. They both win an equal amount of the time in Far Cry.

Could you link me a review using the OFFICIAL 1.2 patch benchmarking ATI's 2.0b vs. NV's 3.0?
 
Flatland said:
Problems with this review:

a) It's in Russian, which I doubt any of us speak, and I don't trust Babelfish enough to get a clear understanding of their methodology.

b) The first graph seems to show an increase for the R420 while running SM3.0, which it isn't technically capable of. Mistake one.

c) The second graph has the NV40 SM3.0 path labeled as "Enabled all features" - it looks like they're comparing the NV40 with all details on vs. the R420 with details turned down. They ought to keep all settings the same between cards (this is the one aspect of [H]'s review system that I don't agree with). Mistake two.

d) They also don't mention which NV40 card is used - GT or Ultra. Mistake three.

I'm not an nVidia fanboy at all; I think both cards are fabulous technology, but I'm inclined to throw any results from this site out the window. If someone links to an English website benching with a more scientific approach, I'll listen.

EDIT: Why aren't all these magical rendering improvements mentioned in the patch notes? SM3.0 support is.

b) The title pretty much pointed out that they were talking about SM2b with respect to the x800 card. Labeling the actual graphs as SM3 was probably just a simplification.
c) The point is to compare the increases across the specific platform. It's not a performance comparison.

pino said:
Just read the two readme files in farcry/support, and yes, they mention geometry instancing in the SM2.0b one.

From ShaderModel20Beta.rtf:

What features do you use from SM 2.0b in order to increase your rendering performance and what are the average benefits in percentage over SM 2.0?
Actually we use 2 main features with SM 2.0b:

1. Pixel/Vertex shaders 2.0b used mostly in indoor scenes (where we use per-pixel lighting heavily) which allow us to draw several light sources in a single pass.
2. Geometry instancing is used mostly in outdoor scenes for vegetation. We split all vegetation to groups and draw them using a optimized code path in a single draw-call.
So in the SM 2.0b path performance in both indoor and outdoor scenes is improved. With geometry instancing we have a performance increase of up to 30%, with SM 2.0b lighting - up to 40%.
The most significant part of SM 2.0b is that it allows us to increase the lighting and geometry complexity without big performance decreases. In the current generation engine (FarCry: CryENGINE) we optimized our indoor and outdoor scenes taking into account current available hardware (SM 2.0 compliant). As a result the number of dynamic light sources in indoor scenes are limited (depends on scene). In outdoor we use aggressive LOD factors (switching from 3D-models to sprites in the near distance) to reduce the number of calls. But with SM 2.0b it’s easily possible to increase the number of dynamic lights and using more dense forest without big impact to performance.
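
For anyone wondering what that geometry instancing looks like in code: on Direct3D 9 it maps to the stream-frequency API. Here's a rough C++ sketch (my own placeholder names, assumes the device, buffers, shaders, and vertex declaration are already set up elsewhere; this is not CryENGINE code):

Code:
#include <d3d9.h>

// Draw 'numInstances' copies of one tree mesh with a single draw call.
// Stream 0 holds the shared mesh, stream 1 holds one small per-instance
// record (e.g. position/rotation) per tree.
void draw_instanced_trees(IDirect3DDevice9*       dev,
                          IDirect3DVertexBuffer9* meshVB,     UINT meshStride, UINT meshVerts,
                          IDirect3DIndexBuffer9*  meshIB,     UINT meshTris,
                          IDirect3DVertexBuffer9* instanceVB, UINT instanceStride,
                          UINT numInstances)
{
    // Stream 0: indexed mesh data, replayed once per instance.
    dev->SetStreamSource(0, meshVB, 0, meshStride);
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);

    // Stream 1: per-instance data, stepped forward once per instance.
    dev->SetStreamSource(1, instanceVB, 0, instanceStride);
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

    dev->SetIndices(meshIB);
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, meshVerts, 0, meshTris);

    // Restore the default (non-instanced) stream frequencies.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}

The whole point is the single DrawIndexedPrimitive call: thousands of trees go through one draw call instead of one call each, which is where CryTek's quoted 30% outdoor gain comes from.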
 