Anyone see the STALKER benchmarks?

Yeah, I just checked out that article. Is that a Russian website? Their English is a bit funny at times, but I guess the numbers are what's important.

I wish they had tested with a 9700 Pro as well. I'd be interested to see just where the 9700 Pro fits in among all the new cards.
 
Yeah, those scores were reassuring. Good to see that a developer can partner with one IHV and yet keep performance up across the board for all major players.
 
For a game that is meant for nVidia hardware, ATi sure handed them their ass... BUT nVidia did do well.
 
Griffen, from what I've found, the Radeon 9700 (non-Pro) is close in performance to the GeForce4 Ti4600 in older games at basic settings, but it readily outperforms it in DirectX 9 titles, and it also shows its strength when you turn up the image quality settings. Also, the 9700 Pro is clocked about the same as the 9800 (non-Pro), with a little more memory bandwidth than the 9800, but it's missing some minor hardware optimizations of the 9800 line. I don't suppose that helps?
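For reference, peak memory bandwidth is just the effective memory clock times the bus width in bytes, so you can sanity-check that claim yourself. A quick back-of-the-envelope sketch (plain Python, using the commonly quoted clocks for these cards, so treat the exact figures as assumptions):

def bandwidth_gbs(mem_clock_mhz, bus_width_bits, ddr=True):
    # Peak bandwidth: effective clock (DDR doubles it) times bytes per transfer.
    effective_mhz = mem_clock_mhz * (2 if ddr else 1)
    return effective_mhz * (bus_width_bits / 8) / 1000  # MB/s -> GB/s

# Commonly quoted specs (assumed; check a spec sheet to be sure):
print(bandwidth_gbs(310, 256))  # Radeon 9700 Pro: ~19.8 GB/s
print(bandwidth_gbs(290, 256))  # Radeon 9800 non-Pro: ~18.6 GB/s

Both cards run a 325 MHz core, which is why they benchmark so close outside of bandwidth-bound settings.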
 
That's interesting: at all resolutions below 1600x1200 without AA or AF, the 5700 Ultra beats the 9600 XT pretty badly, but once AA and AF are enabled the 9600 XT pulls ahead by a moderate margin. At 1600x1200, though, the 5700 Ultra is able to maintain the lead in both settings.
 
Ouch... looks like nVidia has some more *cough* optimizing to do.

How in the world will they ever make up the frame rates without compromising IQ?
 
What I find interesting is that the 5600 played fine. Maybe everyone should go buy a 9800 series card before any other DirectX 9 games come out, sheesh.

And about nVidia's software/optimizing and whatever else bullshit people spout off, per the article that was referenced:

"We still remember about ATI RADEON 8500’s poor performance in Unreal Tournament 2003 benchmark before release and rather fast speed shown by this graphics card after the official launch of the game. In case of titles developed by professional teams, such as Half-Life 2, we can be almost sure that they will run well on all hardware that is available in the market."
 
ouch.............

Love the upper-end ATI numbers. By the time the game comes out, my custom rig will be done and all will be right with my online world... that's fer sure. :D
 
I really feel sorry for XGI. I wanted to like their cards because of the dual GPUs (just something I like), but nope, poor drivers = the suck.
 
Originally posted by creedAMD
Ouch... looks like nVidia has some more *cough* optimizing to do.

How in the world will they ever make up the frame rates without compromising IQ?

So from alpha benchmarks, without any in-game IQ testing, you've already come to this conclusion? How big of a fanboy are you?
 
Originally posted by WalteRr
So from alpha benchmarks, without any in-game IQ testing, you've already come to this conclusion? How big of a fanboy are you?

Many insecure people need to reaffirm their faith in their purchases on a daily basis. It is a shame, really, because it dashes all hopes of having an intelligent conversation in this section.
 
For a game that is meant for nVidia hardware, ATi sure handed them their ass... BUT nVidia did do well.

And yet again all the nVidia boys start crying when they see their legendary 5900NU DX8.1 cards getting their asses spanked by proper DX9 cards :p
Guess who's gotta upgrade again only two months after buying their cards ;)
Gotta love ATi, fanboy or not, lmao :p
 
Originally posted by @trapine
And yet again all the nVidia boys start crying when they see their legendary 5900NU DX8.1 cards getting their asses spanked by proper DX9 cards :p
Guess who's gotta upgrade again only two months after buying their cards ;)
Gotta love ATi, fanboy or not, lmao :p

Wow, you proved my previous post with supreme efficiency. You must be proud of your idolatries. :rolleyes:
 
Originally posted by Roost426
So having a grudge will keep you from buying the best hardware?

"best hardware" is relative to me.
regardless of ATi's ability to give me 10 more FPS over nvidia at no aa and no af and a 20 more fps at 4x aa and 4x af (not a real world representation) i could care less about those 20 frames more..and dont give me shit bout "superior image quality" ive seen a 9800XT and a 5950 in action..and to be honest to me they were pretty much the damn same....i dont stand around in a game long enough to notice the ripples around the asshole of a gibbed player..i dont care as long as i can play...shit ive got a GF2 PRO that is being used....so "best" is relative to who your talking to....and YES me having a grudge will stop me from buying from them....even if they threw in a free blow job with that HL2 coupon...i still wouldnt buy it
 
Originally posted by DocFaustus
Many insecure people need to reaffirm their faith in their purchases on a daily basis. It is a shame, really, because it dashes all hopes of having an intelligent conversation in this section.

It's getting to be so pointless. Every thread in VC has an ATi fanboy shouting like a moron, then a defensive nVidia fan shouting back, and the threads get derailed and eventually closed. I think we should be discussing how Xbit Labs is benchmarking on a stolen piece of alpha software that doesn't have half the features the real game is supposed to, such as:

Actual dynamic lighting: if you run the alpha, you'll notice everything is lightmapped and the characters cast blob shadows.

The real weapon sounds and other models: all the sounds currently are "borrowed" from Counter-Strike. And the AI is basically half of what it should be.
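For anyone wondering what that lighting difference actually amounts to: a blob shadow is just a dark spot on the ground, faded by distance from the character's feet, while dynamic lighting recomputes a light term per surface as lights and objects move. A toy sketch of the two ideas (plain Python, purely illustrative, not the game engine's actual code):

import math

def blob_shadow(px, py, feet_x, feet_y, radius=1.0):
    # Cheap 'blob' shadow: darken ground points near the character's feet,
    # fading with distance. Ignores light direction and geometry entirely.
    d = math.hypot(px - feet_x, py - feet_y)
    return max(0.0, 1.0 - d / radius)  # 1.0 = fully shadowed, 0.0 = lit

def lambert_diffuse(normal, light_dir):
    # Dynamic lighting: Lambertian diffuse term (N dot L), recomputed every
    # frame as lights and objects move. Assumes unit-length vectors.
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

The blob version is basically free, which is one reason an early alpha would ship with it instead of the real thing.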
 
Well, since the CS sounds are all available through online sound shops, they could have bought them, LOL.
 
As with Doom 3 and HL2, the STALKER benchmark is worthless. The game is not out yet.

The last time I checked on STALKER, its predicted release was in the May-June '04 timeframe, which, realistically, probably means Xmas '04.
 
Funny how ATI fanboys take the piss, yet several days after the release of such games we get forums full of threads starting:

"HELP MY RADEON WONT PLAY STALKER!!!!111oneone"
"Which hacked/leaked/beta/version of ati drivers work with STALKER?!?!"
"IM HAVING MAJOR PROBLEMS WITH STALKER HELP PLZ!!!11"
"RADEON PROBLEMS NEED HELP"

etc
etc
etc

Some people will, however, learn that it's not just about the numbers, it's about the experience, a direction that [H] themselves are heading in.
 
Originally posted by Princess_Frosty
Funny how ATI fanboys take the piss, yet several days after the release of such games we get forums full of threads starting:

"HELP MY RADEON WONT PLAY STALKER!!!!111oneone"
"Which hacked/leaked/beta/version of ati drivers work with STALKER?!?!"
"IM HAVING MAJOR PROBLEMS WITH STALKER HELP PLZ!!!11"
"RADEON PROBLEMS NEED HELP"

etc
etc
etc

Some people will, however, learn that it's not just about the numbers, it's about the experience, a direction that [H] themselves are heading in.

Who are you to call someone else a fanboy?:rolleyes:
 
I may be a fanboy of nVidia, but I don't let that impair my judgement. I'm still aware that ATi runs the latest games faster, but that's hardly the end of the story. I am, however, not going to be drawn into an argument here.

At the end of the day, STALKER looks awesome and the whole idea is brilliant for my tastes (pretty broad). Not only that, but it doesn't run like a steaming pile of crap on any mainstream card. Couldn't ask for more in a game. This one has definitely shunted HL2 to one side for me and a lot of my friends; not that HL2 needed much shunting, that is.
 
Of course every card runs it well; none of the big effects are on. When they get benchmarks of something that isn't an unfinished leak and that has every graphical feature that will be in the final game, please notify me.
 
I just hope this game plays well and is fun, unlike other games that get hyped up: the screenshots look incredible, developers release benchmarks to keep you salivating, etc., but when I finally lay down my hard-earned dough, the damn thing plays like poo.
The benchmarks are just another teaser that really doesn't mean much to me, because you can't even play it yet. :mad: I've been waiting a long time for this game. :(
 
Me sees this one dead-ending soon.

Anyway, I am looking forward to this game no matter what card is "supposed" to be the "best" one to run it with.

Why the quotation marks? Cause IT IS ALL SUBJECTIVE!:eek:
Come on people, EVERYONE has a different set of eyes. Meaning EVERYONE has a different idea of what looks good to them.

While I use a 9500, and love it with every game I play, my brother uses a GeForce2, and he loves it for every game he plays, and he plays many of the same games as I do.

There are so many variables in what makes a game look good from one person to the next that it is useless and pathetic to get into an argument over what card someone else should own. Might as well pull out the rulers and start measuring.:rolleyes:
 
it is useless and pathetic to get into an argument over what card someone else should own. Might as well pull out the rulers and start measuring.:rolleyes:

Haha - need that for the sig.:p
 
Why do people still believe in these "The Way It's Meant to Be Played" logos and the ATI counterpart???

I think people should have noticed by now that they don't mean anything in 90% of cases. I don't know how many nVidia-branded games run way better on ATI hardware. Not so many the other way around, of course, since I am on an ATI card as it is, but I am sure there are many examples of that too.

After all, 95% of the income the publishers/game developers get is from gamers, not from the video card makers.
 
Geez...

Well it's nice to know I may be able to run this game at a high IQ and resolution at a playable framerate. :)
 
Originally posted by oqvist
Why do people still believe in these "The Way It's Meant to Be Played" logos and the ATI counterpart???

I think people should have noticed by now that they don't mean anything in 90% of cases. I don't know how many nVidia-branded games run way better on ATI hardware. Not so many the other way around, of course, since I am on an ATI card as it is, but I am sure there are many examples of that too.

After all, 95% of the income the publishers/game developers get is from gamers, not from the video card makers.

TWIMTBP is literally nVidia sitting down with developers and helping them. All they ask for in return is a logo at startup or on the box. It's not like they're forcing every company to use Cg and drop compatibility with ATi.
 
Originally posted by Princess_Frosty
Funny how ATI fanboys take the piss, yet several days after the release of such games we get forums full of threads starting:

"HELP MY RADEON WONT PLAY STALKER!!!!111oneone"
"Which hacked/leaked/beta/version of ati drivers work with STALKER?!?!"
"IM HAVING MAJOR PROBLEMS WITH STALKER HELP PLZ!!!11"
"RADEON PROBLEMS NEED HELP"

etc
etc
etc

Some people will, however, learn that it's not just about the numbers, it's about the experience, a direction that [H] themselves are heading in.

And we have the same effect as with Windows... right now ATi is pretty popular. Therefore, since more people (with higher-end machines) own ATi cards, there is a greater chance that someone will have problems with one. Just because computers A, B, C, and D work with the card doesn't mean computer E will. And I also see plenty of people with nVidia problems on the [H] forum. ATi has their share of bugs, but believe it or not, nVidia probably has quite a few problems/bugs too.

I may be a fanboy of nVidia, but I don't let that impair my judgement. I'm still aware that ATi runs the latest games faster, but that's hardly the end of the story. I am, however, not going to be drawn into an argument here.

That's the biggest bull I have ever heard. I am sorry, my friend, but you ARE (not maybe) an nVidia fanboy (as we can tell from many threads), and your judgement will always be impaired. It's that simple; you cannot control that kind of thing (again, as we can tell from many threads...).

Bottom line is this: there is no reason to throw "fits" over which card company is better based on anything that's leaked. You will never know the "true" outcome until the actual game is in stores and available to buy.

Having said that, I was still very disappointed that the Volari did so badly. How does everyone else feel about it? Its drivers are a big letdown, IMO. Oh well...

-MoOfAsA~
 
Not sure if this has been mentioned (didn't see it on the boards here yet), but the STALKER benchmarks from Xbit Labs are COMPLETELY bogus... they mentioned in the review that it uses DirectX 9, BUT did they bench it with DX9?

See link here for info....
http://oblivion-lost.xu2.net/en/index.php


Kared
 
Originally posted by Princess_Frosty
I may be a fanboy of nVidia, but I don't let that impair my judgement. I'm still aware that ATi runs the latest games faster, but that's hardly the end of the story. I am, however, not going to be drawn into an argument here.

Seconded. I did no optimizing with my crappy TNT2 Ultra and got frame rates comparable to what I'm getting with my optimized-out-the-ass BBA Radeon 9600 Pro. It's not a CPU bottleneck either; I have a 2100+.

nVidia made a huge PR mistake with their "optimization" scheme, but I think in the long run they will win the public back. However, that's not helped by fanboyism from either camp.

On topic again, I'm happy that none of the cards are being spanked in the performance modes (i.e., 10x7, no AA, no AF), as that's how I usually play. STALKER has been my favorite among all of the games coming out since the big HL2 letdown of September 2003.
 
I see Frosty is upset that his card isn't doing well again :(

FYI, go read the nvnews forums. Every card has some problems in games. Time to retire the old "ATi has bad drivers" argument. It's as silly as saying AMDs run hotter than the sun. ATi's drivers are great now, and AMDs run much cooler.

But I guess when you have nothing else to hold onto...
 
Originally posted by DocFaustus
Wow, you proved my previous post with supreme efficiency. You must be proud of your idolatries

Wow, your parents must be proud you can spell "idolatries". Onya champ :p :D
 
Originally posted by Kared
Not sure if this has been mentioned (didn't see it on the boards here yet), but the STALKER benchmarks from Xbit Labs are COMPLETELY bogus... they mentioned in the review that it uses DirectX 9, BUT did they bench it with DX9?

See link here for info....
http://oblivion-lost.xu2.net/en/index.php


Kared

Like I posted. Thanks for the link.
 
If it runs better on ATI hardware while NOT using DX9, I guess nVidia is quite pissed at the STALKER developers right now.

I mean, what if nVidia sits down with them trying to optimize their game for nVidia hardware and fails this miserably?

But I can understand why nVidia needs to hold the game developers' hands, considering how the HL2 people spent five times longer optimizing for nVidia hardware than for ATI hardware.

That says it all about who the game developers are actually trying to serve: the gamers, not the video card companies. And HL2 is an ATI-sponsored game, bundled with every 9800 out there.

If only people could forget about this marketing crap. The first two nVidia-"optimized" games I had, RSC and UT2003, both ran like crap on my Ti4600 at release. Both were fixed when I got my non-optimized 9700 Pro.

And to be honest, I haven't found an ATI-sponsored game that runs significantly better on ATI than on nVidia; the difference in performance is the same whether a game is in those ad campaigns or not.

If this at least helped us get bug-free games on any specific hardware it might have been useful, but we all know that isn't the case.
 
Originally posted by @trapine
And yet again all the nVidia boys start crying when they see their legendary 5900NU DX8.1 cards getting their asses spanked by proper DX9 cards :p
Guess who's gotta upgrade again only two months after buying their cards ;)
Gotta love ATi, fanboy or not, lmao :p
Looks like the 5900NU once again owned the 9600 XT. Maybe it's the 9600 XT owners who need the upgrade. These are exactly the results I would have predicted. The 9800 Pro wins, but the 5900NU still shows it's a great value for the buck, absolutely destroying the similarly priced 9600 XT in every test by 25-50%.
 