Will you buy a PhysX card when it's released in May?

Building gaming machines nowadays is really starting to get expensive, and I won't be buying one.
 
Why on earth would I? So the barrels in Unreal 2007 break up into more realistically moving pieces? Is that worth $300?

Until a spectacular new game comes out that relies on ultra-realistic physics to create a whole new plateau of immersion, the idea of dedicated physics processors is a silly gimmick and a waste of money.

It will never be a mainstream commercial success unless a company like Id Software comes along to redefine the whole idea of computer gaming, like they did with 3D-accelerated shooters. If such an app does not come along, then nobody (except the [H]ardest 1% of computer users, like the people in this thread) will buy the hardware. If nobody has the hardware, game devs won't design for it and, just like that, Ageia is dead.
 
I am very excited by the possibilities that the PPU brings to computers.

Will I buy one in May? Probably not. I'll read some reviews first, and I could see buying one in December if it looks like it's worth it, but not right away in May.

==>Lazn
 
AceTKK said:
Why on earth would I? So the barrels in Unreal 2007 break up into more realistically moving pieces? Is that worth $300?

Until a spectacular new game comes out that relies on ultra-realistic physics to create a whole new plateau of immersion, the idea of dedicated physics processors is a silly gimmick and a waste of money.

It will never be a mainstream commercial success unless a company like Id Software comes along to redefine the whole idea of computer gaming, like they did with 3D-accelerated shooters. If such an app does not come along, then nobody (except the [H]ardest 1% of computer users, like the people in this thread) will buy the hardware. If nobody has the hardware, game devs won't design for it and, just like that, Ageia is dead.

They have already implemented their engine in many upcoming games, and you will be using that engine whether you have a physics card or not. There might be some eye-candy stuff that gets turned off, but the physics of the game will still be controlled by the PhysX engine, hardware-accelerated or not. 3DMark06 is an example of a 3D app/game that uses the Ageia PhysX tech; obviously you don't need a PPU to run 3DMark06.
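
To illustrate what I mean, here's a rough sketch in Python with made-up names (this is not the actual Ageia SDK, just the idea): the gameplay physics gets simulated either way, and the PPU only buys headroom for the extra eye candy.

Code:
# Hypothetical illustration only -- these names are made up, not the real Ageia/NovodeX API.
# The point: the gameplay simulation is the same either way; a PPU just buys headroom
# for extra, non-gameplay eye candy.

def detect_ppu():
    """Stand-in for a driver query; pretend no PPU is installed."""
    return False

def make_scene(hardware_available):
    # The gameplay-relevant physics is always simulated, hardware or not.
    scene = {
        "backend": "ppu" if hardware_available else "cpu",
        "rigid_bodies": ["crates", "ragdolls", "vehicles"],
    }
    # Only the cosmetic extras scale with the hardware.
    scene["debris_particles"] = 10000 if hardware_available else 500
    scene["cloth_and_fluids"] = hardware_available
    return scene

if __name__ == "__main__":
    print(make_scene(detect_ppu()))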
 
Yes, whenever they release a PCIe 4x model, and when more games support it.
 
I find it funny how people who spend hundreds upon hundreds of dollars to get a few extra frames balk at shelling out $250 for something that will add some pretty cool stuff to games.
 
Oh, hang on, it's set to put my 3DMark06 score up. Looks like I'm buying one then :)
 
Eastcoasthandle said:
Remember, the whole purpose of PhysX is to bring a new level of realism into gaming. I have to agree that Ageia needs a larger number of game titles under its belt, but I believe that will happen in due time.

However, if you had a choice of:
A. An Ageia PhysX card
B. A CF/SLI motherboard with two video cards simulating a PPU

Which would you choose? Oh, it's easy to choose B when you already have a CF/SLI setup, but how much better would it be if you added an Ageia PhysX card? That remains to be seen. Before I take the plunge I need to see that card benched in 3DMark06 first. I want to know how much of a contribution this card makes to overall system performance. If all it does is generate added FX that you would normally not see, then it's really a waste of money.

I still do not find it necessary to buy an SLI/CF setup in order to get the same level of physics that one card can do! It's inefficient, costly and ridiculous, IMHO. ATI/Nvidia's plan to do the same appears to require you to buy SLI/CF at a premium, which has already turned me off! Here is a link that explains more about it from Nvidia's POV.

I read it and they made some solid points (but only if they used a GPU/PPU combo on the same PCB). Some of us have been guessing for months now that ATI/Nvidia would put a separate GPU and PPU on the same PCB, but that does not appear to be the case. What Nvidia appears to be doing is nothing more than a marketing gimmick to make you buy two video cards... nothing more, nothing less. I say leave that aspect of gaming alone. Those that go SLI/CF do so out of luxury, not necessity. Once you make SLI/CF a necessity, more people will flock to Ageia for $250 before spending $500+ on a CF/SLI setup ($700+ if you don't already have an SLI/CF board).

I think you have a good point. I am not impressed with ATI and Nvidia saying "well we can do that too, just buy an extra one of our cards!" I'd rather buy a single high-end card and a cheaper physics card.

Also, I will buy one if two superficial conditions are met.
1. It lights up.
2. I get to act like a snob to all the people who don't buy one.
J/k (kinda) :p
 
I really don't like the fact that they let Dell have it before everyone else. Also, I don't like the idea of it only working with their software. It needs to work with some set standard like DX or OGL.
 
GotNoRice said:
They have already implemented their engine in many upcoming games, and you will be using that engine whether you have a physics card or not. There might be some eye-candy stuff that gets turned off, but the physics of the game will still be controlled by the PhysX engine, hardware-accelerated or not. 3DMark06 is an example of a 3D app/game that uses the Ageia PhysX tech; obviously you don't need a PPU to run 3DMark06.


I know, and that's the problem. If shelling out $300 only makes the game look a little better (hyperbolized by my barrel example above) then what's the point? The mainstream won't buy it. And since the mainstream won't buy it developers aren't going to support it in any truly meaningful capacity. They can't, because they can't invest the money and time into making a truly revolutionary physics-based gameplay experience if 99% of the audience can't run it. This technology needs a killer app, and they don't have one. Until they do there is no point in buying into the hype.
 
I won't buy it if it's over $200,

and I definitely won't buy it within a year of release.
 
EA releasing a patch for PhysX support?!?!?! :rolleyes:

/start rant
Are you kidding? They don't even support widescreen monitor resolutions that have been popular for a couple of years, let alone a brand-spanking-new concept like physics-acceleration hardware. They don't even fix a lot of their own software bugs that have been in BF2 forever.
/end rant

Don't get me started! :)
 
Besides... what money is there to be made in patching games that people have already bought?
 
I like the idea. That way the CPU can be used more and more for AI, and the GPU can concentrate on what it was made for: rendering graphics. I know that people on 24" and higher displays like having all the graphics power they can get, and I, for one, want the best images possible along with the most realistic combat; both, not one or the other. I can't wait until the first RTS set in a modern timeline is released with PhysX support (Rise of Legends is looking nice as well).
 
I think I'd prefer a separate PPU to having one of my two GPUs act as a physics accelerator. I don't want to sacrifice the graphics performance I have right now in order to get better physics in games. I'm glad that you'll be able to choose to turn off the physics acceleration in favor of graphics power in Nvidia's implementation, and that's what I'll probably do. However, I do still want boosted physics in the games that support it, and hopefully I'll be interested in the games that support Ageia's method. Perhaps HavokFX and Ageia will become two rivals in the same vein as DirectX and OpenGL.
 
I'm interested in it, provided a few things:

1. It has to hit below 200 bucks. I'm a casual gamer and don't really see the point in spending more than that.
2. Enough games need to support it. How many? Hard to say, but if three or four games I'm interested in support it well and I know a few more are on the way, I'd be interested.
3. This one comes down to the game developers more, but the copy protection on games is making me really mad. Steam sucks, and I got really angry when I couldn't play CS:S because their server was too busy to let me onto the clan server that I pay for. I also loved the Splinter Cell games but will not buy another one for my computer after having a bunch of problems burning CDs/DVDs on my notebook because of the copy protection they installed that fucks everything up. At this rate console games are looking better and better. It really burns me that I have to jump through a bunch of hoops when a pirate doesn't need to. That brings me to another point.
4. It needs to work pretty much out of the box. You know how SLI had some weird issues when it came out? I don't really want to mess with that kind of thing for an add-on card.

One of my big fears is that MS will make their own standard that will render the card worthless. If not MS, then ATI and Nvidia will make their own standards that are not compatible. I mean, remember back in the day when you had Glide and then some other companies' 3D stuff that was cool but worked on like 5 games total? The best option would be PhysX being licensed by MS and others, making it a standard, or having a standard built into DirectX that most people will use.

Another fear is that, since there is pretty much no option for notebooks, which are taking more and more of the market, the spread of this will be limited.

As for me, I will be waiting a little while, but I am looking forward to getting one.
 
No.

The gaming industry is all about reducing cost. When workstation cards cost too much, card makers took advantage of reduced RAM prices to ship simplified 3D rendering chipsets. The chipsets were short a few features compared to their professional counterparts, and relied on the CPU to do transform and lighting, but they were affordable.

Over time, the feature set of the cards has grown to match, then exceed, that of your typical professional workstation card of the mid '90s, all because it became affordable. 32-bit color, on-chip T&L, anti-aliasing, anisotropic filtering - early 3D chipsets couldn't handle such things. Now we have features even professional cards of the era didn't have: pixel shaders, vertex shaders, multiple monitors, assisted video decoding, and massive amounts of horsepower and memory bandwidth.

This Ageia processor is the first step in bringing physics processing out of the professional world and into the consumer world...but it's not the same as when 3DFX released their Voodoo Graphics, PowerVR released the PCX1, and Rendition released the Verite. Hardware 3D graphics were flashy, and the hardware assistance was many times better than software...but hardware-assisted physics, especially when it is not critical to gameplay, doesn't make such a big bang.

Not only is it harder to justify the step from CPU-driven physics to a dedicated processor, but the video card market is MUCH different. Now, instead of dozens of video chipset players (like at the start of the 3D graphics revolution), there are only a handful, and the top 3 hold the vast majority of the industry.

The graphics card makers know the gaming industry is all about reducing costs, and yet another peripheral means one more card to budget into a system, which could mean less money spent on graphics cards. Thus we have Nvidia's announcement last week, meant to fuel uncertainty about Ageia's separate-card solution.

In the long run, the gaming industry is all about cost, and the key to reducing that cost has always been integration with the graphics chip. Nvidia's current physics support is a stop-gap, but I fully expect them to build much more thorough support into future chipsets. ATI will do the same. Expect to have your single-chip GPU + physics engine in a year or two. In the meantime, people with cash to blow can buy an Ageia card, thus paving the way for the rest of us.
 
I'll buy one in 3 years,

when I upgrade to AM2 and hopefully an even better version of CS will be out :)
 
ATI claims its offering is superior. They say they can dedicate a card to physics even if it's not in CrossFire. You could use ATI's technology even on an SLI mobo, or something like the ASRock Dual SATA2 that happens to already have an additional video slot. I don't know about you, but I like the idea of buying a cheap budget ATI card, or using one that you previously owned, instead of buying a 300 dollar card. The X1600 XT is only $150 and it pretty much doubles the PhysX card in terms of RAM and core clock, etc.
 
^ ATI is just trying to steal some of Ageia's and nVidia's thunder. ;)

I'll be buying an Ageia card fo' sho' when they are released. :)
 
magoo said:
Ok. Ageia has PhysX, currently with a handful of games. Are one or two games worth that extra investment???? I think not.
nVidia and ATI will most likely get on board with HavokFX, which is currently present in a multitude of game titles (not the FX part, but the Havok physics engine), and it costs me nothing extra to have the graphics card produce physics effects... if I want them.

I think I'll pass.

I don't see why you say that... remember, Half-Life 2 and Doom 3 got plenty of people to buy new video cards all by themselves.

BTW: I was reading about ATI's implementation of physics last night... it looks to be a LOT better than Nvidia's. I can see buying an X1600 to stick in the second PCIe slot just for physics.
 
Bling said:
ATI claims its offering is superior. They say they can dedicate a card to physics even if it's not in CrossFire. You could use ATI's technology even on an SLI mobo, or something like the ASRock Dual SATA2 that happens to already have an additional video slot. I don't know about you, but I like the idea of buying a cheap budget ATI card, or using one that you previously owned, instead of buying a 300 dollar card. The X1600 XT is only $150 and it pretty much doubles the PhysX card in terms of RAM and core clock, etc.

That's pretty cool; that way, when you upgrade, you can put your old card to use.
 
Bling said:
ATI claims its offering is superior. They say they can dedicate a card to physics even if it's not in CrossFire.
It would be awesome if any ATI SM2 card could be dedicated to physics, like a cheap PCI-E X300 or X600. :p
 
pxc said:
It would be awesome if any ATI SM2 card could be dedicated to physics, like a cheap PCI-E X300 or X600. :p

Any ATI SM 3.0 GPU can do it, X1300 etc.

This is possible:

1st Card ) An X1900 XTX doing your 3D

2nd Card ) An X1300 aiding with physics
 
I don't think so. Maybe later when games start to support it.
 
The only bad thing I can think of about the Ageia card, other than price, is the online experience. It won't really add much, it seems, because I find it hard to believe that my internet connection can handle telling 16 other people that I have 10,000 pieces of garbage spinning around my belly. Maybe it will work for subtle stuff like some smoke and explosions, but when you're talking about 300 boxes, 2,000 paint buckets, a few huge pipes, and then the 12 other people on the server, you're going to really tax that internet connection.
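
Just to put rough numbers on that (pure back-of-the-envelope, and the per-object size is a guess on my part):

Code:
# Back-of-the-envelope only: the per-object byte count is a guess, not a measured figure.
objects = 300 + 2000      # the boxes + paint buckets from the example above
bytes_per_update = 24     # say ~12 bytes position + ~12 bytes orientation, uncompressed
tick_rate = 20            # network updates per second
players = 15              # everyone else on a 16-player server

per_client = objects * bytes_per_update * tick_rate   # bytes/sec to one client
total = per_client * players                          # if the host sends to everyone

print(f"{per_client / 1024:.0f} KB/s per client, {total / (1024**2):.1f} MB/s total")

No home connection is pushing that, which is why I'd expect all that debris to stay client-side eye candy rather than synced gameplay.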
 
Brent_Justice said:
Any ATI SM 3.0 GPU can do it, X1300 etc.

This is possible:

1st Card ) An X1900 XTX doing your 3D

2nd Card ) An X1300 aiding with physics
or a 7800GT and an X1300 :) I don't want to get an ATi card just to do that ;)
 
I vote that this thread should have had a poll too. As far as purchasing one, hellz no, wtf is it gonna be good for? I don't have money to waste on a useless piece of hardware. I'll wait till it goes mainstream and there are actually games that support it and use it. Ha, maybe at the rate Vista is going I'll need it by then. :p
 
HELL NO! And I'm disappointed [H] has given in and made a freakin' sub-forum for this crap.
 
I will be buying it for Unreal 07, Vanguard, and to future-proof my new machine.

I am just kinda pissed that it is not PCIe (just like the X-Fi, which also pissed me off). Also, now I have to deal with another fan and another card that needs a power connection.

Doesn't PCIe supply more power to cards than PCI??
 
mike_j_johnson said:
I will be buying it for Unreal 07, Vanguard, and to future-proof my new machine.

I am just kinda pissed that it is not PCIe (just like the X-Fi, which also pissed me off). Also, now I have to deal with another fan and another card that needs a power connection.

Doesn't PCIe supply more power to cards than PCI??

There's no such thing as "future-proof."
 