PhysX - the Betamax of computing

heycarnut

Took a long look at this with GRAW alongside a buddy at a PC mag lab.

After the initial 'cool idea' wore off, I thought about the viability of this product (and company) in the long run. My conclusion: DOA, for all intents and purposes. This will go the same way as sound cards, only in the blink of an eye by comparison. Just look at the trend in modern game engines toward doing sound 'rendering' in software instead of requiring hardware for effects, and look at the sound card market - it's dying on the vine.

My reasons for thinking that the PhysX card is in the same boat with a bigger hole in the bottom are:

1) CPU/GPU processing power is increasing at an incredible rate. With multi-core/multi-card systems becoming more commonplace, there will be plenty of free cycles for processing such as sound and physics. Chip design progress is fueled by large, competitive companies in the GPU world. We'll likely see the same PPU core chip on the physics board for quite some time, while CPU and GPU cores move further and further up the performance curve.

2) Proprietary hardware (and arguably software). Almost always a nail or two in the coffin. Software and APIs to do general-purpose computing on the GPU, such as Brook, already exist and have been available for a few years. Havok, or any other physics engine builder, can use these tools to dramatically enhance the performance and capabilities of their products using commoditized hardware (and software). And if an engine builder is intelligent (most are), the physics hooks in the engine will abstract the underlying architecture, allowing, say, Unreal to switch from proprietary solutions like PhysX to open solutions like GPU physics very quickly and at minimal expense. If you can't offer extremely high value in features, functionality, and performance compared to other solutions, and you are marketing a proprietary, niche product, your product is doomed.

3) $300US for slower framerates? Enough said on this already.
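The abstraction layer mentioned in point 2 can be sketched in a few lines. This is a hypothetical illustration, not Ageia's or Havok's actual API: the engine codes against a backend interface, so a PPU-, GPU-, or CPU-based implementation can be swapped in without touching game code.

```python
# Hypothetical sketch; none of these names come from Ageia's or Havok's
# actual APIs.
from abc import ABC, abstractmethod


class PhysicsBackend(ABC):
    """Interface the engine codes against; implementations are swappable."""

    @abstractmethod
    def step(self, bodies, dt):
        """Advance every body by dt seconds."""


class CpuPhysics(PhysicsBackend):
    """Plain software fallback: Euler-integrate gravity on the CPU."""

    GRAVITY = -9.81  # m/s^2

    def step(self, bodies, dt):
        for body in bodies:
            body["vy"] += self.GRAVITY * dt
            body["y"] += body["vy"] * dt


def make_backend():
    """Probe for accelerators; in this sketch only the CPU path exists.

    A real engine might hand back a PPU- or GPU-backed object here, and
    the game code calling step() would never know the difference.
    """
    return CpuPhysics()


backend = make_backend()
ball = {"y": 10.0, "vy": 0.0}
backend.step([ball], dt=0.1)
print(ball["y"] < 10.0)  # True: the ball has started to fall
```

The point of the sketch is that the switch happens in one factory function, which is why an engine built this way could drop a proprietary backend cheaply.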

Just my opinions, of course, your thoughts?
 
I agree. I think Havok is on the right track with wanting to use extra CPU cores, now that multi-core is the way the industry is going, and even possibly unused GPU cycles. It makes more sense at this stage than expensive niche hardware. Physics is something developers are just starting to play with, and jumping straight into things like separate PPUs is overkill and pointless. Developers are not likely to spend a lot of time writing special code to interface with proprietary hardware and APIs for something with such a minuscule installed user base, and almost none are going to be dumb enough to make it required. The few developers who do bother with it will probably produce more of a gimmick à la GRAW than anything worthwhile, and then only because Ageia is putting money into the projects to make a few sales and justify their existence.
 
I don't think physics in a game can justify a 300-dollar price tag... For many of us, even enthusiasts, that's our limit price for graphics cards, which are a 'requirement'. The PhysX card, while it looks cool in what it can do and could potentially do, is not a requirement.
 
Nomikal said:
I don't think physics in a game can justify a 300-dollar price tag... For many of us, even enthusiasts, that's our limit price for graphics cards, which are a 'requirement'. The PhysX card, while it looks cool in what it can do and could potentially do, is not a requirement.

I very much agree.
 
Bear in mind that Ageia's middleware can already use multi-core CPUs just fine.

Most likely the PhysX card as it is now can and WILL support the MS physics API when it's out.

It's new; look at the price of the Voodoo 1 when it came out. The price will come down, give it time.

As for the slower frame rates, well, the more stuff on screen, the bigger the video card you need.

If you've got the money for a PPU, you should have the money for a beefy video card. And remember, the video cards we'll have once this takes off will be much faster than the ones we have now. If you look at games like Crysis, graphics can't get visually much better; it's the interactivity that will. The PPU + SM4 + the new geometry shader = in-game objects that act the way they should, like in the Crysis demo video.

It will just take time. I think the new Splinter Cell and Unreal 3 will be the "Quake", "Tomb Raider", and "Unreal" of the PPU; those of you who have been around for a while will know what I mean.
 
Your average Joe Somebody isn't going to spend $300 on a graphics card, let alone a PhysX card. No way it'll become standard for games. This is an elite product.
 
pincho said:
Your average Joe Somebody isn't going to spend $300 on a graphics card, let alone a PhysX card. No way it'll become standard for games. This is an elite product.


Joe Somebody didn't buy the 500-buck Voodoo 1 and 100-buck sound cards either ;) Give it time.
 
I guess, but they're still not buying those $500 graphics cards and $100 sound cards, so we'll wait and see.
 
pincho said:
I guess, but they're still not buying those $500 graphics cards and $100 sound cards, so we'll wait and see.

I think the point is that those more expensive models eventually work their way down the food chain to Joe Sixpack. Joe is never going to have the top-of-the-line GeForce 7900 GTX, nor will he have a hybrid car with power windows, but that doesn't mean the features in those models won't eventually be standard. At the same time, there's always a top of the line the average person doesn't have, even if what was once a luxury has eventually become commonplace for him.

Also on another note I'm a bit tired of endlessly comparing every single new technology to Betamax. It's almost as tired as the old Nazi accusation constantly levied by everyone online.
 
I'm the opposite: a skeptic who simmered down to a critic.

Your analogy doesn't seem right. Beta was superior to VHS, but licensing did it in; same with the Mac situation.

In the console, Havok and Ageia are both running in software mode, so there's really no difference there.

As for the slower frame rates, well, the more stuff on screen, the bigger the video card you need.

If you've got the money for a PPU, you should have the money for a beefy video card. And remember, the video cards we'll have once this takes off will be much faster than the ones we have now.

A very small number of gaming rigs have video cards close to the one used for the cell factor video (or even GRAW), among those few, even fewer will buy a PPU. Can Ageia survive in a niche market *within* a niche market?
 
I did not mean by the analogy that Beta was an inferior technology, nor that the PhysX idea of using hardware to accelerate physics is lacking, just that PhysX is DOA in the same way Beta was, the latter because of many factors including those you mentioned.
I still used an old pro Beta deck until recently...

I think that the idea, however, implemented as it is, is too late. It will be far too easy to replicate, and likely exceed, what the simple board does now with GP-GPU capabilities. Just look at what is already happening to the high-end vector/math accelerator market: these are already being supplanted by GP-GPU solutions, and this is a market with deep pockets and a need for the highest performance.

I'm thinking of setting up a 'web-bet', if it's legal, and gather $ from believers where I'll wager that the company/product will be dead within 24 months, with an API/Engine buyout at best. Could be fun, and profitable. :D

R
 
Why would the API fail, just out of interest?

It's free, MS has decided to roll it out for the 360, and the PS3 has adopted it. It also utilises multiple cores and offers more features than Havok's vanilla API does, such as software cloth and so on. How can you claim straight out that Havok's "better", without any comparison?

I don't see the API dying anytime soon, and as long as it's around, the hardware has a reasonable chance.
 
Betamax failed because Sony wanted unreasonable licensing costs to use the format, while VHS was free. Ageia doesn't make you pay for their PhysX SDK or API.

Make analogies better.
 
Nomikal said:
I don't think physics in a game can justify a 300-dollar price tag... For many of us, even enthusiasts, that's our limit price for graphics cards, which are a 'requirement'. The PhysX card, while it looks cool in what it can do and could potentially do, is not a requirement.

With SLI and Quad SLI, the price spent on a high-end gaming machine's video processing hardware alone far exceeds 300 dollars.

If you don't want to spend that much money on the product now, then don't, I will be buying some kind of physics hardware in the coming months whether that be a PPU or another video card for an SLI rig which has the potential to be used as a dedicated physics processor.

I think some of you underestimate their business model here: they're giving away the actual physics engine license to developers, which would usually cost a LOT otherwise. Even if games just use the engine for CPU-driven physics, that's a start.

I'm helping develop a Reality Engine title, and we're already using their physics engine. We're looking at adding extended effects for people with PPU cards; lots of things are possible that are purely cosmetic but which greatly enhance the visuals.

Of course we're going to have all the enthusiast gamers scrambling for games with PPU-only content in them, and if we're one of only a few games using it, then we have a fairly unique selling point. Plus, since we've already saved money by not having to buy a physics engine license, we can put that money towards other things.

If you expect Ageia to flop simply because the hardware is expensive to start with, then think again. Most things do start off expensive; when the price settles down and the technology improves and matures, I think we'll see the market for it boom.
 
heycarnut said:
3) $300US for slower framerates? Enough said on this already.
I'm sorry, but this point is just stupid IMHO. If you put more on the screen and turn up the visual quality of a game, what do you expect to happen? It's not the PPU's fault; it's your GPU trying to keep up with all those new things it's being asked to draw.
 
Unknown-One said:
I'm sorry, but this point is just stupid IMHO. If you put more on the screen and turn up the visual quality of a game, what do you expect to happen? It's not the PPU's fault; it's your GPU trying to keep up with all those new things it's being asked to draw.

A very small number of gaming rigs have video cards close to the one used for the cell factor video (or even GRAW), among those few, even fewer will buy a PPU. Can Ageia survive in a niche market *within* a niche market?
 
I think it's too early to judge the PhysX card. After all it just came out: and a brand new technology to boot. Hell, there's barely any software that can take advantage of it yet. One thing is for sure though: the PhysX card is way ahead of its time. Physics are very primitive in games so far, so at least they're pushing for advancement - even if they ultimately flop.

In the past, I remember people saying that the increase in processor speeds would ultimately make dedicated video cards obsolete. They were either not too bright, or the future just didn't pan out as they thought. Perhaps CPU speeds (and increases in core counts) will render the PPU unnecessary - though I guess this depends on whether general-purpose CPUs are suited to large amounts of physics calculations. If not, then perhaps Ageia just pioneered something that would have been necessary in the long run anyway.

I fail to see why people are complaining about the $300 price tag. The people who would be interested in buying this technology now are early adopters and the hardcore who have already spent many hundreds of dollars on a top-notch video subsystem. Needless to say, the PhysX isn't aimed at bargain shoppers. That said, if the card becomes more mainstream, I have no doubt that it will become less expensive (or they'll release a version that is).


This will go the same way as sound cards, only in the blink of an eye in temporal terms. Just look at the trend in modern game engines to do sound 'rendering' in software instead of requiring hardware for effects, and look at the soundcard market - it's dying on the vine.
I don't really follow your soundcard analogy. Are you talking about the increasing usage of motherboard-integrated sound? Lots of people still use dedicated sound cards for some reason or another (hardware acceleration, better sound quality, more features, niche product), so soundcards are hardly dead.
 
Sly said:
A very small number of gaming rigs have video cards close to the one used for the cell factor video (or even GRAW)...
And in that respect, PhysX cards are ahead of their time. Though, who's to say we won't be able to turn off eye-candy physics and use only gameplay physics to save our video cards from choking? So what if I'm unable to actually render the awesome water the card is calculating, as long as the ball I drop in it acts as it should, because the internal calculations ARE there?

In that case, it's no harder for the video card than it has been with shader water and your average poly-count sphere; the PhysX card is just more accurately telling the video card where to render the sphere.
 
The reason PPUs are totally incomparable to GPUs is that physics programming doesn't scale. Unless there's something I'm missing, they're going to have to program the physics in games on an all-or-nothing basis, i.e., either the game uses the enhanced physics or it doesn't. You can run HL2 on everything from a Riva TNT2 to a 7900 GTX at a wide array of resolutions and quality levels, but how do you program the physics in a game to run like that? If you can somehow scale your physics equations to run on a wide range of physics processors, if the PPU market somehow proliferates to the level that the GPU market has, that's basically an acknowledgement that the physics are superfluous and gimmicky, and not the "experience enhancement" they're marketing it as now. It's a losing battle, IMO.
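The scaling objection can be made concrete. Here is a hypothetical sketch (every name in it is invented) of what "scalable physics" would look like in practice: tiered presets where only cosmetic simulation varies with hardware, while gameplay-critical physics stays identical on every tier. Which, as the post argues, is an admission that the scalable part is the superfluous part.

```python
# Hypothetical "physics detail" presets; cosmetic load scales with the
# hardware tier, gameplay-critical physics does not.
PHYSICS_PRESETS = {
    # tier: debris particle count, cloth sim on/off, solver substeps
    "low":    {"debris": 50,    "cloth": False, "substeps": 1},
    "medium": {"debris": 500,   "cloth": True,  "substeps": 2},
    "ppu":    {"debris": 10000, "cloth": True,  "substeps": 4},
}


def plan_frame(tier):
    """List what gets simulated this frame on a given hardware tier."""
    preset = PHYSICS_PRESETS[tier]
    tasks = ["rigid_bodies"]      # gameplay physics: identical everywhere
    if preset["cloth"]:
        tasks.append("cloth")     # purely cosmetic, safe to drop
    tasks.append("debris_x%d" % preset["debris"])
    return tasks


print(plan_frame("low"))  # ['rigid_bodies', 'debris_x50']
print(plan_frame("ppu"))  # ['rigid_bodies', 'cloth', 'debris_x10000']
```

Everything that can be turned down without breaking the game is, by construction, eye candy; the rigid-body simulation that affects outcomes cannot vary per machine in a multiplayer or scripted game.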
 
heycarnut said:
I'm thinking of setting up a 'web-bet', if it's legal, and gather $ from believers where I'll wager that the company/product will be dead within 24 months, with an API/Engine buyout at best. Could be fun, and profitable. :D

R

I bet you will have a PPU installed in your rig within the next 24 months, hoping nobody remembers you used to be one of the anti-PPU party-poopers!!

:p
 
Well, you can probably make the equations simpler, and at the same time more vague. Or perhaps turn off some of the physics features, like water ripples.
 
Unknown-One said:
And in that respect, PhysX cards are ahead of their time. Though, who's to say we won't be able to turn off eye-candy physics and use only gameplay physics to save our video cards from choking? So what if I'm unable to actually render the awesome water the card is calculating, as long as the ball I drop in it acts as it should, because the internal calculations ARE there?

In that case, it's no harder for the video card than it has been with shader water and your average poly-count sphere; the PhysX card is just more accurately telling the video card where to render the sphere.

That would be akin to buying a PPU+5600 instead of a 7900 for your gaming rig.
 
Upcoming hardware, such as quad-core CPUs and better GPUs, may be more than capable of doing the same things as the PhysX card, but let's face it: we're still going to want the maximum output from that hardware, just like we do with our present systems. We're still going to overclock and such. Also realize that games will be much more demanding of that very same hardware too, more than ever before, outside of physics and graphics.

We have built-in sound now, yet a lot of people still buy add-in sound cards. Why? To take the load off the CPU. Plus, the add-in cards are better than the built-in ones. Granted, Nvidia did have a good built-in sound system once, but that's gone now.

As for not buying the PhysX card because of cost and not really needing it - well, do we really need SLI? People are buying second video cards, aren't they? We're buying dual- and multi-core processors, as well as 32/64-bit processors, when very little software uses them yet.

Besides, either way: with Ageia, you'll have to buy the PhysX card; with Havok, you'll need a second video card, because one video card won't be able to handle it. And when using an SLI system in a game, performance will be affected, with some cycles taken away for processing the physics.
 
GoHack said:
Besides, either way, w/Ageia, you'll have to buy the PhysX card. With Havok, you'll need a second video card.

PPU = Paperweight
Extra GPU = Graphics Overkill

Which to choose...
 
Sly said:
PPU = Paperweight
Extra GPU = Graphics Overkill

Which to choose...


Maybe for you, but I'd take both if I had the cash.
 
I see them tanking within a few years, same with those X-Fi 64bit chips.
 
Will they flop? Doubtful. But will they thrive? That's the real question.

To give you an idea of where I see the physics processor going, here's how the situation plays out:

Most people are likening the PPU to the GPU, which is a bad comparison. When programming physics for a game, you only have to decide how MUCH you will calculate, not *how* you will calculate it. When programming graphics, developers are constantly coming up with new ways to simulate a given effect on a graphics card. Physics equations are simple; the only reason Ageia's PPU can accelerate them is its incredibly parallel architecture. GPU development typically works towards two ends: first, produce higher frame rates with existing graphics engines; second, improve the hardware capabilities of the GPU itself so that new effects can be created (older GPUs won't be able to produce such an effect because it is in and of itself new). To give you an example of how that relates to physics, take air resistance. Most game physics engines ignore air resistance, not because they haven't figured out a way to accurately calculate it (as is the case with graphics), but because they can't do it *and* fit in calculating the rest of the game.
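To make the air-resistance example concrete, here is an illustrative sketch (the names and numbers are mine, not from any shipping engine). The quadratic drag formula itself is trivial for one body; the expense described above is evaluating it for thousands of bodies every frame, exactly the kind of embarrassingly parallel work a PPU targets.

```python
# Illustrative quadratic drag: F_d = 1/2 * rho * v^2 * C_d * A.
import math

AIR_DENSITY = 1.225  # kg/m^3, roughly sea level


def drag_force(speed, drag_coeff, frontal_area):
    """Magnitude of the drag force opposing the motion."""
    return 0.5 * AIR_DENSITY * speed ** 2 * drag_coeff * frontal_area


def apply_drag(body, dt):
    """Decelerate one body by its drag force over one timestep."""
    speed = abs(body["v"])
    if speed == 0.0:
        return
    decel = drag_force(speed, body["cd"], body["area"]) / body["mass"]
    # Clamp so drag can slow the body but never reverse its motion.
    body["v"] -= math.copysign(min(decel * dt, speed), body["v"])


crate = {"v": 20.0, "mass": 10.0, "cd": 1.05, "area": 1.0}  # tumbling cube
apply_drag(crate, dt=1.0 / 60.0)
print(19.0 < crate["v"] < 20.0)  # True: slowed a little, still moving
```

One body per frame is nothing; the argument above is that doing this, plus collisions and constraints, for an entire scene is what starves the CPU.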

Given that, at what point will we have exceeded our needs for physics processing? We're nowhere near that point with graphics - how many years of development has it been since the first 3D accelerator cards came out? However, we're very close with revision 1 of the PPU (at least in single player). If Ageia does end up releasing the cards the way high-end sound cards are released (not something you need to upgrade every 6 or 10 months), the price likely won't come down much. If you're not selling as much of your product, you have to sell it for more. I wouldn't expect to see the card available for less than $149 while it's still the "current" PhysX card.
 