Nice readup:
http://techreport.com/articles.x/17815/1
And this is exactly why we design hardware-agnostic 'translation layers' to hide those differences from the programmer. In your example it might be a compiler or virtual machine; similar layers are available for GPGPU: OpenCL, BrookGPU (which is definitely not proprietary: it's BSD-licensed and runs through D3D, OpenGL, or on the main CPU, about as un-proprietary as it gets), and DirectCompute (the most proprietary of the three). This stuff is poised to make it big in the next couple of years: it's been out long enough that it's stable, most of the hardware out there supports it, and there's plenty of example code and expertise around now. The API is not owned by any of the hardware vendors. And all code is 'proprietary' if you really want to nit-pick: x86 code will not execute on a PowerPC or Cell processor.
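The translation-layer idea above can be sketched in a few lines: the caller asks for one high-level operation, and a dispatch table routes it to whichever backend the machine actually has, the way OpenCL or BrookGPU fall back from GPU to CPU. This is a toy illustration with invented names, not the real OpenCL API:

```python
# Hypothetical sketch of a hardware-agnostic "translation layer":
# one high-level call, dispatched to whatever backend is available.

def cpu_vector_add(a, b):
    """Reference CPU fallback path."""
    return [x + y for x, y in zip(a, b)]

# A real layer would register a GPU backend here when one is detected.
BACKENDS = {"cpu": cpu_vector_add}

def vector_add(a, b, preferred=("gpu", "cpu")):
    """Try backends in preference order, hiding hardware differences."""
    for name in preferred:
        if name in BACKENDS:
            return BACKENDS[name](a, b)
    raise RuntimeError("no compute backend available")

print(vector_add([1, 2, 3], [4, 5, 6]))  # no GPU registered, so CPU: [5, 7, 9]
```

The programmer writes against `vector_add` and never cares which processor ran it; that indifference is the whole point of the layer.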
But in order for the player to interact with it, it must be dynamic, and thus must be simulated. The Source engine has a system, separate from Havok, for displaying seemingly complex physics, but those effects are non-interactive. When gameplay demands interaction, only dynamic physics simulation will suffice. It has little to do with accuracy.
Technically speaking, any non-interactive game physics is simply animation.
PhysX offers little in the way of interaction; many developers have flat-out turned down PhysX simply because making the effects relevant to the game world causes a big slowdown in processing.
Gabe at Valve speaks about this in several interviews. Sure, you can calculate an explosion with PhysX and get the pieces to bounce about, but how do you get the AI to interact with that? How do you make decisions in the game logic based on the results of a PhysX simulation? You need to send a load of data back over the PCI Express bus to the CPU, and in this respect it's a physics decelerator: making the CPU wait for physics results basically doesn't work.
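The stall being described can be modelled abstractly: if game logic blocks on this frame's physics results, the two costs add up; if it consumes last frame's results while the next frame's physics computes, they overlap, at the price of one frame of latency. A toy sketch with invented time units, not real GPU measurements:

```python
# Toy model of the readback problem: blocking on physics results
# vs. pipelining one frame behind. Costs are invented time units.

PHYSICS_COST = 5   # simulate + copy results back across the bus
LOGIC_COST = 2     # AI / game-logic work that needs those results

def frame_time_blocking():
    """AI waits for this frame's physics, so the costs serialize."""
    return PHYSICS_COST + LOGIC_COST

def frame_time_pipelined():
    """AI runs on last frame's results while physics computes the next."""
    return max(PHYSICS_COST, LOGIC_COST)

print(frame_time_blocking(), frame_time_pipelined())  # 7 vs 5
```

The pipelined version is why effects-only physics is cheap, and why gameplay physics that the AI must react to immediately is the hard case the quote is complaining about.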
Look at something like the physics engine in Ghostbusters: you blow a room to pieces and the AI can still dynamically navigate through the carnage so they don't get stuck. Ghostbusters does physics on the CPU and makes use of multiple cores.
And it's still a piss-poor "pseudo"-fix... fragments, like I stated before, disappear into thin air after 10 seconds. Why must people always omit the facts when bashing PhysX?
Dynamic pathfinding is a brilliant step forward. Quite frankly, we've put up with bad pathfinding for years, where physics objects would just be invisible to the AI and they'd keep running into them over and over.
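The core of dynamic pathfinding is just re-planning over a world representation that physics debris can modify. A toy sketch (a BFS over a grid where debris blocks cells; not any shipping engine's actual navigation code):

```python
from collections import deque

def find_path(grid, start, goal):
    """Shortest path by BFS on a grid; debris cells (value 1) are blocked."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # doubles as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # walk the chain back to reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no route through the debris

# A physics event drops debris mid-level; the AI simply re-plans around it.
grid = [[0, 0, 0],
        [1, 1, 0],   # debris row with one gap on the right
        [0, 0, 0]]
path = find_path(grid, (0, 0), (2, 0))
```

When an explosion marks new cells as blocked, the AI re-runs the search and detours instead of walking into the wreckage forever, which is exactly the failure mode described above.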
PhysX has potential; it just hasn't been met yet.
Take the new Batman game for example: the effects have been totally exaggerated. You could do something very similar with little performance hit without PhysX; sure, it wouldn't interact as well or look as good, but it would work.
I keep hearing this... and it makes no sense.
If it's that easy, why don't we see games doing it?
"I keep hearing this... and it makes no sense. If it's that easy, why don't we see games doing it?"

Wait, what? Games have been rendering cloth, steam, destructible environments, and sparks for over a decade.
Because, it seems, the TWIMTBP tag stops it.
I know the PhysX effects will be more realistic with less performance hit, and I accept that, but that doesn't mean similar effects can't be done.
I guess the latest Batman debacle shows the developers do have a bias towards TWIMTBP, given that they let Nvidia create AA support for the game that doesn't work with ATI. Yes, ATI could have created their own support, but the developers should never have agreed to terms where one company gets an advantage over another; it's all because Nvidia sponsored the game. Basically, if it were a neutral game, this never would have happened.
Wait, what? Games have been rendering ANIMATED cloth, steam, destructible environments, and sparks for over a decade.
"Fixed..."

And it's very similar with little performance hit and doesn't use PhysX. Those were the defined conditions, which were met. Where's the problem?
"No interaction... it's stagnation."

Doesn't matter; it meets the proposed requirements. It's very simple to provide excellent, realistic environmental effects, because real physics is easy to predict and model. You can't constantly change the variables whenever I prove a point just because it disregards yours; otherwise I won't waste my time, and you can continue to have a PhysX pity party.
The fact is that while GPGPU will have an effect, it gets talked about way too much, especially by those in the Nvidia love camp, almost as if that makes up for the lack of any actual product.
In reality, a bad or good game will neither lose nor gain much by having such a feature, because in the end a good game is a good game, and for many that has always been about playability. In fact, some of the best-loved games are rarely the most impressive in graphics terms.
The most important word in this whole topic is the third word of the topic name: 'could'.
"Then ask AMD to provide it for you; you are paying enough for their HW."

Sounds like you're the only one with sour grapes. Seriously, if PhysX were as beneficial as you think it is, why are you one of the only ones defending it? Why would this thread need to exist? Didn't you post a link showing nVidia cards holding the vast majority of the market? If GPU PhysX is so prevalent, and so many people own PhysX-capable cards, why do you still need to try to convince people? Shouldn't the PhysX 'revolution' be self-sustaining by now?
Now if you step out of ATECH fantasy land and into the real world, you'll see the truth. The PhysX API is widely used, yes, but only the software side. GPU PhysX has not been widely adopted, and many of the games that do have it have exaggerated effects, as has already been pointed out. Sadly for you, that doesn't appear to be changing. AMD can't supply their latest generation fast enough; their market share is growing and nVidia's is shrinking. If nVidia were smart, they'd drop the prices of their current cards. Sure, they may take a loss, but they'd also lose far less market share, which means everything if you're trying to swing the market toward a proprietary PhysX solution.
I don't think anyone is against PhysX; I think people are against a closed, non-standardized solution. Yes, that's right: we want to have our cake and eat it too, and why not? We're paying enough for the hardware.
"...you need to send a load of data back through the PCI Express bus to the CPU and in this respect it's a physics decelerator..."

I'd like to see the numbers to back this sort of statement up before I assume it's accurate. I do find it somewhat difficult to believe that PhysX-accelerated destructible environments, to use that as an example, would cause such severe slowdowns because of limitations in PCIe bandwidth or latency (of which there is very little) or because of waiting for the CPU to send back data (when your average game may only utilize around 40% of a quad-core CPU).
"Fixed..."

UT2003 (UE2) had fully-simulated cloth used for flags and banners and whatnot. That was around seven years ago.
"UT2003 (UE2) had fully-simulated cloth used for flags and banners and whatnot. That was around seven years ago."

I don't exactly recall that running at playable FPS. I remember seeing the tech demos, and they were closer to slide shows than playable.
"If you have not experienced PhysX, I find it difficult to see how you can comment on it with any authority."

Who said I hadn't?
"I don't exactly recall that running at playable FPS. I remember seeing the tech demos, and they were closer to slide shows than playable."

They were used in a handful of the stock maps. They didn't interact with the player, but they were dynamically simulated.
"No one. You are just making a terrible comparison of non-interactive CPU physics with interactive PhysX."

I made no comparison. I suggest you try to follow the discussion more adequately.
"There is no comparison. Just contrast."

I've really had no quarrel with anything you've said in your past posts, but this statement seems to solidify the ridiculousness of your apparent position on PhysX. Of course there is room for comparison.
"I made no comparison. I suggest you try to follow the discussion more adequately."

Sure, like Pong is a video game and Crysis is a video game.
Simple example: non-player-interactive cloth physics simulation versus player-interactive cloth simulation. The latter is superior and more believable. The latter was not realistically feasible on older-generation hardware. They are both, however, cloth physics simulations. Simple, very direct comparison, yes?
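That comparison can be made concrete with a toy mass-spring point integrated with Verlet, the standard scheme for game cloth. The key property: interaction is just another force added into the same code path, whereas a canned animation ignores it. This is a one-point sketch, not any engine's cloth solver:

```python
# Toy Verlet-integrated "cloth point" (vertical axis only, for brevity).
# An interactive simulation and a non-interactive one run identical code;
# the only difference is whether an external force is ever fed in.

GRAVITY = -9.8
DT = 1.0 / 60.0

def verlet_step(y, y_prev, extra_force=0.0):
    """One Verlet step; any interaction just adds to the acceleration."""
    accel = GRAVITY + extra_force
    return 2 * y - y_prev + accel * DT * DT, y

# Non-interactive run: the point only ever feels gravity.
y = y_prev = 10.0
for _ in range(60):
    y, y_prev = verlet_step(y, y_prev)

# Interactive run: same code, but a "player push" is applied mid-flight.
yi = yi_prev = 10.0
for step in range(60):
    push = 50.0 if step == 30 else 0.0   # hypothetical upward shove
    yi, yi_prev = verlet_step(yi, yi_prev, push)
```

After one simulated second, `yi` ends above `y`: the push propagated through every subsequent step. That responsiveness is exactly what separates simulated cloth from an animated flag, and it's why the interactive version costs more to run.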
This has been an interesting read, but the OP's video was "almost" awesome.
If they allowed me to use a gunship and shoot the hell out of all them little planes.. iiiiiiaaaaahhhhh!!!
Now that would be awesome... I know I am going to dream that tonight. Get some, get some!!!!
Then ask AMD to provide it for you; you are paying enough for their HW.
- but then they dropped their "get in the game" program because they are broke. They don't support the devs.
When Fermi takes back the market share, what do you say then?
"My entire point - which you appear to be missing - is that Nvidia has advanced the gaming state of the art with PhysX, unlike Havok CPU physics, which is stagnation."

I don't disagree with this at all. There may be other players in the game later on, but right now NVIDIA has done, and is still doing, what no one else has been able to. The potential benefits are startlingly obvious now that we're seeing it used as more than a mere gimmick.
A fixed mounted twin .50 cal would be sweet.
Total carnage
Why GPGPU could have a big impact on gaming.
Now look at the screenshots from [H]ard's review of Batman and tell me PhysX doesn't look better. If someone doesn't like it, I suggest they turn it off. But complaining about PhysX is like complaining about how AA lowers your frame rates.
But I think we'll be waiting some time (until 'baseline' hardware has the capabilities required) for any meaningful GPGPU to appear in games, unless someone is brave enough to pull a 'Crysis'.
It's all opinion, isn't it? Well, firstly, Fermi would actually have to appear before anyone makes grand statements about its market success. Personally I've owned cards based around chips from 3DFX, ATI, Matrox, Nvidia, etc., but at the end of the day I still believe that what something could be is totally different from what it ends up being, and quite a few closed systems have proved that in the past. What I buy when I replace a graphics card comes down entirely to what is on the market at the time, not to being in one camp over another, and not to something like PhysX.
I also think a lot of people like glossing over the fact that PhysX isn't an original Nvidia product: Nvidia got it by buying Ageia, who themselves got it from someone else.
To me it is an interesting technology that could quite easily disappear into the haze. For example, I bought UT3, which had PhysX in it, but to be honest I didn't notice a single thing as I flew around the maps. That's an early example, but I don't see the problem changing, especially in certain games.
It is kind of ironic to push an unproven closed technology, and then to add in an unproven chip that has yet to appear to boost the aforementioned technology.
"It's all opinion, isn't it?"

I currently have an HD 4870-X3 TriFire setup and also a GTX 280 with an 8800 GTX for PhysX, and I find it equally ironic that people knock a technology just because they personally don't care for it, whether they have tried it or not.
PhysX was purchased by Nvidia. So what?
Nvidia has been developing it for the last two years, and it usually takes about 2-3 years to get a new technology into a game. I still don't think we have seen games that are "built from the ground up" with PhysX yet. Mirror's Edge used very little of the tech and it looked "added in"; in Cryostasis it became more important; and Batman is the first triple-A title to really use it.
-- there is a lot more of it coming.
Batman is definitely not a triple-A title; it's more like an A title.
And besides, Cryostasis plays like crap; its PhysX effects, such as the water, look like a rolling potato, not even kidding. If that is PhysX bringing graphics to another level, I'd rather have static water effects like BioShock has.
I've been using PhysX for a while, and none of it has really had an impact on PC gaming. There are a limited number of titles that include PhysX, and that still isn't bringing PhysX to the real market.
If you think PhysX works for you, OK... but for most of us, no.
I currently have an 8800 GTX in my rig to run PhysX, but I still haven't seen a single game that actually utilizes it.
Don't say Batman, because it's nothing impressive and it has major slowdowns when PhysX is on. It's totally not worth turning it on just to watch a laggy game.
The power behind this announcement is easy to see: now that Havok can run its APIs easily on either a CPU or GPU, the ability to accelerate physics in the way we had hoped would come about with technology like AGEIA PhysX is much more attractive. While today PhysX only runs on NVIDIA GPUs, OpenCL products will run on AMD and NVIDIA GPUs as well as Intel and AMD processors, enabling heterogeneous computing algorithms across both product lines. This also makes it much more likely that developers will take the additional time to program for such accelerated physics, as the install base will have essentially doubled.
As for the "effects physics versus gameplay physics" debate, the OpenCL implementation of Havok's APIs might help here as well. The software developer will be able to query the system it is running on to determine how much processing power it actually has (based on predefined standards from an OpenCL "host") and adapt the algorithms accordingly. If a gamer has a slower CPU but a really fast GPU, for example, the physics models could be increased dramatically; if a user has a high-end CPU but a lower-end GPU, the same algorithms could be run, just slightly slower, and likely adapted down in quality.
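The query-and-adapt scheme described there reduces to a small decision function: inspect the devices, run the same algorithms on the stronger one, and scale the detail to its budget. A sketch with invented tiers and throughput numbers (the real values would come from OpenCL device queries such as `clGetDeviceInfo`, not hard-coded thresholds):

```python
# Hypothetical adaptive-quality picker: same physics algorithms,
# scaled to whichever device (CPU or GPU) has the larger compute budget.
# Thresholds and GFLOPS figures are invented for illustration.

def pick_physics_detail(cpu_gflops, gpu_gflops):
    """Choose a device and a detail level for the physics workload."""
    device = "gpu" if gpu_gflops >= cpu_gflops else "cpu"
    budget = max(cpu_gflops, gpu_gflops)
    if budget >= 500:
        level = "high"      # dense debris, fine cloth meshes
    elif budget >= 50:
        level = "medium"
    else:
        level = "low"       # identical algorithms, fewer particles
    return device, level

# Slow CPU + fast GPU: crank the models up and run them on the GPU.
print(pick_physics_detail(40, 800))   # ('gpu', 'high')
# Fast CPU + weak GPU: same code path on the CPU, scaled down.
print(pick_physics_detail(120, 30))   # ('cpu', 'medium')
```

Because only the particle counts and mesh resolutions change, gameplay-relevant physics stays consistent across machines, which is the install-base argument the excerpt is making.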
In the last few months, while reading here on [H], I have noticed a trend regarding GPUs.
People seem to think they are "segmented".
What do I mean by that?
Simple: if a GPU has GPGPU features, people whine that those features won't do squat for gaming.
That is, however, not the case.
Look at this video:
http://www.youtube.com/watch?v=Z-gpwCspxi8
It's CUDA... you know, the stuff (some) gamers whine "won't help them in games in any way".
I would like to see this attempted on a CPU.
And remember: everything visual in games is either based on physics... or on approximations of physics.
As I said, it's all opinion. What are you using in your PC as your primary GPU?
I use a GTX 280 as my primary display card (in my Nvidia PC) and an 8800 GTX as my PhysX GPU, and Cryostasis runs OK.