Why GPGPU could have a big impact on gaming.

There's real potential here. Most games today *are* simulations, and to feel and look right they need to be treated as such by the computer. There isn't really a 'good enough' in this industry; or we haven't reached it yet anyway.

The API is not owned by any of the hardware vendors. All code is proprietary if you really want to nit-pick - x86 code will not execute on a PowerPC or Cell processor.
And this is exactly why we design hardware-agnostic 'translation layers' to hide those differences from the programmer. In your example it might be a compiler or a virtual machine, and similar layers are available for GPGPU: OpenCL, BrookGPU (which is definitely not proprietary - it's BSD-licensed and runs through D3D, OpenGL or on the main CPU, about as un-proprietary as it gets), and DirectCompute (the most proprietary of them). This stuff is poised to make it big in the next couple of years; it's been out long enough that it's stable, most of the hardware out there supports it, and there's plenty of example code and expertise around now.
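For anyone who hasn't touched it, here's roughly what that hardware-agnostic layer looks like in practice: a minimal OpenCL vector-add sketch (error checking stripped out, build steps assumed, nothing vendor-specific). The point is that the same kernel source gets compiled at runtime for whatever device the driver exposes - NVIDIA GPU, AMD GPU, or plain CPU.

```cpp
// Minimal OpenCL "hello GPGPU" sketch: the same kernel source is JIT-compiled
// for whatever device the installed driver exposes, which is the vendor-agnostic
// layer described above. Build (Linux, assuming an OpenCL SDK): g++ vecadd.cpp -lOpenCL
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSrc =
    "__kernel void vec_add(__global const float* a,"
    "                      __global const float* b,"
    "                      __global float* c) {"
    "    size_t i = get_global_id(0);"
    "    c[i] = a[i] + b[i];"
    "}";

int main() {
    const size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    // Ask for the default device type: a GPU if present, otherwise the CPU.
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, nullptr);

    cl_int err;
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, &err);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);  // compiled for this device
    cl_kernel k = clCreateKernel(prog, "vec_add", &err);

    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), a.data(), &err);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), b.data(), &err);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, n * sizeof(float), nullptr, &err);

    clSetKernelArg(k, 0, sizeof(cl_mem), &da);
    clSetKernelArg(k, 1, sizeof(cl_mem), &db);
    clSetKernelArg(k, 2, sizeof(cl_mem), &dc);

    size_t global = n;
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &global, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, n * sizeof(float), c.data(), 0, nullptr, nullptr);

    printf("c[0] = %f (expect 3.0)\n", c[0]);
    return 0;
}
```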

I'm curious how a heavy GPGPU application would affect the resources available for good old rendering, and how this might be tuned by an application developer to work well on both low and high end hardware.
 
But in order to be able to interact with it, it must be dynamic and thus must be simulated. The Source engine has a system, separate from Havok, for displaying seemingly complex physics, but those effects are non-interactive. When gameplay demands interaction, only dynamic physics simulation will suffice. It has little to do with accuracy.

Technically speaking, any non-interactive game physics is simply animation.

PhysX offers little in the way of interaction. Many developers have flat-out turned down PhysX simply because making the effects relevant to the game world causes a big slowdown in processing.

Gabe at Valve speaks about this in several interviews. Sure, you can calculate an explosion with PhysX and get the pieces to bounce about, but how do you get the AI to interact with that? How do you make decisions in the game logic based on the results of the PhysX simulation? You need to send a load of data back through the PCI Express bus to the CPU, and in that respect it's a physics decelerator: making the CPU wait for physics results basically doesn't work.
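To make the "waiting for results" concern concrete, here's a toy frame loop in plain C++ (no real GPU is involved - the physics step is just a stand-in function run on another thread, and all the numbers are made up). It contrasts blocking on this frame's results with the usual workaround of consuming last frame's results, which is why gameplay logic tends to see accelerated physics one frame late.

```cpp
// Toy illustration of the "decelerator" argument: the difference between a
// frame loop that stalls waiting for the current simulation step and one that
// pipelines the work and reads the previous frame's results instead.
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>
#include <vector>

struct BodyState { float x, y, z; };

// Stand-in for an off-CPU physics step that takes a few milliseconds.
std::vector<BodyState> simulate_step(std::vector<BodyState> bodies) {
    std::this_thread::sleep_for(std::chrono::milliseconds(5));
    for (auto& b : bodies) b.y -= 0.1f;   // pretend gravity
    return bodies;
}

int main() {
    std::vector<BodyState> state(500, BodyState{0.f, 10.f, 0.f});

    // Naive loop: game logic and AI stall every frame waiting for results.
    for (int frame = 0; frame < 3; ++frame) {
        state = simulate_step(state);             // CPU waits here
        // ...AI runs against fully up-to-date physics...
    }

    // Pipelined loop: kick off the next step, use last frame's results now.
    auto pending = std::async(std::launch::async, simulate_step, state);
    for (int frame = 0; frame < 3; ++frame) {
        // ...AI and gameplay run against 'state', which is one frame old...
        state = pending.get();                    // collect the finished step
        pending = std::async(std::launch::async, simulate_step, state);
    }
    printf("final y of body 0: %f\n", state[0].y);
    return 0;
}
```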

Look at something like the physics engine in Ghostbusters: you blow a room to pieces and the AI can still dynamically navigate through the carnage so they don't get stuck. Ghostbusters does its physics on the CPU and makes use of multiple cores.

Dynamic pathfinding is a brilliant step forward, quite frankly. We've put up with bad pathfinding for years, where physics objects would just be invisible to the AI and they'd keep running into them over and over.
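As a rough sketch of how that kind of obstacle awareness can work (the grid, cell size, and Debris struct here are invented for illustration - this isn't how any particular engine does it): once a piece of debris from the physics sim comes to rest, stamp its footprint into the AI's walkability grid so the pathfinder routes around it instead of treating it as invisible.

```cpp
// Minimal sketch: settled physics debris gets stamped into a navigation grid
// that the pathfinder consults, so AI stops walking into "invisible" objects.
#include <cmath>
#include <vector>

struct Debris { float x, z, radius; bool at_rest; };

class NavGrid {
public:
    NavGrid(int w, int h, float cell) : w_(w), h_(h), cell_(cell), blocked_(w * h, false) {}

    void stamp(const Debris& d) {
        int cx = int(d.x / cell_), cz = int(d.z / cell_);
        int r  = int(std::ceil(d.radius / cell_));
        for (int z = cz - r; z <= cz + r; ++z)
            for (int x = cx - r; x <= cx + r; ++x)
                if (x >= 0 && x < w_ && z >= 0 && z < h_)
                    blocked_[z * w_ + x] = true;   // A* will now avoid these cells
    }

    bool walkable(int x, int z) const { return !blocked_[z * w_ + x]; }

private:
    int w_, h_;
    float cell_;
    std::vector<bool> blocked_;
};

// Called each frame with whatever the physics engine reports.
void update_navigation(NavGrid& grid, const std::vector<Debris>& debris) {
    for (const auto& d : debris)
        if (d.at_rest) grid.stamp(d);   // only settled pieces affect pathfinding
}

int main() {
    NavGrid grid(64, 64, 0.5f);                        // 32m x 32m area, 0.5m cells
    std::vector<Debris> debris = { {10.f, 12.f, 1.5f, true} };
    update_navigation(grid, debris);
    // a pathfinder would now call grid.walkable(x, z) while expanding nodes
    return 0;
}
```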
 
PhysX offers little in the way of interaction. Many developers have flat-out turned down PhysX simply because making the effects relevant to the game world causes a big slowdown in processing.

I would like to see documentation for that, as PhysX is the most USED physics API today.

Gabe at Valve speaks about this in several interviews. Sure, you can calculate an explosion with PhysX and get the pieces to bounce about, but how do you get the AI to interact with that? How do you make decisions in the game logic based on the results of the PhysX simulation? You need to send a load of data back through the PCI Express bus to the CPU, and in that respect it's a physics decelerator: making the CPU wait for physics results basically doesn't work.

That works a hell of a lot better than using the CPU for physics... look at "The Force Unleashed"... or "Ghostbusters"... rigid-body physics that goes away after 10 seconds... not "suspension of disbelief".

Look at something like the physics engine in Ghostbusters: you blow a room to pieces and the AI can still dynamically navigate through the carnage so they don't get stuck. Ghostbusters does its physics on the CPU and makes use of multiple cores.

And it's still a piss-poor "pseudo" fix... fragments, like I stated before, disappear into thin air after 10 seconds... why must people always omit the facts when bashing PhysX?
Dynamic pathfinding is a brilliant step forward, quite frankly. We've put up with bad pathfinding for years, where physics objects would just be invisible to the AI and they'd keep running into them over and over.
 
PhysX has potential; it's just that it hasn't been met yet.

Take the new Batman game, for example: the effects have been totally exaggerated. You could do something very similar with little performance hit without PhysX. Sure, it wouldn't interact as well or look as good, but it would work.

Take the screenshot with the smoke/fog: it would probably move more realistically around the character with PhysX, but I have seen plenty of games with smoke/fog - how hard would it have been to implement that into the game?

It is basically pointing out that PhysX gets you this smoke and without it you can't come close, which is so biased. They could have created smoke that might not have interacted with the character the way PhysX smoke can but still had the effect; instead it's smoke with PhysX and none without.

I guess that is what TWIMTBP gets you?
 
PhysX has potential; it's just that it hasn't been met yet.

Take the new Batman game, for example: the effects have been totally exaggerated. You could do something very similar with little performance hit without PhysX. Sure, it wouldn't interact as well or look as good, but it would work.

I keep hearing this... and it makes no sense.
If it's that easy... why don't we see games doing it?
 
I keep hearing this... and it makes no sense.
If it's that easy... why don't we see games doing it?

Because it seems the TWIMTBP tag stops it.

I know the PhysX effects will be more realistic with less performance hit, and I accept that, but that doesn't mean similar effects can't be done.

I guess the latest Batman debacle shows the developers do have a bias towards TWIMTBP, given that they let Nvidia create AA support for the game that doesn't work with ATI. Yes, ATI could have created their own, but the developers should never have agreed to terms where one company gets an advantage over another; it's all because Nvidia sponsored the game. Basically, if it had been a neutral game this never would have happened.
 
I keep hearing this... and it makes no sense.
If it's that easy... why don't we see games doing it?
Wait, what? Games have been rendering cloth, steam, destructible environments, and sparks for over a decade.
 
Because it seems the TWIMTBP tag stops it.

More FUD...

I know the PhysX effects will be more realistic with less performance hit, and I accept that, but that doesn't mean similar effects can't be done.

Actually, it does.
The same type of calculations (SIMD) used for physics is also used for other calculations; take a look here:
http://golubev.com/about_cpu_and_gpu_2_en.htm

And performance doesn't scale linearly with more cores on a CPU, either.
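For anyone unsure what "SIMD" refers to here, a tiny illustration (SSE intrinsics, nothing physics-engine specific, and the timestep value is arbitrary): the same 4-wide instructions that integrate positions and velocities are the ones used for plenty of other number crunching.

```cpp
// What "SIMD" means in this context: one SSE instruction operates on four
// floats at once, whether those floats are physics state or anything else.
#include <xmmintrin.h>   // SSE intrinsics
#include <cstdio>

int main() {
    alignas(16) float pos[4] = {1.f, 2.f, 3.f, 4.f};
    alignas(16) float vel[4] = {0.1f, 0.2f, 0.3f, 0.4f};
    alignas(16) float out[4];

    __m128 p  = _mm_load_ps(pos);
    __m128 v  = _mm_load_ps(vel);
    __m128 dt = _mm_set1_ps(1.0f / 60.0f);                  // one 60 Hz timestep
    _mm_store_ps(out, _mm_add_ps(p, _mm_mul_ps(v, dt)));    // pos += vel * dt, 4 lanes at once

    printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```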

I guess the latest Batman debacle shows the developers do have a bias towards TWIMTBP, given that they let Nvidia create AA support for the game that doesn't work with ATI. Yes, ATI could have created their own, but the developers should never have agreed to terms where one company gets an advantage over another; it's all because Nvidia sponsored the game. Basically, if it had been a neutral game this never would have happened.

You mean like the HDR in FarCry 1.3?
Or DX10.1 in HAWX?
This happens all the time... people just seem to omit facts, forget facts, or distort facts when it comes to physics processing.

Comparing limited, simple rigid-body physics that disappears after 10 seconds to full-fledged interactive smoke is mind-bogglingly off target.
 
What e-geek is saying is that smoke, cloth, and flying pieces can all be done without PhysX, albeit "animated" and not realistic. But instead of putting in non-interactive fog they put in NO fog, to make the difference look more dramatic. Why not have both?
 
The fact is, while GPGPU will have an effect, it seems to be talked about way too much, especially by those in the Nvidia love camp, almost as if that makes up for the lack of any actual product.

In reality, a bad/good game will neither lose/gain much by having such a feature, because in the end a good game is a good game, which for many has always been about playability. In fact, some of the best-loved games are rarely the most impressive in graphics terms.

The most important word in this whole topic is the third word of the topic name....'could'.
 
No interaction... it's stagnation.
Doesn't matter, it meets the proposed requirements. It's very simple to provide excellent, realistic environmental effects because real physics is easy to predict and model. You can't constantly change the variables when I prove a point just because it disregards yours; otherwise I won't waste my time and you can continue to have a PhysX pity party.
 
The fact is, while GPGPU will have an effect, it seems to be talked about way too much, especially by those in the Nvidia love camp, almost as if that makes up for the lack of any actual product.

Oh, that will change, mark my words... just wait until AMD stops dicking around and delivers with Bullet physics... suddenly GPGPU will be IMPORTANT, it will matter... it will benefit games, etc... right now it's just "sour grapes" posting going on.

In reality, a bad/good game will neither lose/gain much by having such a feature, because in the end a good game is a good game, which for many has always been about playability. In fact, some of the best-loved games are rarely the most impressive in graphics terms.

The most important word in this whole topic is the third word of the topic name....'could'.

Games have progressed in the time they have been around.
Before, the problem was that we had nowhere near the performance required.
Hell, just faking a dead 3D world is still like pulling teeth on even a modern system.
And that is just visuals... GPGPU physics is a power-hungry beast... but it is also the closest approximation to reality we have so far.

So unless you long for the days of Pong, you have to realize that physics is here to stay.

NVIDIA, AMD, Intel... they know it too...
 
The fact is, while GPGPU will have an effect, it seems to be talked about way too much, especially by those in the Nvidia love camp, almost as if that makes up for the lack of any actual product.

In reality, a bad/good game will neither lose/gain much by having such a feature, because in the end a good game is a good game, which for many has always been about playability. In fact, some of the best-loved games are rarely the most impressive in graphics terms.

The most important word in this whole topic is the third word of the topic name....'could'.

Agreed, AMD *could* have workable physics in their games, but they won't get off their lazy asses.

The only people who talk negatively about PhysX are those on ATi graphics
:rolleyes:
 
Sounds like you're the only one with sour grapes. Seriously, if PhysX were as beneficial as you think it is, why are you one of the only ones defending it? Why would this thread need to exist? Didn't you post a link showing nVidia cards holding the vast majority of the market? If GPU PhysX is so prevalent, and so many people own PhysX-capable cards, why do you still need to try to convince people? Shouldn't the PhysX 'revolution' be self-sustaining by now?

Now, if you step out of ATECH fantasy land and into the real world, you'll see the truth. The PhysX API is widely used, yes, but only in software. GPU PhysX has not been widely adopted, and many of the games that do have it have exaggerated effects, as has already been pointed out. Sadly for you, that doesn't appear to be changing. AMD can't supply their latest generation fast enough; their market share is growing, nVidia's shrinking. If nVidia were smart, they'd drop the prices of their current cards. Sure, they may take a loss, but they'd also lose far less market share, which means everything if you're trying to swing the market toward a proprietary PhysX solution.

I don't think anyone is against PhysX; I think people are against a closed, non-standardized solution. Yes, that's right: we want to have our cake and eat it too, and why not? We're paying enough for the hardware.
 
Sounds like you're the only one with sour grapes. Seriously, if PhysX were as beneficial as you think it is, why are you one of the only ones defending it? Why would this thread need to exist? Didn't you post a link showing nVidia cards holding the vast majority of the market? If GPU PhysX is so prevalent, and so many people own PhysX-capable cards, why do you still need to try to convince people? Shouldn't the PhysX 'revolution' be self-sustaining by now?

Now, if you step out of ATECH fantasy land and into the real world, you'll see the truth. The PhysX API is widely used, yes, but only in software. GPU PhysX has not been widely adopted, and many of the games that do have it have exaggerated effects, as has already been pointed out. Sadly for you, that doesn't appear to be changing. AMD can't supply their latest generation fast enough; their market share is growing, nVidia's shrinking. If nVidia were smart, they'd drop the prices of their current cards. Sure, they may take a loss, but they'd also lose far less market share, which means everything if you're trying to swing the market toward a proprietary PhysX solution.

I don't think anyone is against PhysX; I think people are against a closed, non-standardized solution. Yes, that's right: we want to have our cake and eat it too, and why not? We're paying enough for the hardware.
Then ask AMD to provide it for you; you are paying enough for their HW.
- but then, they dropped their "Get in the Game" program because they are broke. They don't support the devs.

When Fermi takes back the market share, what will you say then?
 
Gabe at Valve speaks about this in several interviews. Sure, you can calculate an explosion with PhysX and get the pieces to bounce about, but how do you get the AI to interact with that? How do you make decisions in the game logic based on the results of the PhysX simulation? You need to send a load of data back through the PCI Express bus to the CPU, and in that respect it's a physics decelerator: making the CPU wait for physics results basically doesn't work.
I'd like to see the numbers to back this sort of statement up before I assume it's accurate. I do find it somewhat difficult to believe that PhysX-accelerated destructible environments, to use that as an example, would cause such severe slowdowns because of limitations in PCIe's bandwidth or latency (of which there is very little) or waiting for the CPU to process returned data (when your average game may utilize only 40% of a quad-core CPU on average).

Yes, when the complexity in a scene is very high, with hundreds of active rigid or soft bodies attempting to interact with multiple AIs, I can see how it can be an issue, but under typical circumstances? That I find a tad suspect.
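A rough back-of-envelope calculation supports that skepticism. The figures below are era-appropriate assumptions, not measurements (body count, state layout, and usable PCIe 2.0 bandwidth are all guesses), but they show that the raw bytes are not the bottleneck; if there is one, it's the synchronization stall.

```cpp
// Back-of-envelope: how much rigid-body state would a frame's readback
// actually move across PCIe? All constants are rough assumptions.
#include <cstdio>

int main() {
    const int    bodies         = 1000;                        // a fairly busy scene
    const int    bytes_per_body = 3*4 + 4*4 + 3*4 + 3*4;       // pos + quat + lin vel + ang vel (floats)
    const double pcie2_x16_gbs  = 8.0;                         // ~8 GB/s usable, one direction
    const double frame_time_ms  = 1000.0 / 60.0;               // 60 fps budget

    double kb_per_frame = bodies * bytes_per_body / 1024.0;
    double transfer_ms  = (bodies * bytes_per_body) / (pcie2_x16_gbs * 1e9) * 1000.0;

    printf("readback per frame: %.1f KB\n", kb_per_frame);     // ~51 KB
    printf("raw transfer time:  %.4f ms of a %.1f ms frame\n",
           transfer_ms, frame_time_ms);                        // a handful of microseconds
    printf("(the real cost, if any, is the sync/stall, not the bytes)\n");
    return 0;
}
```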

UT2003 (UE2) had fully-simulated cloth used for flags and banners and whatnot. That was around seven years ago.
 
UT2003 (UE2) had fully-simulated cloth used for flags and banners and whatnot. That was around seven years ago.

It looks like it was done 7 years ago - and on a single core, also.
--- If you have not experienced PhysX, I find it difficult to see how you can comment on it with any authority.

In 7 years there has been little to no progress on the CPU physics front; Havok physics has not improved much since Painkiller :p
- Intel and AMD keep *talking* about CPU-accelerated physics... but that is all talk, so far.
 
UT2003 (UE2) had fully-simulated cloth used for flags and banners and whatnot. That was around seven years ago.
I don't exactly recall that running with playable FPS. I remember seeing the tech demos and they were closer to slide shows than playable.
 
If you have not experienced PhysX, I find it difficult to see how you can comment on it with any authority.
Who said I hadn't?

I don't exactly recall that running with playable FPS. I remember seeing the tech demos and they were closer to slide shows than playable.
They were used in a handful of the stock maps. They didn't interact with the player, but they were dynamically-simulated.
 
Who said I hadn't?


They were used in a handful of the stock maps. They didn't interact with the player, but they were dynamically-simulated.

No one. You are just making a terrible comparison of non-interactive CPU physics with interactive PhysX. There is no comparison. Just contrast.
 
No one. You are just making a terrible comparison of non-interactive CPU physics with interactive PhysX.
I made no comparison. I suggest you try to follow the discussion more adequately.

There is no comparison. Just contrast.
I've really had no quarrel with anything you've said in your past posts, but this statement seems to solidify the ridiculousness of your apparent position on PhysX. Of course there is room for comparison.

Simple example: non-player-interactive cloth physics simulation versus player-interactive cloth simulation. The latter is superior and more believable. The latter was not realistically feasible on older-generation hardware. They are both, however, cloth physics simulations. Simple, very direct comparison, yes?
 
I made no comparison. I suggest you try to follow the discussion more adequately.


I've really had no quarrel with anything you've said in your past posts, but this statement seems to solidify the ridiculousness of your apparent position on PhysX. Of course there is room for comparison.

Simple example: non-player-interactive cloth physics simulation versus player-interactive cloth simulation. The latter is superior and more believable. The latter was not realistically feasible on older-generation hardware. They are both, however, cloth physics simulations. Simple, very direct comparison, yes?
Sure, like Pong is a video game and Crysis is a video game
- I can do "comparisons" like you do, also. :p

My entire point - which you appear to be missing - is that Nvidia has advanced the gaming state of the art with PhysX, unlike Havok CPU physics, which is stagnation.
-- I call it a contrast.
 
This has been an interesting read, but the OP's video was "almost" awesome.

If they allowed me to use a gunship and shoot the hell out of all them little planes.. iiiiiiaaaaahhhhh!!!

Now that would be awesome... I know I am going to dream about that tonight. Get some, get some!!!!
 
This has been an interesting read, but the OP's video was "almost" awesome.

If they allowed me to use a gunship and shoot the hell out of all them little planes.. iiiiiiaaaaahhhhh!!!

Now that would be awesome... I know I am going to dream about that tonight. Get some, get some!!!!

A fixed mounted twin .50 cal would be sweet.

Total carnage :D
 
Then ask AMD to provide it for you; you are paying enough for their HW.
- but then, they dropped their "Get in the Game" program because they are broke. They don't support the devs.

When Fermi takes back the market share, what will you say then?

Unless GPU PhysX is widely adopted, I'll say the same thing. If, all of a sudden, a whole bunch of A+ titles start using hardware PhysX when Fermi is released and the price of the card is reasonable, my next card may indeed be nVidia. Chances of that happening are slim to none. Hardware PhysX has been around for years now, and games that support it are few and far between, with GOOD games being even fewer than that. You think Fermi is suddenly going to change that, all the while nVidia is losing market share? Not likely. nVidia's position is weaker now than it has been since the GeForce FX days. If they couldn't get hardware PhysX widely adopted when they were truly in a dominant position, they surely aren't going to be able to pull it off now.
 
My entire point - which you appear to be missing - is that Nvidia has advanced the gaming state of the art with PhysX, unlike Havok CPU physics, which is stagnation.
I don't disagree with this at all. There may be other players in the game later on, but right now, NVIDIA's done, and is still doing, what no one else has been able to. The potential benefits are startlingly obvious now that we're seeing it used as more than a mere gimmick.
 
A fixed mounted twin .50 cal would be sweet.

Total carnage :D

YES!!!! All the GPGPU would not be enough to save them from a fixed mounted twin .50 cal!!!

Man, if they added that feature and sold it to NVIDIA as a freebie for their video cards, I would buy another GTX 260 just for that. :D:D
 
Why GPGPU could have a big impact on gaming.

Let's hope so. :) DX11 at least uses the computational resources much more, with compute shaders.

I don't think people usually have gaming in mind when they talk about "general-purpose computing on graphics processing units (GPGPU)". For most, I believe, GPGPU is more about using the GPU for things other than gaming.

If you ask, "Will the GPU have a big impact on gaming?" I think most will answer: yes, it already does.


Now look at the screenshots from [H]ard's review of Batman and tell me the physics doesn't look better. If someone doesn't like it, I suggest they turn it off. But complaining about PhysX is like complaining about how AA lowers your frame rates.

[Screenshots from the [H] Batman: Arkham Asylum review]

What does this prove? That Nvidia's PhysX is too crappy to deliver smoke and fog effects on the CPU, so they turned them off? I think Tom's Hardware agrees with you there. It seems that with PhysX turned on, the game uses less of the CPU even when the GPU isn't handling PhysX; PhysX just can't be specially optimized for the CPU:
http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-10.html

Stalker's X-Ray engine is obviously much better at this on the CPU (1.26 and 2.10), but they are not using PhysX, so that's probably why it's better:
http://www.youtube.com/watch?v=ykwoD_eH8zs
 
But I think we'll be waiting some time (until 'baseline' hardware has the capabilities required) for any meaningful GPGPU to appear in games, unless someone is brave enough to pull a 'Crysis'.

Which will be Crytek with Crysis 2. And even then it will be influenced by console limitations. Everyone knows where the money is, and LOLPIRACY isn't.
 
Then ask AMD to provide it for you; you are paying enough for their HW.
- but then, they dropped their "Get in the Game" program because they are broke. They don't support the devs.

When Fermi takes back the market share, what will you say then?

Well, firstly, Fermi would actually have to appear before anyone makes grand statements about its market success. Personally, I've owned cards based around chips from 3dfx, ATI, Matrox, Nvidia, etc., but at the end of the day I still believe what something could be is totally different from what it ends up being, and there are quite a few closed systems that have proved that in the past. What I buy when I replace a graphics card is totally down to what is on the market at the time, not because I'm in one camp over another, and not because of something like PhysX.

I also tend to think a lot of people like glossing over the fact that PhysX isn't an original Nvidia product; Nvidia got it by buying Ageia, who themselves got it from someone else.

To me it is an interesting technology that could quite easily disappear into the haze. For example, I bought UT3, which had PhysX in it, but to be honest I didn't notice a single thing as I flew around the maps. While that was an early example, I don't see that problem changing, especially in certain games.

It is kind of ironic that people are trying to push an unproven closed technology, and then adding in an unproven chip that has yet to appear to boost the aforementioned technology.
 
Well, firstly, Fermi would actually have to appear before anyone makes grand statements about its market success. Personally, I've owned cards based around chips from 3dfx, ATI, Matrox, Nvidia, etc., but at the end of the day I still believe what something could be is totally different from what it ends up being, and there are quite a few closed systems that have proved that in the past. What I buy when I replace a graphics card is totally down to what is on the market at the time, not because I'm in one camp over another, and not because of something like PhysX.

I also tend to think a lot of people like glossing over the fact that PhysX isn't an original Nvidia product; Nvidia got it by buying Ageia, who themselves got it from someone else.

To me it is an interesting technology that could quite easily disappear into the haze. For example, I bought UT3, which had PhysX in it, but to be honest I didn't notice a single thing as I flew around the maps. While that was an early example, I don't see that problem changing, especially in certain games.

It is kind of ironic that people are trying to push an unproven closed technology, and then adding in an unproven chip that has yet to appear to boost the aforementioned technology.
It's all opinion, isn't it?

I currently have HD 4870-X3 TriFire and also a GTX 280 with an 8800GTX for PhysX, and I find it ironic that people knock a technology just because they personally don't care for it, whether they have tried it or not.

PhysX was purchased by Nvidia. So what?
:confused:
Nvidia has been developing it for the last two years; it usually takes about 2-3 years to get a new technology into a new game. I still don't think we have seen games that are really "built from the ground up" with PhysX yet. Mirror's Edge used very little of the tech and it looked "added in"; in Cryostasis it became more important, and Batman is the first triple-A title to really use it.
-- There is a lot more of it coming.
 
It's all opinion, isn't it?

I currently have HD 4870-X3 TriFire and also a GTX 280 with an 8800GTX for PhysX, and I find it ironic that people knock a technology just because they personally don't care for it, whether they have tried it or not.

PhysX was purchased by Nvidia. So what?
:confused:
Nvidia has been developing it for the last two years; it usually takes about 2-3 years to get a new technology into a new game. I still don't think we have seen games that are really "built from the ground up" with PhysX yet. Mirror's Edge used very little of the tech and it looked "added in"; in Cryostasis it became more important, and Batman is the first triple-A title to really use it.
-- There is a lot more of it coming.

Batman is definitely not a triple-A title; it's more like an A title.

And besides, Cryostasis plays like crap; their PhysX effects, such as the water, look like rolling potatoes, not even kidding... If you think that's PhysX bringing graphics to another level, I'd rather have a static water effect like BioShock has.

I've been using PhysX for a while, and none of it has really had an impact on PC gaming. There are a limited number of titles that include PhysX, and that still isn't bringing PhysX to the real market.

If you think PhysX works for you, OK... but for most of us, NO.
I currently have an 8800 GTX in my rig to run PhysX, but I still haven't seen a single game that actually utilizes it.
Don't say Batman, because it's nothing impressive and it has major slowdowns when PhysX is on. It's totally not worth turning it on just to see a laggy game.
 
Batman is definitely not a triple-A title; it's more like an A title.

And besides, Cryostasis plays like crap; their PhysX effects, such as the water, look like rolling potatoes, not even kidding... If you think that's PhysX bringing graphics to another level, I'd rather have a static water effect like BioShock has.

I've been using PhysX for a while, and none of it has really had an impact on PC gaming. There are a limited number of titles that include PhysX, and that still isn't bringing PhysX to the real market.

If you think PhysX works for you, OK... but for most of us, NO.
I currently have an 8800 GTX in my rig to run PhysX, but I still haven't seen a single game that actually utilizes it.
Don't say Batman, because it's nothing impressive and it has major slowdowns when PhysX is on. It's totally not worth turning it on just to see a laggy game.

As I said, it's all opinion. What are you using in your PC as your primary GPU?

I use a GTX 280 as my primary display (in my Nvidia PC) and an 8800GTX as my PhysX GPU, and Cryostasis runs OK.
 
I lol at the people saying PhysX needs to be more open, while praising DirectX. Imagine if all your 3D games ran on essentially any system... They should make some sort of open graphics library that would enable that.


Generally speaking, I think realistic physics will help move things forward. Remember when you were just a dot in a video game? Remember when they added sprites so that you actually looked like a little guy? Remember when they went to actual 3D models? Remember when they went from being a few 3D cubes with a picture of a guy stuck on it to an actual skeletal model? Remember when they went from being a 3D model of a stick figure to actually having modeled details like hands and fingers? Realistic physics will move us further forward. The more accurately we can model the molecular interactions of the real world, the more realistic the resulting animations will be. In my eyes, our goal is the photorealistic renderings that are done on server farms for movies (or even better than that), but being able to do that dynamically in realtime as you play your game.


We're still chicken & egg with PhysX. Nobody wants to require it for a game because it's not supported by a lot of systems (ATI still has about 1/3 marketshare). It's not fully supported because nothing requires it.

If a huge game like Halo or WoW came out which required PhysX to fully enjoy the game, I bet ATI would reconsider using PhysX and/or be hard at work on a competitive option to it. I'm not talking fog that swirls around you instead of just being there, or curtains that flap really realistically. I'm talking about core mechanics of the game being much much better with, or even completely relying on, PhysX.

However, I see an OpenCL physics API coming out first. That would inherently support both camps as well as other devices. Devs will be much more supportive of something that may not run as well on certain hardware (but will indeed work), over something that only runs on one specific brand. I think an open physics API will do a lot for improving physics support in games. PhysX is a good start, but an open version will be needed before we see it become a major player.

http://www.pcper.com/comments.php?nid=6954 pretty much matches how I feel.
The power behind this announcement is easy to see - now that Havok can run its APIs easily on either a CPU or GPU, the ability to accelerate physics in a way we had hoped would come about with technology like AGEIA PhysX is much more attractive. While currently PhysX only runs on NVIDIA GPUs, OpenCL products will run on AMD and NVIDIA GPUs as well as Intel and AMD processors, enabling heterogeneous computing algorithms across both product lines. This will also make it much more likely that developers will take the additional time to program for such accelerated physics, as the install base will have essentially doubled.

As for addressing the "effects physics versus gameplay physics" debate, the OpenCL implementation of Havok's APIs might help with this as well. The software developer will be able to query the system it is running on to determine how much processing power it actually has (based on predefined standards from an OpenCL "host") and adapt the algorithms accordingly. If a gamer has a slower CPU but a really fast GPU, for example, the physics models might be able to be increased dramatically; if a user has a higher end CPU but lower end GPU then the same algorithms could be run, just slightly slower, and likely adapted down in quality effects.
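As a rough sketch of the kind of capability query that paragraph describes (plain OpenCL host code; the "physics budget" heuristic at the end is entirely made up for illustration), the game can enumerate devices and scale its simulation settings to whatever compute it finds:

```cpp
// List every OpenCL device on the system and derive a crude physics budget
// from its compute-unit count. Error handling omitted; the scaling rule is
// a placeholder, not a recommendation.
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; ++p) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devices, &num_devices);

        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[128];
            cl_uint compute_units = 0, clock_mhz = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, nullptr);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(compute_units), &compute_units, nullptr);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_CLOCK_FREQUENCY,
                            sizeof(clock_mhz), &clock_mhz, nullptr);

            // Crude "physics budget": scale particle counts by raw compute.
            unsigned particles = 1000u * compute_units;
            printf("%s: %u CUs @ %u MHz -> simulate ~%u debris particles\n",
                   name, compute_units, clock_mhz, particles);
        }
    }
    return 0;
}
```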

Either that or we'll just do the usual and bow to Microsoft's closed, proprietary DirectCompute, so it'll work great on all your hardware, as long as you buy the latest version of Windows (which ends up requiring a hardware upgrade to run smoothly).
 
In the last few months, while reading here on [H], I have noticed a trend regarding GPUs:
people seem to think they are "segmented".

What do I mean by that?
Simple: if a GPU has GPGPU features, people whine that those features won't do squat for gaming.
That is, however, not the case.
Look at this video:
http://www.youtube.com/watch?v=Z-gpwCspxi8

It's CUDA... you know, the stuff (some) gamers whine "won't help them in games in any way".
I would like to see this attempted on a CPU.

And remember... everything visual in games is either based on physics... or on approximations of physics.

Ask and ye shall receive: http://www.dcs.shef.ac.uk/~paul/publications/boids/index.html

Flocking isn't difficult; running it on a GPU is, frankly, a waste. Sure, that applet doesn't have 4,000 birds, but it's also running as a Java applet (and not using much CPU time either). I would not be at all surprised if running that demo with the CPU doing the flocking resulted in higher FPS, due to avoiding the CUDA context-switch hit.
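For reference, this is roughly what a naive CPU boids update looks like (plain C++, brute-force O(n²) neighbour search, all constants arbitrary). A spatial grid is the standard next step if you want thousands of agents at game frame rates, but even the brute-force version is nothing exotic.

```cpp
// Classic separation/alignment/cohesion boids update, CPU-only, naive O(n^2).
#include <cstdio>
#include <vector>

struct Boid { float x, y, vx, vy; };

void step(std::vector<Boid>& flock, float dt) {
    std::vector<Boid> next = flock;
    for (size_t i = 0; i < flock.size(); ++i) {
        float cx = 0, cy = 0, ax = 0, ay = 0, sx = 0, sy = 0;
        int neighbours = 0;
        for (size_t j = 0; j < flock.size(); ++j) {
            if (i == j) continue;
            float dx = flock[j].x - flock[i].x, dy = flock[j].y - flock[i].y;
            float d2 = dx * dx + dy * dy;
            if (d2 > 25.0f) continue;               // only consider nearby boids
            ++neighbours;
            cx += flock[j].x;  cy += flock[j].y;    // cohesion: average position
            ax += flock[j].vx; ay += flock[j].vy;   // alignment: average velocity
            if (d2 < 4.0f) { sx -= dx; sy -= dy; }  // separation: push apart
        }
        if (neighbours) {
            next[i].vx += 0.01f * (cx / neighbours - flock[i].x)
                        + 0.05f * (ax / neighbours - flock[i].vx) + 0.1f * sx;
            next[i].vy += 0.01f * (cy / neighbours - flock[i].y)
                        + 0.05f * (ay / neighbours - flock[i].vy) + 0.1f * sy;
        }
        next[i].x += next[i].vx * dt;
        next[i].y += next[i].vy * dt;
    }
    flock.swap(next);
}

int main() {
    std::vector<Boid> flock(2000);
    for (size_t i = 0; i < flock.size(); ++i)
        flock[i] = { float(i % 63), float(i % 89), 0.1f, 0.0f };
    for (int frame = 0; frame < 60; ++frame) step(flock, 1.0f / 60.0f);
    printf("boid 0 ended at (%.2f, %.2f)\n", flock[0].x, flock[0].y);
    return 0;
}
```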
 
As I said, it's all opinion. What are you using in your PC as your primary GPU?

I use a GTX 280 as my primary display (in my Nvidia PC) and an 8800GTX as my PhysX GPU, and Cryostasis runs OK.

My primary GPU is an ATI HD 5870. (It was a GTX 295 before I upgraded.)

Cryostasis's PhysX effects are just crap, TBH; they're too unrealistic compared to static effects. Like the one I mentioned above: the "rolling potato" style of water.
 