Now that we have the PPU, what is next?

Bloodgod42

If I understand correctly, we started with a CPU doing basically ALL of the computer's processing. Then as we advanced...

-for improved graphics, we designed the GPU to handle video processing.
-for improved audio, and to ease the audio processing performed by the CPU, we have cards like the X-Fi w/64MB.
Last, but not least...
-for improved physics we have the PPU.

These cards were all created out of need: the CPU alone was too limited for each of these tasks, and that was holding back the progress of certain technologies.

My question is this... which CPU task will be the next to get its own dedicated card? What else IS there?!?!

Thanks for any input!
Bloodgod42
 
I read somewhere about someone looking into A.I. hardware...

Terra - But it was a while back...
 
I'd have to say that AI has to be the weakest part of games - not physics or graphics. It lags far behind both.

Recently, I noticed a return to single-player games like Oblivion and Civilization IV. That's a good trend in my opinion. It seemed like 2 years ago the single-player game was almost abandoned. Don't get me wrong, I enjoy BF2, but I like my single-player games too.

The current digital CPU is not particularly well designed for AI. Perhaps a CPU that's not based on 0s and 1s would be better suited to AI.
 
Since all forms of AI nowadays are just elaborate programming, I don't really see why it would be necessary to create specialized hardware for it. Only once we can create true AI will traditional CPUs no longer suffice, and research has a long way to go until then.

The only dedicated hardware that I can think of is that which caters to our senses but doesn't really exist in the mainstream market. We have graphics cards for our sight and soundcards for our ears, but what about our other three senses? We've got 'force-feedback', but that's mostly implemented by controller resistance or vibrations - not true tactile feedback.

Basically, I don't see any new dedicated hardware coming along to offload tasks traditionally handled by the CPU. Rather, any new hardware is likely to be either an interface/networking technology, miniaturization of current technology, or something truly revolutionary that CPUs aren't designed to handle.
 
Yeah, AI isn't really a problem you can just throw more processing power at. All the hard work that goes into making AI good is on the programmer's end. The only practical use for AI acceleration I can see at this point is incredibly large-scale stuff (like Rome:TW, or maybe Oblivion if it continually tracked all the NPCs), but the vast majority of games couldn't even find a use for the hardware.

LordBritish said:
Perhaps a CPU that's not based on 0s and 1s would be better suited to AI.
I think you're right. I've been thinking for a while that to truly simulate human thought we'll need analog computers. How such a thing would work, I have absolutely no idea...
 
Holodecks!

The rise of dual core, and rumors of quad core on the horizon, will hopefully be enough to allow for massive improvements in AI.

However, a massively parallel architecture, like with physics and graphics, that could calculate behavior for all of the different actors would be interesting. The thing is, the PPU and GPU work out well because they implement specific mathematical functions in hardware. AI sits at a very high architectural level - you can't represent human behavior generically with equations. I'm no expert, but I can't see this being implemented at the hardware level.
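
To illustrate the "specific mathematical functions" point, here's a rough sketch (plain Python, with invented particle data) of the kind of fixed, per-element math a PPU or GPU can bake into hardware - every particle's update is independent, so thousands can run in parallel:

```python
# Hypothetical example: semi-implicit Euler integration for a particle system.
# Each particle's update depends only on its own state, so a PPU/GPU can run
# all of them in parallel - the math is fixed, only the data varies.

GRAVITY = -9.81  # m/s^2, acting on the y axis
DT = 1.0 / 60.0  # one 60 Hz frame

def step_particles(particles):
    """particles: list of dicts with 'pos' and 'vel' as (x, y, z) tuples."""
    for p in particles:
        vx, vy, vz = p["vel"]
        vy += GRAVITY * DT                 # same equation for every particle
        x, y, z = p["pos"]
        p["vel"] = (vx, vy, vz)
        p["pos"] = (x + vx * DT, y + vy * DT, z + vz * DT)

# AI has no equivalent single equation to bake into silicon - behavior logic
# branches unpredictably, while physics math doesn't.
```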
 
We're not going to see AI improvements on the hardware front any time soon. AI for games is often limited by the programmer's comprehension far before it is limited by the hardware. AI experiments done by computer scientists outside of gaming often involve large supercomputers, but that research doesn't lend itself to games at all these days.
 
LordBritish said:
The current digital CPU is not particularly well designed for AI. Perhaps a CPU that's not based on 0s and 1s would be better suited to AI.
Don't hold your breath - we would probably need neural nets to finally move away from digital, and we would probably need to start using biological processes of some sort. Binary is hard to escape when you use silicon.

The biggest problem with trying to offload AI would be the sheer amount of processing power needed - more than even a dedicated card could provide. Look at a computer playing chess against a human: it has to use dedicated opening and endgame books. The number of possible moves increases exponentially, and even though computers have been doubling in power, the best human can still beat the best computer.

It would be like taking a 100,000-foot-tall cookie that one person needs to eat in 1 minute and making it only 50,000 feet tall. How much of a difference does that make when you aren't even close to finishing it? AI in games has a similar problem: there are only a couple of conclusions a human would reach in a given situation, but without nearly enough time to compute those decisions, the game character has to go step by step and hope it takes the right path. Doubling the speed of the computer might add a few more looks at possible actions, or even allow one more step into the future, but it means almost nothing.
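
To put rough numbers on the cookie analogy (assuming the commonly cited figure of ~35 legal chess moves per position - purely illustrative):

```python
import math

BRANCHING = 35  # rough average number of legal chess moves per position

def nodes(depth):
    """Positions a brute-force search must examine to look `depth` plies ahead."""
    return BRANCHING ** depth

for d in (4, 5, 6):
    print(f"{d} plies ahead: ~{nodes(d):,} positions")

# Doubling the computer's speed buys log(2)/log(35) ~= 0.19 of one extra ply:
extra_plies = math.log(2) / math.log(BRANCHING)
print(f"2x the hardware = only ~{extra_plies:.2f} additional plies of lookahead")
```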
 
Lunar Wolf said:
I think physics integrated onto the motherboard or video card is next.

Not going to happen any time soon.

Sound cards have barely reached true onboard status - and no, the AC97 codec doesn't count.

GPUs have reached CPU-level TDPs - adding more heat to an already cramped board probably wouldn't be the best idea, which is why all the current GPU physics solutions try to utilize what's already there. You don't see NVIDIA/ATI jumping up and down about adding dedicated physics pipelines to their architectures. Yet.
 
Lunar Wolf said:
I think physics integrated onto the motherboard or video card is next.

Motherboard companies aren't willing to do this, and it would be a terrible idea. Some higher-end motherboards have started including SB Live chips (not that great), but any motherboard implementation would not be top of the line, due to cost. People like being able to upgrade components, and you can't satisfy everyone's needs by putting too many things on a motherboard and charging an arm and a leg.

They already have onboard video, which is nowhere near as fast as the external cards - so why aren't we all buying them? Oh yeah, they're slow as snails.
 
LuminaryJanitor said:
Yeah, AI isn't really a problem you can just throw more processing power at. All the hard work that goes into making AI good is on the programmer's end. The only practical use for AI acceleration I can see at this point is incredibly large-scale stuff (like Rome:TW, or maybe Oblivion if it continually tracked all the NPCs), but the vast majority of games couldn't even find a use for the hardware.


I think you're right. I've been thinking for a while that to truly simulate human thought we'll need analog computers. How such a thing would work, I have absolutely no idea...

One example is a quantum computer, something that utilizes superpositions of states rather than just 0s and 1s. I'm not even sure something like that is in development though :eek:
 
no, "1s and 0s" is only a small issue. The major problem is the inability of researchers to be able to figure out what really makes something intellegent. Even if you used an analog computer, devices which do exist, you still have the problem, you have to figure out or make an assumption on what the process of thinking is.

LordBritish said:
The current digital CPU is not particularly well designed for AI. Perhaps a CPU that's not based on 0s and 1s would be better suited to AI.
 
Yep. The decision process is somewhat calculation-based at a certain level, but there's still a whole lot more to it than simple if-then-else decisions.
 
Emission said:
One example is a quantum computer, something that utilizes superpositions of states rather than just 0s and 1s. I'm not even sure something like that is in development though :eek:
But quantum computers still operate on discrete states. They may be fundamentally different from modern computers, but nowhere near as different as the human mind.
 
Very interesting input everyone, thank you.

I would have to say AI is surely a weak area, but after seeing advances such as in FEAR, I think we will have realistic AI (for games) within 5 years. Our definitions of "realistic AI" will vary here, but if we are talking about games, then I think the mainstream CPUs we have in 5 years (probably 5GHz+ dual core) will be able to provide ample power. You need to take into consideration that realistic behavior in a game is far easier to recreate than "true" human-like intelligence.
 
Bloodgod42 said:
...I think the mainstream CPUs we have in 5 years (probably 5GHz+ dual core) will be able to provide ample power. You need to take into consideration that realistic behavior in a game is far easier to recreate than "true" human-like intelligence.
True, although non-binary computers may be better suited, they're certainly not necessary for the purposes of game AI.

But I think at the moment the limitation on AI is in the code itself, not the hardware it's running on, and I don't expect this will change any time soon.
 
As mentioned, programmers are having a tough time defining how the mind works. It DOES pretty much all come down to the programming.
 
Emission said:
As mentioned, programmers are having a tough time defining how the mind works. It DOES pretty much all come down to the programming.

Or putting 3.7 billion years of instinct and 20+ years of life experience down into software ;)

Terra...
 
Trouble is, you don't want AI that's too clever either, as the computer would start beating you just through speed of 'thought', and there are the continued problems of AI causing unwanted effects - like in the beta testing of Oblivion, where the NPCs were going around killing each other for money because they needed to eat.

That said, if someone could write an easily transportable AI framework (like Havok for physics) with the potential to be hardware accelerated, they would probably become quite rich very quickly. The military would buy it up as much as the games publishers would - just think about AI in that new DARPA challenge :)
 
Better AI doesn't necessarily need to be smarter, just more convincing and less predictable. You certainly don't want something that'll consistently outsmart you, but you need something a bit more dynamic than a handful of scripted actions tied together (or at least something where this isn't so blatantly obvious...).
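
As a rough sketch of what "less predictable, not smarter" could look like in code (the actions and scores below are invented purely for illustration) - score a few candidate actions and sample among them, instead of always executing the single scripted best one:

```python
import math
import random

def choose_action(scores, temperature=0.5):
    """Softmax-sample an action: usually picks a good one, but not the
    same one every time - so the NPC stops being perfectly predictable."""
    names = list(scores)
    weights = [math.exp(scores[n] / temperature) for n in names]
    return random.choices(names, weights=weights, k=1)[0]

# Hypothetical utility scores for an enemy under fire:
scores = {"take_cover": 0.9, "flank_left": 0.7, "suppress": 0.6, "retreat": 0.2}
print(choose_action(scores))
```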

And you're right - middleware is exactly what is needed to drive the development of AI. The kinds of things a dedicated AI developer could do simply wouldn't be worth the time and effort (= $) for a game developer. It would also encourage competition, something which seems to be missing entirely at the moment...
 
"What is next to be taken from the CPU?"

Nothing </world>
 
New CPUs will have two dedicated cores: one to process the 0s, one to process the 1s.
 
Cell processors.

Basically, MMX and SSE suck balls, and that is where the Cell processor comes in. The Cell processor could handle everything - rendering, collision detection, physics, audio, video encoding/decoding, you name it.

It won't be too long before you have a Cell processor in your PC instead of all that specialized hardware.
 
I believe there were recent reports of IBM saying the processor they designed for the PS3 was no better than the 3-core Xbox 360 processor? It has 7 usable Cell cores and a PPC core, last I knew, and it's just about equal? :p
 
Xipher said:
I believe there were recent reports of IBM saying the processor they designed for the PS3 was no better than the 3-core Xbox 360 processor? It has 7 usable Cell cores and a PPC core, last I knew, and it's just about equal? :p

The one used in the PS3 didn't have all of its cells enabled. And you can always improve the cells over time.
 
All the components of the PS3 - everything down to video, network, and I/O interfacing - are handled by groups of cells. If you need more processing power, just throw more cells at it :rolleyes: , at least that's how it's conceptualized.
 
I'd like to see more updated RAM drives from a variety of companies. I would love to have a 16GB RAM drive faster than 300MB/s.
 
Good luck getting the programming...

This is what initially gave developers a scare, and one of the reasons why the release date got pushed back.

I'd like to see more updated RAM drives from a variety of companies. I would love to have a 16GB RAM drive faster than 300MB/s.

Exactly what I also wanted to see. SATA150 is quite a bottleneck for the 3.2+ GB/s possible from an updated version of the card. In a few months or so they should have updated versions with higher-frequency compatibility and higher interface speed. I thought it was an incredible idea, so I REALLY want to see more companies expanding on it.
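
Rough back-of-the-envelope numbers on that bottleneck (assuming a 16GB drive, SATA150's 150MB/s ceiling, and DDR400's 3.2GB/s per channel):

```python
SIZE_GB = 16
SATA150 = 150 / 1024          # GB/s, SATA150's interface ceiling
DDR400 = 3.2                  # GB/s, one DDR400 channel

print(f"Fill 16GB over SATA150: ~{SIZE_GB / SATA150:.0f} s")    # ~109 s
print(f"Fill 16GB at DDR400 speed: ~{SIZE_GB / DDR400:.0f} s")  # ~5 s
```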
 
Emission said:
This is what initially gave developers a scare, and one of the reasons why the release date got pushed back.

Are you saying it's a non-issue now?
 
How about a card that plays the game for you, so redundant fetch quests in RPGs can be done while you sleep. I know it's cheating, but if I have to find another something for someone to get something to give to someone else, I think I'll need some Thorazine and one of those funny jackets.

In all seriousness, it would be nice to have a card for voice recognition. That, I imagine, would benefit from a specifically designed processor.
 
Iratus said:
Trouble is, you don't want AI that's too clever either, as the computer would start beating you just through speed of 'thought', and there are the continued problems of AI causing unwanted effects - like in the beta testing of Oblivion, where the NPCs were going around killing each other for money because they needed to eat.

For me, the best way to make realistic AI is to have them make logical mistakes. That includes giving them senses to process as well. Throw a grenade in the bushes and they react to the sound instead of simply grenade = investigate, or spray-paint something resembling your silhouette on a wall and have them not realize it until they're close enough to notice it's paint.

Of course, I have no freaking clue how you're supposed to code that without using fixed (e.g. scripted) parameters.
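
(For what it's worth, one common non-scripted approach is a stimulus model: the grenade emits a sound event, and the NPC only reacts to what actually reaches its senses. A minimal sketch, with all names and thresholds invented:

```python
import math

def hears(npc_pos, stim_pos, loudness, hearing=1.0):
    """Sound falls off with distance; the NPC reacts to what it perceives,
    not to the fact that 'a grenade exists'."""
    dist = math.dist(npc_pos, stim_pos)
    perceived = loudness / (1.0 + dist * dist)
    return perceived * hearing > 0.05  # hypothetical reaction threshold

# Grenade lands in the bushes 3m away: the NPC investigates the *noise*,
# with no idea yet that it is a grenade.
if hears((0, 0), (3, 0), loudness=1.0):
    print("NPC: what was that sound? (moves to investigate)")
```

The NPC then makes "logical mistakes" for free, because it only ever acts on partial information.)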

The Oblivion behavior you mentioned is a step in the right direction, though. It's perfectly logical AI - they just had to raise the Ethics parameter a few notches :D

That said, if someone could write an easily transportable AI framework (like Havok for physics) with the potential to be hardware accelerated, they would probably become quite rich very quickly. The military would buy it up as much as the games publishers would - just think about AI in that new DARPA challenge :)

You do realize the potential if you could refine the pathfinding AIs you see in games and simply release tank drones into another country, right? Some CS bots don't use waypoints and simply figure out the map themselves - very heavy on the processor, but the potential is incredible.
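
To give an idea of what "figuring out the map themselves" means computationally, here's a toy sketch of waypoint-free pathfinding - a breadth-first search over raw map geometry (the grid and coordinates are invented):

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Breadth-first search over a tile grid ('#' = wall). No waypoints:
    the bot derives a route purely from the map geometry - far more work
    than hopping between a handful of hand-placed nodes."""
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if (0 <= ny < len(grid) and 0 <= nx < len(grid[0])
                    and grid[ny][nx] != "#" and nxt not in came_from):
                came_from[nxt] = cur
                frontier.append(nxt)
    return None  # unreachable

grid = ["....",
        ".##.",
        "...."]
print(bfs_path(grid, (0, 0), (3, 2)))
```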

You do also realize the ethical implications of it, right? It cheapens the meaning of war if you've got a big mouth and can then hide behind your machines. "He's pissing me off - let's throw some drones at them."

Or, God forbid (and I'm not even religious), they decide to make a centralized network and we end up with Terminators.
 