Rumor: AMD will be the GPU choice on all three next generation consoles

Most smartphone owners are iPhone owners; the ones who go with Android or Windows Mobile are a minority (constantly increasing, of course), and within that minority there are many different manufacturers producing chips for phones. Nvidia would be smart to continue improving tech for consoles; it's a fast market that will turn over large profits in a short time, rather than drizzling out phone tech over a few years.

Then again, Nvidia is a big company, they could easily do both.
 
I don't see how you could come to that conclusion. Hundreds of millions of chips sold for incredibly easy-to-produce tech (after all, it's a relatively low transistor count on old tech; very mature and very easy to fab).

Tablets don't come even close to this market yet.

ATI has sold over 140M chips for consoles in the six years ending this November. That's 23.3M per year. That rivals smartphone sales (from well before the smartphone boom) and completely dwarfs tablet sales.
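A quick sanity check on that arithmetic (the 140M and six-year figures are taken from the post above):

```python
# Back-of-the-envelope check on ATI's console chip run rate.
total_chips = 140_000_000  # console GPUs sold over the period (figure from the post)
years = 6

per_year = total_chips / years
print(f"~{per_year / 1_000_000:.1f}M chips per year")  # roughly 23.3M/year
```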

Add to that the fact that pushing new tech is costly and its yields are lower.
I'd be curious to see what the economics truly look like. AMD is probably giving the console manufacturers a huge break on the chips, so margins are going to be pretty low. Even though the volumes are high and the tech is mature early in the console's lifecycle, there's also the fact that such high volumes mean production capacity and wafers are being used for parts that may be much less profitable than what could otherwise be made on the same production line.

Most smartphone owners are iPhone owners; the ones who go with Android or Windows Mobile are a minority
Nope, Android has a pretty large lead in market share at this point.
 
I don't see how you could come to that conclusion. Hundreds of millions of chips sold for incredibly easy-to-produce tech (after all, it's a relatively low transistor count on old tech; very mature and very easy to fab).
The graphics chip makers aren't getting much from console chips after the initial license. Despite all the griping it did, the deal Nvidia got on the original Xbox seems to have been unusually generous, and it wasn't repeated in the current generation. The upfront payment for the current-generation consoles is most of what the graphics chip maker gets over the console's lifetime.

AMD (and Nvidia) don't control manufacturing for the GPUs used in all three consoles, and they aren't selling GPUs to the console makers the way they do to AIB makers. The console makers have the right, as part of the deal, to manufacture those chips wherever they please.

Royalties from console sales are a few million dollars per quarter, a tiny portion of total graphics chip sales. It is of course better to get the deal than not, but it's not a huge source of revenue.
 
The graphics chip makers aren't getting much from console chips after the initial license. Despite all the griping it did, the deal Nvidia got on the original Xbox seems to have been unusually generous, and it wasn't repeated in the current generation. The upfront payment for the current-generation consoles is most of what the graphics chip maker gets over the console's lifetime.

AMD (and Nvidia) don't control manufacturing for the GPUs used in all three consoles, and they aren't selling GPUs to the console makers the way they do to AIB makers. The console makers have the right, as part of the deal, to manufacture those chips wherever they please.

Royalties from console sales are a few million dollars per quarter, a tiny portion of total graphics chip sales. It is of course better to get the deal than not, but it's not a huge source of revenue.

You know, in the Video Card forum thread about the Wii U GPU possibly being based on RV7xx, the idea that AMD (ATI)/Nvidia actually sell the console makers GPUs constantly permeated the thread, even when people were corrected... Where do people get that idea?

Don't they realize that the consoles themselves are priced similarly to a whole graphics card, but contain a lot more? This is also why I don't know how people think the PS4/Xbox 3 will even use an Evergreen-based GPU. I kinda feel that all three will use 'RV7xx-esque' chips.
 
I'd be curious to see what the economics truly look like. AMD is probably giving the console manufacturers a huge break on the chips, so margins are going to be pretty low. Even though the volumes are high and the tech is mature early in the console's lifecycle, there's also the fact that such high volumes mean production capacity and wafers are being used for parts that may be much less profitable than what could otherwise be made on the same production line.


Nope, Android has a pretty large lead in market share at this point.

The OS, yes, but the hardware frag (yes, I said it!) doesn't let Nvidia truly benefit from going mobile-only.
 
I would be curious to know the economics as well: what the discount looks like per 1,000 chips, and whether there's a volume you have to hit to get that discount.
 
Zarathustra[H];1037509761 said:
ATI has no say in the matter. Nvidia is the one that's blocking it.

Either way I don't get the point though, as PhysX is pretty much dead.

From what I understood, ATI CAN use it for free with a license from Nvidia, but ATI chooses not to.
Some games, like Batman: AA and the next Batman, Arkham City, use it well.
Bummer.
 
From what I understood, ATI CAN use it for free with a license from Nvidia, but ATI chooses not to.
Some games, like Batman: AA and the next Batman, Arkham City, use it well.
Bummer.
What are you talking about!? Are you talking about that deal where Nvidia offered to let AMD develop a compatibility layer for CUDA? If so, it would be a terrible idea to actually accept that proposal. PhysX (and probably all other apps) would run much better on Nvidia hardware than on AMD's, since CUDA is designed around Nvidia cards. ATI cards are not similar enough for that to be viable, and even if (big if) AMD cards could run it at comparable speed, AMD would still be at the mercy of Nvidia if they decided to pull the rug out and go another direction.

Here is another article I found on the topic.
http://www.extremetech.com/computing/82264-why-wont-ati-support-cuda-and-physx
 
Yep. And AMD is already beholden to Intel for all their technology, so it's not a mistake they're likely to make again.
 
What do you guys think about Kyle's article and the rumor that AMD will supply the GPU for all three next-gen consoles? Could this be the final blow to PhysX? They would at least have to port it to OpenCL.

http://www.hardocp.com/article/2011/07/07/e3_rumors_on_next_generation_console_hardware

I certainly hope so; AMD GPUs have been far more efficient and cost-effective than the competition.

Also, PhysX is a joke. Modern CPUs can easily handle the load without the overhead of NVIDIA's software layer.
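For a sense of scale, here's a minimal sketch of what basic CPU-side physics costs: naive Euler integration of 10,000 bouncing particles per 60 Hz frame, in plain Python. This is purely illustrative and nothing like PhysX's actual implementation; a real engine written in C++ would be orders of magnitude faster still.

```python
# Minimal sketch: naive Euler integration of point particles under gravity.
# Illustrates CPU-side physics cost only; NOT PhysX's actual algorithm.

GRAVITY = -9.81  # m/s^2
DT = 1.0 / 60.0  # one 60 Hz frame

def step(particles, dt=DT):
    """Advance each (y, vy) particle one frame; bounce off the floor at y=0."""
    out = []
    for y, vy in particles:
        vy += GRAVITY * dt
        y += vy * dt
        if y < 0.0:                # crude floor collision
            y, vy = 0.0, -vy * 0.5 # lose half the speed on impact
        out.append((y, vy))
    return out

# Ten thousand particles per frame is trivial work for a modern CPU core.
particles = [(10.0, 0.0)] * 10_000
particles = step(particles)
print(particles[0])
```

Even this unoptimized loop finishes a frame's worth of particles in well under a millisecond on recent hardware, which is the poster's point about CPUs handling the load.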
 
You know, in the Video Card forum thread about the Wii U GPU possibly being based on RV7xx, the idea that AMD (ATI)/Nvidia actually sell the console makers GPUs constantly permeated the thread, even when people were corrected... Where do people get that idea?

Don't they realize that the consoles themselves are priced similarly to a whole graphics card, but contain a lot more? This is also why I don't know how people think the PS4/Xbox 3 will even use an Evergreen-based GPU. I kinda feel that all three will use 'RV7xx-esque' chips.

It all depends on when they started the R&D process for the console; typically it's around two years. So if they release in 2014, then most likely it will be an Evergreen GPU, which means if they're planning a 2013 launch you'd be looking at an RV700-based GPU like the Wii U.

My hope is that it doesn't get finalized till next year and Sony/Microsoft use the 28nm VLIW4 GPUs.
 
It all depends on when they started the R&D process for the console; typically it's around two years. So if they release in 2014, then most likely it will be an Evergreen GPU, which means if they're planning a 2013 launch you'd be looking at an RV700-based GPU like the Wii U.

My hope is that it doesn't get finalized till next year and Sony/Microsoft use the 28nm VLIW4 GPUs.

I'm going to guess 28nm will be used. I expect AMD and IBM to be the main vendors again, but ARM could be a wild card. The GPU architectures in consoles are typically heavily modified, to the point that the starting architecture is almost gone.

On the software side, PhysX is useless on consoles. They use their own code bases and APIs. I'm not sure how much portability there is with Direct3D (D3D9Ex) for the Xbox 360 and OpenGL ES (not certain) for the PS3.

Edit: I'm starting to forget. There is an Xbox 360 implementation of PhysX.
 
The PS3 uses PhysX too: http://news.cnet.com/8301-13924_3-10198809-64.html

AFAIK, the consoles just run PhysX on the CPU. Heck, even the iOS games using the UE3 engine can run PhysX: http://www.tgdaily.com/games-and-en...00-first-physx-enabled-games-arrive-on-iphone

But I just thought of something: maybe Nvidia will be FORCED to run PhysX on AMD hardware now that it will be in all the consoles.
I'm going to answer that with a NO. Nvidia will never relent. Though, they would gladly accept AMD licensing PhysX via a CUDA-to-OpenCL driver/wrapper.
 
I'm going to answer that with a NO. Nvidia will never relent. Though, they would gladly accept AMD licensing PhysX via a CUDA-to-OpenCL driver/wrapper.
I'm guessing Nvidia will be feeling pressure from game devs now.

The devs will basically be in a position to give Nvidia an ultimatum: let us run PhysX on AMD hardware, or else we'll use a different API.
 
Why would AMD license CUDA? They have stream processors on their cards; it's essentially the same thing, parallel processing. Nvidia has just been better at getting software support from devs, that's all.
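The "essentially the same thing" point boils down to both CUDA cores and AMD's stream processors executing one independent work item per data element. A hypothetical pseudo-kernel sketch in plain Python (a GPU would run each iteration in parallel rather than looping):

```python
def vector_add(a, b):
    # On a GPU, whether Nvidia's CUDA cores or AMD's stream processors,
    # each element's addition below would be one independent work item,
    # all executed in parallel instead of sequentially as this loop does.
    return [x + y for x, y in zip(a, b)]

print(vector_add([1, 2, 3], [4, 5, 6]))  # [5, 7, 9]
```

The hard part isn't the parallel model, which both vendors share; it's the software ecosystem built around CUDA specifically, which is the poster's point.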
 
They would license it because they don't have their own technology, and they're threatened by CUDA's popularity undercutting their market.
 
OpenCL is probably going to have more momentum if something like this is true. That being said, AMD has some serious problems with their OpenCL platform that are holding it back from general acceptance... CUDA is generally more reliable, stable across architectures w.r.t. performance, and easier to optimize for. AMD is unreliable, unstable, and it's difficult to get peak performance on their cards (industry experience on this one).
 
For those looking for physics on AMD cards, an easy example is DDO & LOTRO, where AMD programmed basic water physics (I think based on DirectCompute & DX11) for game developer Turbine. I would say this is on par with flying papers...

So yes, AMD can do physics... AND it even runs on Nvidia hardware.
 
CUDA is generally more reliable, stable across architectures w.r.t. performance, and easier to optimize for. AMD is unreliable, unstable, and it's difficult to get peak performance on their cards (industry experience on this one).

Can you show some links supporting your statement?

Which area are you talking about? Consumer, prosumer, professional, medical/scientific, etc.
 