Rumor: AMD will be the GPU choice on all three next generation consoles

PhysX has never been anything but a buzzword; the few games that used it probably never sold well, otherwise they would have continued using PhysX.

And like Kyle said, it was more or less a business decision for all 3 of them rather than something hardware-related. However, I didn't see MS swap back to Nvidia.
 
Could this be the final blow to PhysX? They would at least have to port it to OpenCL.
Nvidia is too stubborn to do an OpenCL port, and OpenCL is irrelevant to consoles anyways. It would be possible for either Nvidia or game developers to port code to whatever native GPGPU APIs run on the consoles, or for another physics middleware developer to take advantage of the opportunity coming with next gen consoles.

3 for 3 posts so far have completely confused PhysX with GPU accelerated PhysX. :p PhysX is a multi-platform physics middleware engine that runs on a variety of CPU architectures. GPU accelerated PhysX is a Windows-centered implementation based on CUDA, which runs the same simulation in CPU code if GPU support is not available/enabled.

So no, this won't be "the final blow to PhysX". Seriously, was one single moment of thought put into that question?
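The fallback described above can be sketched in a few lines. This is a hypothetical dispatch pattern, NOT the real PhysX API (the names here are made up for illustration): the engine runs the same simulation either way and simply picks the GPU path only when a CUDA-capable device is present and enabled.

```cpp
#include <cassert>

// Hypothetical sketch of the dispatch described above -- not the real
// PhysX API. The simulation is identical either way; only the execution
// path differs.
enum class Backend { Gpu, Cpu };

struct SimConfig {
    bool gpuEnabled;       // user/game asked for GPU acceleration
    bool cudaDeviceFound;  // would come from a device query in a real engine
};

inline Backend selectBackend(const SimConfig& cfg) {
    if (cfg.gpuEnabled && cfg.cudaDeviceFound)
        return Backend::Gpu;
    return Backend::Cpu;   // silent fallback: the simulation still runs
}
```

The point being: losing the GPU path doesn't kill the middleware, because the CPU path is always there.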
 
I think this would be awesome! It would be a truckload of cash to AMD, which should translate into more R&D in the CPU market, which means better performing processors, which means better competition, and that's great for folks like us.
 

Look who is here! an Intel troll! There is only one PhysX. If your machine does not have a CUDA capable GPU and the program/game supports it, then PhysX will run on the CPU. Seriously, stop looking for reasons to troll.

Tired of the name calling and trolling. Enjoy your week off. -Oldie
 
Look who is here! an Intel troll!
Namecalling. Again. :rolleyes:

I wrote about why your question was irrelevant to the console rumor. Want to dispute what I wrote or just call names? I have yet to see you ever engage in any technical discussions, so I think I know the answer already.
 
The Wii, PS3 and the 360 make use of PhysX. It is strictly 'software' PhysX, and is not the same as the PhysX middleware library on the PC. Even some ARM-based products license PhysX. So, as hard as it is for me to admit, pxc is right. :D
 

I just told you that there is only one PhysX. It can run on either the GPU (if you have a CUDA capable GPU) or the CPU. What do you mean GPU PhysX is Windows-centered? That is just because of driver support.
 

PhysX is all the same. The versions you are talking about just have different limitations.
 
I just told you that there is only one PhysX.
Incredible...

There is a PPU implementation of PhysX (no longer updated), a CPU implementation of PhysX (now in 3rd major revision), a CUDA based version available when a G80 or later GPU is installed, a Wii version of PhysX, a Cell-optimized version of PhysX on the PS3, a version for the XB360 and there are versions compatible with Mac and Linux.

If you had bothered to read the post you became RAGEON over, I pointed out that the first 3 posters confuse PhysX (like the versions that run on all current consoles and PC/Mac) with GPU accelerated PhysX (which is a PC-centric feature).

If anything, GPU accelerated PhysX is a minor part of PhysX overall, especially since the binary license is free. No, PhysX is not going anywhere. It's a much cheaper alternative to Havok Physics.
 

this. physx middleware is still the most widely used over havok and bullet. gpu physx as a subset will always remain a small niche since it is proprietary to nvidia gpus.

p.s. not really a rumor at this point; all three will most definitely use amd this go round. sony was the only one to use nvidia for the current gen.
 

Ok so can you explain to me the difference with these different versions of PhysX?
 

OpenCL isn't irrelevant; Sony doesn't have DirectX, or at least I thought.
 
Open CL isn't irrelevant,
On consoles, I wrote. And OpenCL (Open Computing Language) != OpenGL (Open Graphics Library). :p

There is no need for a general API like OpenCL on consoles. The hardware platform is fixed and native APIs can be exposed for maximum performance. It's how all consoles allow interfacing to GPU or other specialized hardware.

For example, people often say that the Wii and PS3 use OpenGL, but that's not accurate. Both use a graphics API somewhat similar in function to OpenGL, but which is not OpenGL. Each uses a proprietary API that exposes the hardware as directly as possible. Even the XB360's DirectX is significantly different from the one that runs under Windows, exposing hardware at a lower level and without a graphics "driver".
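To make that concrete, here is a toy sketch of how cross-platform middleware typically picks the native backend with compile-time switches rather than going through a generic runtime layer like OpenCL. The PLATFORM_* macros are placeholders invented for this sketch, not real SDK guards:

```cpp
#include <cassert>
#include <cstring>

// Toy illustration: on a fixed console, middleware selects the native
// API at compile time. The PLATFORM_* guards below are made up.
inline const char* gpuBackendName() {
#if defined(PLATFORM_PS3)
    return "libgcm";                   // Sony's low-level native library
#elif defined(PLATFORM_XBOX360)
    return "console Direct3D variant"; // lower-level than desktop D3D
#else
    return "desktop API";              // PC build: D3D/OpenGL/etc.
#endif
}
```

Since the hardware never changes, there's no value in the device-enumeration and portability machinery a generic API like OpenCL exists to provide.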
 

Ohhhh, I see.
 
Ok so can you explain to me the difference with these different versions of PhysX?
They have different optimizations and aren't necessarily identical to the x86 CPU path or Windows GPU path. This stuff is even in the SDK announcements. C'mon.

From the latest one:
Focus on consoles and emerging gaming platforms.

PhysX SDK 3.0 was designed to be competitive on current-gen consoles and anticipates devices with even less system resources. These architectural changes include but are not limited to better overall memory management, improvements to cache efficiency, cross-platform SIMD implementations, intelligent SPU usage on PS3, multi-threading across multiple cores, and AltiVec/VMX optimizations on Xbox 360.

You claimed there was one single version, as if the PC version with CUDA and CPU (x87 default or SSE2 if compiled from source by a licensee) paths was used on all devices (seriously?). When it was shown there are specific versions tailored for multiple consoles, you dismissed that. So just keep moving the goal posts. :rolleyes:

At 1/2 to 1/3rd the price of a Havok Physics license, with source code, PhysX isn't going anywhere. On Windows (and consoles/Linux/Mac?), a binary-only PhysX license is free. It's a professional quality physics middleware product, for free. Get it now?
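The "cross-platform SIMD implementations" from the SDK notes quoted above usually mean something like the following: one vector type behind a single interface, compiled down to SSE on x86, AltiVec/VMX on the Xbox 360, or SPU intrinsics on the PS3. This is an illustrative sketch of the x86 path only, not NVIDIA's actual code:

```cpp
#include <cassert>
#include <emmintrin.h>  // SSE2 intrinsics (the x86 path of this sketch)

// Illustrative only, not NVIDIA's code: one Vec4 type that would map to
// AltiVec/VMX or SPU intrinsics on the consoles behind the same interface.
struct Vec4 { __m128 v; };

static inline Vec4 add(Vec4 a, Vec4 b)    { return { _mm_add_ps(a.v, b.v) }; }
static inline Vec4 scale(Vec4 a, float s) { return { _mm_mul_ps(a.v, _mm_set1_ps(s)) }; }

// Core of any rigid-body integrator: p += v * dt, four lanes at once.
static inline Vec4 integrate(Vec4 p, Vec4 vel, float dt) {
    return add(p, scale(vel, dt));
}
```

Swap out the intrinsics per platform and the physics code above it never has to change, which is exactly why a single "PhysX" can ship tuned builds for every console.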

duby?
 
I think this would be awesome! This would be a truckload of cash to AMD which should translate into more R&D in the CPU market which means better performing processors which means better competition and that's great for folks like us.

Or more development in their GPU department, but either way I agree, it's great for AMD.
 
Open CL isn't irrelevant, Sony doesn't have DirectX or at least i thought.

I have a friend who did some Sony titles. He was rejoicing that the PS3 dev kit (video card-wise) was very close to how you do stuff in DirectX (there are of course optimizations...).
 
AMD's GPUs have been more power efficient lately... so maybe this means the Xbox won't be soooo damn hot and need a brick the size of a small microwave.
 
AMD has been doing overall better than nVidia on the whole lately, so it's not too surprising they'd carry the momentum into consoles. Pretty decent win for them if it comes true.
 
Since when does using ATI GPUs mean you can't use PhysX anymore? You just can't use the hardware accelerated version, which consoles don't use anyways.
 
AMD has been doing overall better than nVidia on the whole lately, so it's not too surprising they'd carry the momentum into consoles. Pretty decent win for them if it comes true.
The XB360 and Wii currently use ATi, so the PS3 is the only change next generation if the rumor is true. It will be interesting to see how PS3 backwards compatibility, if any, works on the PS4. Not that BC will be an easy task for the other consoles, but it seems more possible on Wii 2 and XBN. The XB BC on the XB360 was pretty underwhelming, while the Wii's GC compatibility was very good.

Nvidia seems to have the opposite of momentum right now. :p Losing Apple and (possibly) Sony has to hurt because of how high profile those wins were. Tegra 2 tablets aren't doing well enough to make up for the loss of Apple. And that Sony bypassed Nvidia's Tegra 3 on Playstation Vita was a huge snub.
 

Even if Sony had gone with an Nvidia GPU and a more complex Cell processor, they still probably would not have BC for the PS3. They "might" do PS1 and PS2, but I doubt they make the same mistake they made with the PS3.

They have already stated the Vita was primarily focused on ease of development, which IMO is awesome, so hopefully they go with the Bulldozer version Kyle mentioned. It would be a win/win for everyone.

A PS4 based around an entire AMD ecosystem would be pretty amazing.
 
They have already stated the Vita was primarily focused on ease of development, which IMO is awesome, so hopefully they go with the Bulldozer version Kyle mentioned. It would be a win/win for everyone.
Vita is the PSP successor based on a quad core ARM Cortex A9 processor and a PowerVR SGX543MP4+ (4 module, about 2x faster than the iPad 2's SGX543MP2 GPU, which is no slouch). It should be incredibly fast for a handheld.

Sony is not going with BD in PS4. ;) Using BD-based Trinity would make the PS4's performance weak even compared to Wii 2.
 
As much as I like Bulldozer, moving to it -- or pretty much any x86 architecture -- would be a mistake. For games and media apps, the Cell or similar stream-like processors are what's important. x86 would be too complex and is too general of a processor, at least with the public info we have right now.

Also, an updated cell + new GPU would probably make it more easily backwards compatible.
 
Sony is not going with BD in PS4. ;) Using BD-based Trinity would make the PS4's performance weak even compared to Wii 2.

Can you get technical with that second statement please? I would have thought a set PC environment would be much more developer friendly than anything else, especially Cell.
 
Zoomer said:
Also, an updated cell + new GPU would probably make it more easily backwards compatible.

Even if it was the perfect hardware to do so, I seriously doubt Sony will do it with the PS3.
 
It's a real shame. I think Sony missed a good opportunity with the PS3, as people whose PS2s conked out might be persuaded to buy a PS3 instead if it could still run their entire library of PS2 games. Vendor lock-in, anyone? It's too bad they had to compete with the likes of Microsoft, but that's OT.

I wonder if the graphics will be a VLIW4 derivative, or based on GCN. Consoles don't need the excessive programmability of GCN/Tesla-like architectures.
 
AMDs have been more power efficient lately...so maybe this means the xbox wont be soooo damn hot and need a brick the size of a small microwave.

Too bad it's a bit early for any console to use a Fusion processor with a special Linux build. I can just imagine a 100 watt chip with a barebones motherboard delivering today's 6870 performance, which is plenty for a console's highest format, 1080p60. Coding, as well as cross-platform coding, would get so much easier. This would need 22nm though :/
 
Can you get technical with that second statement please? I would have thought a set PC environment would be much more developer friendly than anything else especially cell.
A PC architecture is developer friendly. The second statement had to do with Trinity's (dual/quad core Bulldozer based Fusion chips) competitiveness against Wii 2 and XBN GPU architecture.

Trinity uses a Cayman-based IGP, and with the shared memory architecture it will likely not be faster than even Wii 2's HD4000 series GPU. I took Sony's statement as a move more towards the XB360 and Wii 2 architecture: just multi-core CPU + GPU, not a specialized Cell with CPU + multiple SPEs + GPU, and not that Sony would pull an XBox 1 (which was basically a PC in console form factor). :p

What benefit would Sony get from using Trinity? Certainly not a significant CPU performance lead or anything even approaching last generation mid-level graphics performance (i.e. HD 5770 will still be much faster than Trinity's IGP). I doubt the PS4 would burn anywhere near 70-80W on the CPU portion alone, which is what Trinity would bring at near competitive performance. It's going to use a newer POWER/PPC architecture (with SMT... aka "HyperThreading" since that works well on consoles). :p Reinventing the wheel for toolsets and other code optimized for PPC to x86 isn't going to happen.
 

Oh, that's not what I was talking about. I meant a Bulldozer CPU + Radeon 6XXX level GPU, not some unified CPU/GPU combo. Unless you still think that's a bad idea?
 
Oh, that's not what I was talking about. I meant a Bulldozer CPU + Radeon 6XXX level GPU, not some unified CPU/GPU combo. Unless you still think that's a bad idea?

Too expensive. If the console is to be less than $400, then you're looking at $100 max for CPU+GPU. As far as CPU instructions go, the console makers are in their own little world. Do any of them use x86 in their current gen console?
 
I don't think that nVidia really cares about the console market; it's not exactly a gold mine on the hardware side. I think they are far more interested in tablets, as that's a bigger and more profitable market overall, I would think. I do wonder about nVidia's interest in the dedicated GPU market because of tablets.
 
I don't think that nVidia really cares about the console market, it's not exactly a gold mine on the hardware side
I don't see how you could come to that conclusion. Hundreds of millions of chips sold for incredibly easy to produce tech (after all, it's a relatively low transistor count on old tech; very mature and very easy to fab).

Tablets don't come even close to this market yet.

ATi has sold over 140M chips for consoles in six years this Nov. That's 23.3M per year. That rivals smartphone sales (from a time well before the smartphone boom) and completely dwarfs tablet sales.

Add to that the fact that pushing new tech is costly and its yields are lower.
 

Thanks for mentioning phones, I forgot to bring that up. Phones, and tablets especially when Windows 8 hits, will be a much bigger market than consoles. nVidia simply hasn't put the effort into consoles that it seems to be putting into mobile, is all I'm saying, and I think they would rather be a player in the mobile space than consoles.
 