Sony Patent Fuels Speculation of PlayStation 5 Pro with Dual GPUs

Tsing

The FPS Review
Staff member
Image: Sony



Nobody knows for sure if Sony will echo its previous generation with a more beastly console variant, but if a PlayStation 5 Pro does end up happening, it could be twice as powerful as the standard model. As spotted by T3, Sony Interactive Entertainment has published a patent alluding to a gaming console that leverages not one, but two GPUs.



“In a multi-GPU simulation environment, frame buffer management may be implemented by multiple GPUs rendering respective frames of video, or by rendering respective portions of each frame of video,” an abstract reads.



“One of the GPUs controls HDMI frame output by virtue of...

Continue reading...
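For anyone trying to picture the two modes the abstract describes, here is a minimal C++ sketch of how work could be assigned: either each GPU renders whole, alternating frames, or each GPU renders a portion of every frame. The GPU count, frame height, and scanline split below are made up for illustration and have nothing to do with Sony's actual hardware or APIs.

```cpp
// Toy illustration of the two multi-GPU modes described in the abstract:
//  (a) each GPU renders whole, alternating frames
//  (b) each GPU renders a portion (here, a band of scanlines) of every frame
// All numbers are arbitrary; this is not based on any real console API.
#include <cstdio>

int main() {
    const int numGpus = 2;
    const int numFrames = 4;
    const int frameHeight = 2160;  // scanlines per frame, e.g. a 4K target

    // Mode (a): alternate-frame rendering -- frame i goes to GPU (i % numGpus).
    std::puts("Mode (a): whole frames alternate between GPUs");
    for (int frame = 0; frame < numFrames; ++frame) {
        std::printf("  frame %d -> GPU %d renders all %d lines\n",
                    frame, frame % numGpus, frameHeight);
    }

    // Mode (b): split-frame rendering -- every frame is divided into bands,
    // one band per GPU, so both GPUs contribute to the same frame.
    std::puts("Mode (b): each frame is split across both GPUs");
    const int band = frameHeight / numGpus;
    for (int frame = 0; frame < numFrames; ++frame) {
        for (int gpu = 0; gpu < numGpus; ++gpu) {
            std::printf("  frame %d -> GPU %d renders lines %d..%d\n",
                        frame, gpu, gpu * band, (gpu + 1) * band - 1);
        }
    }
    return 0;
}
```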


 
Didn’t one of the MS consoles do this, and the second GPU was just the previous Gen for backwards compatibility?
 
Ugh, stop it. Normal humans cannot even buy a PS5 at full retail MSRP right now, and we're already talking about PS5 Pro?

And I'm not sure if SLI in a console is a good or bad thing. We all know what happened to SLI in PC land (it dead).
 
The original PS3 did that. I believe it originally shipped with the Emotion Engine (or whatever the PS2 chipset was called).
Yeah, because the PS2 is too complicated to emulate through software. PCSX2 does okay, but it is nowhere near close to looking and performing correctly. With direct knowledge of the hardware, Sony could probably do better, but there's no way the PS3's hardware was powerful enough to do it.
 
IIRC this is sort of how AMD did it when CrossFire first appeared. You needed a master card and a slave card, or something like that.
 
In the meantime, here’s a neat video that hints at what some of Sony’s PS5 revisions might look like.

I think you mean what the PS5 revisions will NOT look like.

I'd love to see the "concept" PS5 render that is the closest to the real one. No one got it, not by any stretch.
And don't even get me started on the Series X "concepts".
 
The main thing that killed PC SLI was Unreal Engine 4's incompatibility in DX11.
There is no reason a closed ecosystem couldn't massively benefit from multi-gpu.
 
The main thing that killed PC SLI was Unreal Engine 4's incompatibility in DX11.
There is no reason a closed ecosystem couldn't massively benefit from multi-gpu.
Consoles are no longer the closed ecosystems they used to be a few generations back. They mostly use the same engines on console now, including Unreal Engine. If an engine supported SLI rendering on a console, there is no reason for it not to support it on PC, except for artificial barriers.
 
Consoles are no longer the closed ecosystems they used to be a few generations back. They mostly use the same engines on console now, including Unreal Engine. If an engine supported SLI rendering on a console, there is no reason for it not to support it on PC, except for artificial barriers.
There's still quite a bit more to it than that.

Technically, SLI (and CFX) are supported today.

The main issues aren't really any different than they ever have been; they're just currently not worth developers' time to address, and so much so that hardware vendors have stopped trying themselves.

However, that changes in a 'closed' ecosystem. If the PS5 Pro is simply a PS5 + second GPU die, then developers have no choice but to broach the topic. A big part of that would be figuring out split-frame rendering.

Which gets us to why the technology has been mostly abandoned on the desktop: the most direct solution for multi-GPU is alternate-frame rendering, which, when well implemented (as by Nvidia, say), results in a ~90% increase in framerates along with about a frame or so of added input lag.

When implemented poorly (see most of ATI/AMD's efforts up until the last five years or so), you get worse performance in terms of frametimes along with all of the input lag.

This is what Sony can potentially 'fix' by forcing developers on to a specific solution. If they can get split-frame rendering working well where the rendering load for the current frame is truly shared between rendering hardware units without incurring an input lag penalty, then they have a chance at making it successful.
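To make that latency point concrete, here is a back-of-the-envelope C++ sketch comparing alternate-frame rendering with an idealized split-frame setup on two GPUs. This is purely a toy timing model with made-up numbers, not anything from the patent: with AFR, frames arrive twice as often but each individual frame still takes a full single-GPU render time from start to present, while an ideal split roughly halves the per-frame time itself.

```cpp
// Toy timing model comparing alternate-frame rendering (AFR) with an ideal
// split-frame rendering (SFR) setup on two GPUs. Numbers are illustrative only.
#include <cstdio>

int main() {
    const double singleGpuRenderMs = 16.0;  // time for one GPU to render one frame
    const int frames = 4;

    // AFR: GPUs are staggered so a frame starts every 8 ms, but each frame
    // still needs the full 16 ms on its GPU -- frame pacing doubles while
    // start-to-present latency stays at single-GPU levels (the "frame or so
    // of input lag" relative to the higher framerate).
    std::puts("AFR (two GPUs, staggered starts):");
    for (int i = 0; i < frames; ++i) {
        double start = i * (singleGpuRenderMs / 2.0);
        double present = start + singleGpuRenderMs;
        std::printf("  frame %d: start %5.1f ms, present %5.1f ms, latency %4.1f ms\n",
                    i, start, present, present - start);
    }

    // Ideal SFR: both GPUs share each frame, so the frame finishes in ~8 ms.
    // Same pacing as AFR, but roughly half the latency. (Real SFR rarely
    // splits this cleanly, which is exactly the hard part.)
    std::puts("Ideal SFR (two GPUs per frame):");
    for (int i = 0; i < frames; ++i) {
        double start = i * (singleGpuRenderMs / 2.0);
        double present = start + singleGpuRenderMs / 2.0;
        std::printf("  frame %d: start %5.1f ms, present %5.1f ms, latency %4.1f ms\n",
                    i, start, present, present - start);
    }
    return 0;
}
```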

Just don't go thinking that such a solution would carry over to the desktop. Sony is likely to use some riced-up interconnect and caching scheme to make this work; it is both highly bandwidth- and latency-sensitive and likely won't port cleanly.

Now, that isn't to say that it can't be done; rather, it will take hardware, OS, and software developers to commit to the same level of effort that they've been avoiding these past few years. And to that end, it will take a real motivating factor, something like an accessible VR headset that does 4k120 per eye or someone actually finding 8k gaming to be useful or something.
 
Maybe the second GPU would be something small, but entirely dedicated to machine learning based upscaling. That would probably be more beneficial to a console than trying to brute force their way to native resolution.

It could also be how they're planning on doing PS VR2 - hybrid system that can work like a dedicated mobile console (ie: Vita2) or can interface with the PS5 for VR software.

Fun to theorize what it could be, anyway.
 
Hey Sony, how about dual SSDs instead of GPUs, you hosers? You know, so we can have more than 3 games installed at once? :mad:
Turns out that apparently the PCI-E controller Sony uses for storage isn't exactly standard, which is why the expansion slot is currently disabled. Maybe Sony isn't better than Microsoft after all, in that they're probably extorting storage manufacturers for money to be certified compatible with the PS5.
 
Turns out that apparently the PCI-E controller Sony uses for storage isn't exactly standard, which is why the expansion slot is currently disabled. Maybe Sony isn't better than Microsoft after all, in that they're probably extorting storage manufacturers for money to be certified compatible with the PS5.

Well, I'm certain you need to pay a licensing fee to get officially endorsed by Sony. But if it is non-standard (and that's a big contingency, as it hasn't been proven one way or the other, apart from Sony having claimed in the past that it was standard), then yeah, that's dirty pool. Double dirty, since it uses a standard M.2 connector and you would think you could plug just about any old thing in there... I hate it when people use proprietary protocols with industry-standard connectors/plugs/etc.
 
There's still quite a bit more to it than that.
Thank you for the wall of text, but you are not convincing me. If the developer takes the time to optimize their game for SLI rendering, there is no good reason not to carry it over between platforms.
Technically, SLI (and CFX) are supported today.
They really are not, because there is no effort put in to make it work. If the effort is put in to make it work on one platform, then it is a much smaller task to carry that over to another.
 
Well, I'm certain you need to pay a licensing fee to get officially endorsed by Sony. But if it is non-standard (and that's a big contingency, as it hasn't been proven one way or the other, apart from Sony having claimed in the past that it was standard), then yeah, that's dirty pool. Double dirty, since it uses a standard M.2 connector and you would think you could plug just about any old thing in there... I hate it when people use proprietary protocols with industry-standard connectors/plugs/etc.

AFAIK it is standard, but it needs to be at least as fast as Sony's internal SSD so they can guarantee it will work as well as theirs. Your statement about plugging in any old drive is exactly the reason: if Sony's devs optimize their games for a certain level of performance and you plug in an SSD with half that performance, things might run really, really badly, and you would be complaining about that instead.
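As a rough illustration of that rule, a compatibility check could boil down to comparing a drive's rated sequential read against the internal drive's throughput. The 5,500 MB/s baseline below is the commonly cited figure for the PS5's internal SSD; the candidate drives and their numbers are invented for the example.

```cpp
// Rough sketch of the "at least as fast as the internal SSD" rule.
// The 5500 MB/s baseline is the commonly cited internal-drive figure;
// the candidate drives and their specs are invented for illustration.
#include <cstdio>

struct DriveSpec {
    const char* name;
    int seqReadMBps;  // rated sequential read, MB/s
};

int main() {
    const int internalBaselineMBps = 5500;
    const DriveSpec candidates[] = {
        {"Fast Gen4 drive", 7000},
        {"Slow Gen4 drive", 4800},
        {"Gen3 drive",      3500},
    };

    for (const DriveSpec& d : candidates) {
        const bool ok = d.seqReadMBps >= internalBaselineMBps;
        std::printf("%-16s %4d MB/s -> %s\n", d.name, d.seqReadMBps,
                    ok ? "meets the internal-drive baseline"
                       : "below baseline; games tuned for the internal SSD may struggle");
    }
    return 0;
}
```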
 
Thank you for the wall of text, but you are not convincing me. If the developer takes the time to optimize their game for sli rendering, there is no good reason to not carry it over between platforms.
A hypothetical PS5 Pro would be one hardware target that required implementation.

Where the failure to carry over could come from is that they will be using Sony's hardware and Sony's APIs. The hardware may or may not be representative of PC multi-GPU, but the APIs definitely aren't.

Now, I do want to point out that I agree with you on one part: if developers do their part, that's definitely a significant step closer to having support on the desktop. There's just a pretty big gap of unknowns to bridge afterward.

They really are not, because there is no effort put in to make it work. If the effort is put in to make it work on one platform, then it is a much smaller task to carry that over to another.
You can run a game with a current operating system, hardware, drivers and so on with multi-GPU enabled.

But yes, support is limited to a handful of games at best, and to the very highest-end hardware (in Nvidia's case, don't know about AMD at the moment). Main point is that we haven't gone backward in terms of componentry, just industry interest, which I think is what you're saying.

And overall I do agree that Sony broaching the topic is good for overall interest!
 
Ever since multi-GPU was invented, we have been promised a way for multiple GPUs to work as one. Apparently Intel is working on such a solution, but AMD and Nvidia have dropped it, at least for now.

I really don't see Sony going that route; they are already as fast as or faster than the Series X.
 
Yeah, I don't see why they would go multi-GPU for anything current gen; I can only see it as a backwards compatibility crutch.

It won't lower your power budget, and if beefing up the GPU in the APU isn't feasible because of die size, it would be more economical to just use a single dGPU anyway. And you don't want to do anything that significantly increases the burden on developers (beyond what is already par for all consoles of the current gen).

Sorry, SLI is dead
 