OptiScaler Update Brings AMD FSR 4 to Any DX 11/12 Title and Upcoming RDNA 5 GPU Is Rumored to Feature 96 CUs with 384-Bit VRAM Bus

Peter_Brosdahl
Moderator · Staff member
Joined: May 28, 2019 · Messages: 9,714 · Points: 113
The team behind OptiScaler has successfully integrated support for AMD FSR 4 in a new update, allowing it to be used in any DX11/12 game. Meanwhile, a new rumor claims that an upcoming AMD UDNA/RDNA 5-based GPU will feature a memory bus similar to that of NVIDIA's RTX 4090.
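For context on what a 384-bit bus buys you, here's a quick back-of-the-envelope bandwidth sketch in Python. The data rates below are illustrative examples (the RTX 4090's shipping GDDR6X speed and a faster hypothetical), not confirmed RDNA 5 specs:

```python
# Peak memory bandwidth = (bus width in bytes) * per-pin data rate.
# Data rates here are examples, not confirmed RDNA 5 specs.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4090: 384-bit GDDR6X at 21 Gbps -> 1008 GB/s
print(bandwidth_gb_s(384, 21.0))  # 1008.0
# Same 384-bit bus with hypothetical 28 Gbps memory -> 1344 GB/s
print(bandwidth_gb_s(384, 28.0))  # 1344.0
```

The takeaway: matching the 4090's bus width gets you into its bandwidth class even before any uplift from newer, faster memory on the same 384-bit interface.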

See full article...
 
This is of course speculation by Kepler, but he makes a fantastic point.

In the next gen, a console (Xbox?) could have two different chiplets (one for CPU and one for GPU) fused together.

What he is asking is: could AMD be re-using the console GCD chiplet itself as its top RDNA 5 card (Navi 51?)
 
I am of course curious how this will work... but I'm also not expecting a lot here. AMD needs to step up and show consumers they are a prime-time competitor and not the UHF holdout.
 
Well, if the RTX 5070 performs like an RTX 4090, clearly the Xbox custom APU could do it too, right?... right? :rolleyes: :rolleyes:
The Xbox could be revealed/launched next November.

By then we would have one year of the 5080 Super being on the market. If the Xbox comes close to that performance, then the MSRP of the Xbox could track very close to the street price of the 5080 / 5080 Super at that point in time (next year's end).
 
So, AMD couldn't make a 9000 series card to compete with the RTX 4090, but suddenly they can build an APU that can match it?
 
So, AMD couldn't make a 9000 series card to compete with the RTX 4090, but suddenly they can build an APU that can match it?
We're talking production-cycle and development timelines here. You think they made the 9070 chips and said, "You know what... job's done... everyone take a year or two off to chill"?

No, before the 9070 was ever out they were moving on and creating what was next. They clearly reached a determination about the capabilities of the current gen and went a different route again. But we're talking about something coming in 2026... most likely, the two efforts can learn from each other.

Now, do I think they will have 5080-equivalent graphics performance next year as part of an APU? Ehhhh, maybe. I'm just saying it isn't outside the realm of possibility.

Now, if MS and Sony said, "Make us an APU that hits X performance goal, and if you do it we will make you our prime CPU and GPU supplier for the next series of data centers we are building or upgrading for AI processing and cloud services"... then AMD would double the eff down and get it done. They would be foolish not to. Guaranteed sales would help alleviate the risk of a loss on development.

But nobody wants to do that... because the risk is very high.
 
If I'm not mistaken, so far there hasn't been a console that beats the prior gen's top-of-the-line GPU. Not saying it can't happen, but the odds are against it.

Even taking the current 9070 XT and "shrinking" it to fit an APU would be a monumental task.
 
If I'm not mistaken, so far there hasn't been a console that beats the prior gen's top-of-the-line GPU. Not saying it can't happen, but the odds are against it.

Even taking the current 9070 XT and "shrinking" it to fit an APU would be a monumental task.
It really depends on whether AMD has some... 'new' method. Time will tell.
 
If I'm not mistaken, so far there hasn't been a console that beats the prior gen's top-of-the-line GPU. Not saying it can't happen, but the odds are against it.

Even taking the current 9070 XT and "shrinking" it to fit an APU would be a monumental task.
At this point I'm assuming the biggest improvements will come from "fake" frames and "AI" crap, and not so much from real advancements in silicon.
 
So, AMD couldn't make a 9000 series card to compete with the RTX 4090, but suddenly they can build an APU that can match it?
I'm kinda in this camp, except for a different reason: how many 12VHPWR lines is the APU gonna require? No way that's happening in a console form factor, where people will complain about it needing external power bricks and sounding like a jet plane.

Physics is a bitch, and I don't think there will have been that many node advances between the 4090 and whenever this releases.
 
I honestly believe that AMD has had issues scaling GPU performance with the chiplet design. Physics, as Brian_B said, is holding them back. Traces between chips on a PCB can only move data so efficiently and so quickly, so they have been held back.

Going with a complete SoC, a la the Apple M-series systems, is clearly a solution here. Look at what they are doing with the little Ryzen AI Max (or whatever they're called) SoCs for the small-form-factor AI boxes.

I'd love to see someone try gaming on one of those! :) I don't think it will be great. But I DO think that will be the path forward for a new GPU solution and for consoles. (A GPU solution being a slice of that, in effect, but scaled way up.)

And yeah... I just kind of imagined a consumer GPU with 128 GB of 'VRAM'. lol... not gonna happen, but it popped into my head.
 
Consoles will never compete with modern high-end GPUs. They've always used software trickery to get them to run at certain resolutions and frame rates. You can't pack that much power into a small package without causing thermal or power issues.
 
Consoles will never compete with modern high-end GPUs. They've always used software trickery to get them to run at certain resolutions and frame rates. You can't pack that much power into a small package without causing thermal or power issues.
If you undervolt a 5080, how low can the power consumption go?
 
Don't know, never tried it. But performance will fall off a cliff as well.
Actually, depending on how far you go, the performance difference is minimal; that's why undervolting is becoming very popular. But even if you undervolt, power will still rise automatically depending on load. So power draw will still be high, just not that high.
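For intuition on why undervolting cuts power with so little performance cost: to first order, CMOS dynamic power scales linearly with clock frequency but with the square of voltage. A toy sketch of that relationship (the scale factors below are illustrative, not measured RTX 5080 figures):

```python
# First-order CMOS dynamic power model: P ~ C * f * V^2.
# Scale factors are illustrative, not measured RTX 5080 numbers.
def relative_power(freq_scale: float, volt_scale: float) -> float:
    """Dynamic power relative to stock, given frequency and voltage scaling."""
    return freq_scale * volt_scale ** 2

# Undervolt ~10% at stock clocks: ~19% lower dynamic power, same performance.
print(relative_power(1.00, 0.90))  # 0.81
# Also drop clocks 5%: ~23% lower power for roughly a 5% performance hit.
print(relative_power(0.95, 0.90))
```

The V-squared term is why a modest undervolt saves disproportionately more power than the frequency it costs, though real cards add static leakage and boost-behavior effects this simple model ignores.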
 
A more "realistic" expectation is taking, say, a mobile RTX 5000-series part. Sure, you could fit a mobile RTX 5090 in a console for desktop-level RTX 4090 performance, but it would be too d*mn expensive.
 