AMD Could Be Prepping a Flagship Next-Gen GPU Featuring 96 Compute Units with Memory on a 512-Bit Bus

Please note: Kepler suggests that the current WGP (2 CUs) could be merged into a single CU, effectively halving the CU count (a quick conversion into today's CU terms is sketched after the table below).


It looks like the RDNA 5 lineup may indeed feature four configurations, according to Chiphell forum member ZhangZhonghao, who has been very accurate with his past leaks. He alleges that the top-most die would be very large, followed by a mid-tier, a small-tier, and a tiny-tier SKU.

Kepler has now posted new information at the AnandTech Forums: essentially, block diagrams of at least four RDNA 5 / UDNA SKUs.

Potential AMD RDNA 5 / UDNA GPU Configurations (via Kepler_L2):

GPU Die              Navi 5X          Navi 5X         Navi 5X         Navi 5X
Positioning          Flagship-Tier    Mid-Tier        Low-Tier        Entry-Tier
Max Compute Units    96 CUs           40 CUs          24 CUs          12 CUs
Max Memory Bus       512-384 bit      384-192 bit     256-128 bit     128-64 bit
Max VRAM Capacity    24-32 GB         12-24 GB        8-16 GB         8-16 GB


https://wccftech.com/possible-amd-rdna-5-udna-gpu-sku-configs-point-to-96-40-24-12-cu-dies/
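To put Kepler's WGP-merge note in perspective, here is a back-of-the-envelope conversion of the leaked counts into today's CU terms. It simply takes the note at face value (one merged CU = one current WGP = two current CUs); it's an illustration, not anything from the leak itself.

```python
# Back-of-the-envelope: if one next-gen CU = one current WGP = two current CUs
# (taking Kepler's note at face value), the leaked counts map to these
# legacy-equivalent figures.
CUS_PER_WGP = 2  # an RDNA WGP contains 2 CUs

leaked_cus = {"Flagship": 96, "Mid": 40, "Low": 24, "Entry": 12}

for tier, merged_cus in leaked_cus.items():
    legacy_equivalent = merged_cus * CUS_PER_WGP
    print(f"{tier:8s}: {merged_cus:3d} merged CUs ~= {legacy_equivalent:3d} CUs in current terms")
```

For comparison, Navi 31 (RX 7900 XTX) tops out at 96 CUs / 48 WGPs in the current counting, so a 96-CU top die under the merged scheme would carry roughly twice that amount of shader hardware, if the note holds.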
 
I don't doubt AMD is probably working on something like this. What I doubt is that it will perform as well as anything NVIDIA currently has or will have at the time of release. AMD has been well behind NVIDIA for several product generations now.
 
Ehhhh... I mean, at the tiers they have chosen to compete at, they've been great. But they haven't competed at anything approaching the high end last gen at all.
 
At the risk of repeating previous escapades, being behind in RT and upscaling is truly 'behind' for AMD. Outside of a hard budget, most folks would spend another 10% just to get those features locked in.

Now there's nothing stopping AMD from supercharging their RT hardware, but getting FSR4 widely adopted is on the political side where they've regularly faltered. And they're still not 'there yet' in terms of creativity support, which may be much of the same thing.
 
Call me when they release something that causes Nvidia to lose market share.
 
Am I the only one who still wants better raster performance, no RT, and no frame generation?
We're rapidly moving towards a raytracing-only future, and honestly I think engines are going to be a lot better for it.

If you dig into the Unreal source code, the overhead for supporting so many different permutations of features is huge: non-Lumen vs. Lumen, HWRT vs. SWRT options for both, old-fashioned shadow maps vs. virtual shadow maps vs. Megalights, etc. Hell, they still support baked lighting (and added baked lighting support to World Partition not too long ago, I think 5.4? 5.5?). There's a mountain of cruft, and something needs to give.

As of UE 5.6 they're turning on hardware raytracing by default in new projects, and recommending its use on console (even targeting the base PS5/Xbox). I fully expect that Unreal 6 will be HWRT-only, and that they'll delete support for a bunch of their legacy rendering modes. That way they can clean up their code, let their engineers focus on making the RT code paths better, and simplify things dramatically.
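For reference, this is roughly what opting into hardware ray tracing looks like in a current UE5 project's Config/DefaultEngine.ini. It's a hand-written sketch using the commonly documented console variables, not the actual defaults UE 5.6 writes for a new project (which may differ):

```ini
; Sketch: ray-tracing-related settings in a UE5 project's Config/DefaultEngine.ini.
; Commonly documented cvars; the exact defaults written by UE 5.6 may differ.
[/Script/Engine.RendererSettings]
; "Support Hardware Ray Tracing"
r.RayTracing=True
; Skin cache must be enabled for skinned meshes to be ray traced
r.SkinCache.CompileShaders=True
; Dynamic GI and reflections set to Lumen (1 = Lumen)
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; Let Lumen use hardware ray tracing when the GPU supports it
r.Lumen.HardwareRayTracing=True
```

The 5.6 change basically means new projects start from something like this out of the box, instead of the legacy shadow map / software-RT paths.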
 
Am I the only one who still wants better raster performance, no RT, and no frame generation?
Probably!

RT replaces the massive accumulation of hacks that raster requires to do lighting, and it does the job essentially correctly rather than through that pile of approximations.

Frame generation has its uses, and while not free, it is very cheap versus the end result.

The one you didn't mention, upscaling (DLSS/FSR/XeSS), is huge. The gain is tremendous in terms of performance, but also in terms of displacing AA techniques like TAA that degrade image quality.
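As a rough illustration of where that performance gain comes from, here's the pixel math for rendering below 4K and upscaling to a 4K output. The per-axis scale factors are the commonly cited Quality/Balanced/Performance presets and are assumptions for illustration; exact values vary by vendor and version.

```python
# Rough illustration of the upscaling win: pixels actually shaded per frame
# when rendering internally below 4K and upscaling to 4K output.
# The per-axis scale factors are illustrative; they vary by vendor and version.
target_w, target_h = 3840, 2160
native_pixels = target_w * target_h

presets = {"Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}

print(f"Native 4K  : {native_pixels / 1e6:.1f} MPix shaded per frame")
for name, scale in presets.items():
    w, h = round(target_w * scale), round(target_h * scale)
    shaded = w * h
    print(f"{name:11s}: {w}x{h} -> {shaded / 1e6:.1f} MPix "
          f"({native_pixels / shaded:.1f}x fewer pixels shaded)")
```

That shading reduction is the raw gain; the temporal accumulation step is also what lets these upscalers stand in for TAA, as noted above.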
 
They may not want to (get involved in a market share war with the Goliath).

Yeah, Intel was once a Goliath in terms of CPU market share, too. (Well, technically Intel is still a Goliath, just not as big as it was ~7 years ago.)
 
I mean this, but let's be really honest here. AMD took a gamble and went in a new direction for CPU design and scaling, and it worked for them. They did it on their own. Now Intel is trying to play catch-up. Everyone is bemoaning Intel, but they are still a powerhouse in the datacenter and in the consumer space, to be honest. They will do just fine long term unless they make some tragically bad decisions.
 
Am I the only one who still wants better raster performance, no RT, and no frame generation?
I agree with you. Most of the games I play are older and don’t support any of this anyway

I won’t deny frame generation can give you better frame rates - but I don’t use it when it’s an option as the tradeoff in image quality (typically in the guise of random weird glitches or smearing) usually isn’t worth it.

As far as RT being better for lighting: sure. But that isn't exactly earth-shattering to me, and it tends to be something I can barely notice in still screenshots, let alone while I'm playing.

Maybe I’m just yelling get off my lawn. But I’m ok with that. RT performance and Frame Gen features are not something I consider when looking to make a purchase.
 
Another thing to consider is that we've more or less hit 'peak raster'. Screaming MORE DOTS at the display doesn't really do anything anymore, right?

I mean, there will be improvements. But going to RT means lowering the shader load at the same time for newer games.

And we're not really pushing resolutions and refresh rates on the monitor side that hard for the average user. 4K has been the 'high end' target for a decade; the only real reason to go higher is pixel density on larger displays, and that's mostly needed because of awkward sub-pixel configurations on the likes of OLED panels (or if you're special, like Apple).
 
I mean this, but let's be really honest here. AMD took a gamble and went in a new direction for CPU design and scaling, and it worked for them. They did it on their own. Now Intel is trying to play catch-up. Everyone is bemoaning Intel, but they are still a powerhouse in the datacenter and in the consumer space, to be honest. They will do just fine long term unless they make some tragically bad decisions.

Yes, you are right. That's why I put what I did in parentheses.

However, that gamble, for all intents and purposes, is how industries grow and innovate.

AMD couldn't beat Intel at their own game/plan, so they went in a different direction, so to speak.
They'll have to do the same with Nvidia.
 
Kind of makes me wonder

Yeah, AMD rolled the dice with chiplets and won.

But that by itself isn't what put AMD where they are now. It ultimately took Intel getting stuck and making several missteps for AMD to really start gaining traction.

If Intel had continued on their trajectory from the mid '00s and early '10s, then even with the best that Zen could do, I think Intel would have continued to dominate market share.

I guess what I’m saying is that it wasn’t so much that AMD had this breakthrough product as it was Intel falling down.

And I think it will be the same case in graphics. AMD doesn't need to hit a home run here with some crazy gamble. They just have to get relevant and stay decent, hope that people notice that nVidia has pivoted away from the gaming market almost entirely... and wait for the stumble.

You won't pull nVidia fanboys away with one crappy nVidia generation. But give it four or five in a row where the competition looks compelling, and they will start to change color. That's more or less the same thing I saw with Intel: it took several successive generations of AMD being more competitive and compelling for the die-hards to finally admit they are willing to jump the fence.

As far as I'm concerned, every nVidia release since RTX has been overpriced and underwhelming, relying on vendor-locked (and often generation-locked) frame generation to be really competitive for its price tier. But I'll also admit that if you want high-end, PC-enthusiast-level hardware, no one else is even playing in that market right now except nVidia.
 