AMD Could Be Prepping a Flagship Next-Gen GPU Featuring 96 Compute Units with Memory on a 512-Bit Bus

I think it would make more sense for AMD to leave the PC gaming market. They already own the console market and will continue to dominate it for the foreseeable future. Contrary to what fanboys may think, AMD is NOT doing well in the PC gaming market; it had something like 20% market share in 2025, it costs them a lot of money to stay there, and it seems no matter what they do, they always come second.
 
I think it would make more sense for AMD to leave the PC gaming market. They already own the console market and will continue to dominate it for the foreseeable future. Contrary to what fanboys may think, AMD is NOT doing well in the PC gaming market; it had something like 20% market share in 2025, it costs them a lot of money to stay there, and it seems no matter what they do, they always come second.
I disagree.

They're still ahead of Intel, and they still have a tremendous amount of brand capital among gamers as well as developers. Granted they could use more, but walking away means throwing that investment away, in a market that will always be growing.
 
in a market that will always be growing.
Is it, though?

All the growth in gaming seems not to be in areas where you need high-performance PCs.

Just going off a hunch - I would bet it's still in mobile F2P titles with whales, and titles intended to run on a potato and sell funny hats, like Fortnite and TF2.

You don’t need discrete GPUs for any of that.
 
I disagree.

They're still ahead of Intel, and they still have a tremendous amount of brand capital among gamers as well as developers. Granted they could use more, but walking away means throwing that investment away, in a market that will always be growing.
You are talking about Nvidia, right? Seems more fitting.
 
I think it would make more sense for AMD to leave the PC gaming market. They already own the console market and will continue to dominate it for the foreseeable future. Contrary to what fanboys may think, AMD is NOT doing well in the PC gaming market; it had something like 20% market share in 2025, it costs them a lot of money to stay there, and it seems no matter what they do, they always come second.
Overall volumes should stay high for RDNA 5 (once you count the 'Xbox' chips sold by Microsoft & its board partners).

RDNA 4 is the last standalone graphics gen by AMD.

From RDNA 5 onwards it's the scraps of console leftovers.
 
Microsoft has contracted AMD for an xCloud gaming hardware chip which can virtualize and run 8 Series S2 instances at once.
Series S2 = PS5 Pro
Series X2 = triple Series S2

AMD is expected to cut that chip down by 25% for the top xx90 card,

so basically a 10090 XT should be equal to 6x 9060 XT or 6x PS5 Pro.
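Taking those rumoured figures at face value, the maths is simple enough; here is a quick back-of-the-envelope sketch (the 8-instance count, the 25% cut, and the PS5 Pro equivalence are all speculation from the post above, not confirmed specs):

```python
# Back-of-envelope check of the rumoured xCloud chip maths.
# All inputs are speculation, not confirmed specifications.

cloud_instances = 8        # rumoured Series S2 instances per xCloud chip
cut_for_consumer = 0.25    # rumoured 25% trim for the top "xx90" gaming card

consumer_equiv = cloud_instances * (1 - cut_for_consumer)
print(f"Consumer card ~= {consumer_equiv:g}x Series S2 / PS5 Pro class GPUs")
# -> Consumer card ~= 6x Series S2 / PS5 Pro class GPUs
```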

So they're saving the best chips for cloud gaming. Great. Granted, they're meant for multiple instances, so that might be overkill for consumers (especially when limited by power budget). But given AMD's history with halo products (or lack thereof), what the market needs from them is a little bit of overkill.

Yeah, just like 1080p was never going to be a market... :rolleyes: ;)

4K is quickly becoming the new standard, if only because of consoles. There are lots of 4K-gaming-capable TVs with high refresh rates that you can use for PC gaming. I wouldn't touch a 1440p monitor if my life depended on it.

You are completely missing every point I made. RT without frame generation and upscaling slows everything to a crawl. There's no way 4K is a standard with RT minus framegen/upscaling.

You are not getting a 144 fps minimum in anything at 4K, even without RT, on a pseudo-modern raster engine (UE5). It's not happening, not even with a 5090. In many games a 5090 will struggle to achieve those framerates even at 1440p! Just because you have a 4K 144 Hz TV doesn't mean it'll make your games run at that speed.
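Rough numbers make the point (a naive sketch that assumes frame rate scales inversely with pixel count, which real engines only approximate):

```python
# Rough pixel-count comparison between 1440p and 4K.
# Assumes frame rate scales roughly inversely with pixels rendered,
# which is only a crude approximation of real engine behaviour.

pixels_1440p = 2560 * 1440   # 3,686,400 pixels
pixels_4k    = 3840 * 2160   # 8,294,400 pixels

ratio = pixels_4k / pixels_1440p
print(f"4K pushes {ratio:.2f}x the pixels of 1440p")       # 2.25x
print(f"100 fps at 1440p ~= {100 / ratio:.0f} fps at 4K")  # ~44 fps
```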

"Lightweight" RT is absolutely possible up and down the spectrum. That is evidenced by console games using RT.

See above; RT is only possible with cheap hacks that harm image quality and/or user experience in other ways.
 
See above; RT is only possible with cheap hacks that harm image quality and/or user experience in other ways.
Then why exactly are we pushing RT so heavily?

I mean, if I'm honest, apart from those ridiculous reflections on mud puddles in the Cyberpunk trailers... I'm not really seeing the vast image quality benefit over more traditional light maps. It may be easier from a developer point of view, as they don't have to bake light maps by hand... but that's a Developer issue, not a Me issue.
 
I do hope they produce a proper halo product; halo products help, always have, and it's dumb to ignore them. That said... Idk, it kind of has to be a winner, which ain't easy.
 
I do hope they produce a proper halo product; halo products help, always have, and it's dumb to ignore them. That said... Idk, it kind of has to be a winner, which ain't easy.
I kind of agree... I've had the 7900 XTX for a while now, and it's feeling long in the tooth, which is why I was looking at 5080s during the latest round of price creep. I'd like to see another proper HALO card from AMD, but I have my doubts they will commit.
 
I'd like to see another proper HALO card from AMD, but I have my doubts they will commit.
You do have the choice of getting their improper R9700 32GB card :)

Should be a nice improvement over your XTX and then sell that when the prices go nuts to recoup some of the cost back.
 
You do have the choice of getting their improper R9700 32GB card :)

Should be a nice improvement over your XTX and then sell that when the prices go nuts to recoup some of the cost back.
That's a downright steal if you're into AI models... and your application is amenable to AMD, of course.
 
Then why exactly are we pushing RT so heavily?
It was a feature-differentiator for NV on their 2000-series cards. Plus rendering nerds have been wanting RT for ages (for whatever reason). Developers like it because no more lightmaps (see: games with RT-only rendering, such as Doom Dark Ages).
 
Based on the specifications, the RDNA 5 series is likely to encompass a full range:
  1. a flagship with 96 CUs,
  2. a mid-range model with 40 CUs,
  3. a mainstream one with 24 CUs, and
  4. an entry version with 12 CUs.
Bus widths decrease alongside memory capacities, creating a typical tiered approach.
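To put those CU counts in perspective, here is a minimal sketch of how the rumoured tiers would scale relative to one another (assuming 64 stream processors per CU, as in previous RDNA generations; clock speeds, memory bandwidth, and actual SKU names are unknown):

```python
# Relative scale of the rumoured RDNA 5 tiers by compute-unit count.
# Assumes 64 stream processors per CU, as in previous RDNA generations;
# clocks, bandwidth, and real SKU names are unknown at this point.

tiers = {"flagship": 96, "mid-range": 40, "mainstream": 24, "entry": 12}

for name, cus in tiers.items():
    shaders = cus * 64
    share = cus / tiers["flagship"]
    print(f"{name:>10}: {cus:>3} CUs, {shaders:>5} shaders, {share:.0%} of flagship")
```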


The pivotal question remains whether the flagship tier chip will actually be released in a gaming configuration.

Should the AT0 debut as a gaming card, its significance would primarily lie in testing the architectural limits, not in direct sales. The core's ability to sustain an acceptable power density, frequency range, and cooling requirements will determine whether it stands as a noteworthy symbol or an integral part of a sustainable product lineup.

The pricing of AT0, if released as a gaming chip, will be crucial. A previously mentioned price of over $2000 is not consistent with AMD's recent pricing strategies. Venturing into this range directly challenges the RTX 5090 tier, as well as involves higher wafer costs and reduced shipping flexibility. Large chips are more susceptible to yield variations in advanced manufacturing processes, where minor defects can significantly affect the number of sellable units.
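That yield sensitivity can be illustrated with the classic Poisson die-yield approximation (the die areas and defect density below are illustrative assumptions, not figures for any actual AMD chip or node):

```python
import math

# Poisson die-yield approximation: yield = exp(-defect_density * die_area).
# Die areas and defect density are illustrative assumptions only,
# not figures for any actual AMD chip or process node.

defect_density = 0.1  # defects per cm^2 (assumed)

for label, area_cm2 in [("mid-size die", 2.5), ("large flagship die", 5.0)]:
    y = math.exp(-defect_density * area_cm2)
    print(f"{label}: ~{y:.0%} of candidate dies defect-free")
# mid-size die: ~78%, large flagship die: ~61%
```

Doubling the die area does not just double the silicon cost; it also compounds the share of dies lost to defects, which is why chips this large usually need a professional or datacenter market to absorb them.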

The chip's design echoes that of the Radeon VII, the Vega 7nm chip initially aimed at the HPC market but later adapted as a gaming card released in limited numbers. Typically, large chips enter the consumer sector not because of gaming demand but due to pre-existing wafer inventory, yield windows, or product timing needing a channel. If AT0 follows a similar trajectory, its justification is more rooted in the chip's development course rather than sheer market demand.


There should definitely be a professional version of this chip, as explicit demand is already there.
 
It was a feature-differentiator for NV on their 2000-series cards. Plus rendering nerds have been wanting RT for ages (for whatever reason). Developers like it because no more lightmaps (see: games with RT-only rendering, such as Doom Dark Ages).
Actually, it was Intel who tried bringing ray tracing to gaming first, with Larrabee.
 
Never heard that before, but functionally that never happened since Larrabee never made it to market as a dGPU.
IIRC both Nvidia and Intel were working on ray tracing cards, and Intel's Larrabee was first, but it never made it to market. I recall there were a couple of RT demos on it. Nvidia also had an RT demo that ran on Pascal, but at seconds per frame instead of frames per second :rolleyes: :rolleyes: IIRC it was the Star Wars demo running on 4 Titan Pascal cards.
 
Still amazes me how far ray tracing has come. I remember in the late '80s / early '90s, ray tracing one image would take a week or more depending on how detailed the image was and the power of the computer doing it. And that was like 800x600 resolution.
 
Still amazes me how far ray tracing has come. I remember in the late '80s / early '90s, ray tracing one image would take a week or more depending on how detailed the image was and the power of the computer doing it. And that was like 800x600 resolution.
Yup.

Plenty of naysayers when the RTX 2000-series hit too, yet here we are with consoles employing it, and phones are likely to base their rendering on RT within the next half decade.
 
Still amazes me how far ray tracing has come. I remember in the late '80s / early '90s, ray tracing one image would take a week or more depending on how detailed the image was and the power of the computer doing it. And that was like 800x600 resolution.

I recall doing ray tracing on the Amiga; even the simplest scene would take days to render at 320x200, and don't even get me started on 640x480 ultra-high res... :LOL: :LOL:
 