AMD Could Be Prepping a Flagship Next-Gen GPU Featuring 96 Compute Units with Memory on a 512-Bit Bus

Even if raster is strong, RT and feature parity matter at the high end.
AMD can solve RT at any time; it's a choice of resources and product planning (and subsequent product positioning).

But the software issues will persist. I'm almost hoping we see a unified API for upscaling; there was an effort by Microsoft (DirectSR), but I haven't seen much lately.

The further problem is that once you move outside of gaming, you're in a crapshoot of support if you're not running Nvidia. One driver or app update and you're taking real risks of breaking your workflow, which is unacceptable at the professional level and frustrating for consumers (and I've dealt with it personally with AMD).

And I'm still surprised that AMD has trouble with Adobe apps off and on.
 
Yup.

Plenty of naysayers when the RTX 2000 series hit, too; yet here we are with consoles employing it, and phones are likely to base their rendering on RT within the next half decade.
I'm still a naysayer.

Full-blown 100% RT is still not real-time. What you're seeing now is some mud-puddle reflections and shiny chrome on guns pasted on top of a rasterized scene.

The only full RT-calculated games right now are tech demos and ports of ancient titles.
 
Grok summary of High Yield YT's comments:

In Broken Silicon podcast episode 351, @highyieldYT (guest Max) states that path tracing is currently only worth it in four specific games, while in others it doesn't add much visual value.

These games are discussed around the 1:22:00 to 1:29:00 mark (under "RDNA 5 Performance," "PS6 Ray Tracing," and "PSSR 2 & RE9" timestamps), where path tracing enhances realism noticeably:
  • Cyberpunk 2077: Full path tracing mode is a standout, turning up the "realistic knob" for immersive lighting.
  • Alan Wake 2: Leverages path tracing artistically for exceptional light effects.
  • Indiana Jones and the Great Circle: Highlighted for its path tracing implementation.
  • Resident Evil Requiem (RE9): Noted for path tracing support, with FSR4 discussions tying into its performance.
He says: "aside from that um all the other games that have path [tracing] in my opinion are not that great right now or like path [tracing] doesn't add much."

This comes from the YouTube video linked in the original X post, focusing on PS6 ray tracing and RDNA 5 improvements that could make path tracing more viable on future hardware.

 
I'm still a naysayer.

Full-blown 100% RT is still not real-time. What you're seeing now is some mud-puddle reflections and shiny chrome on guns pasted on top of a rasterized scene.

The only full RT-calculated games right now are tech demos and ports of ancient titles.

They are still using the hybrid approach, but there are games that support path tracing, like Cyberpunk 2077, Alan Wake 2, and Black Myth: Wukong, and older games like Quake II RTX and Portal with RTX use full path tracing, so we are getting there.

Take a look at the background scenes in shows like The Mandalorian, or the Sphere in Las Vegas: they are fully path traced in real time using hundreds of professional Nvidia cards at insane resolutions. It's only a matter of time before we see it on consumer cards.
 
You are assuming we will still have consumer cards.

Yes, there will be. If Nvidia were to forfeit the gaming market, it would also be bailing from plenty of other markets where it's the dominant player.

The worst that can happen is someone else takes its place.
 
I'm really having doubts about high-end cards moving forward due to prices and power needs. Unless they can bring both down, it is getting to the point where most of us won't be able to afford or operate one.

However, on the budget side of things, APUs and iGPUs are getting within arm's reach of discrete cards, so budget cards are almost pointless. I think the middle tier will be here for a while, but if DRAM prices force them to idiotic levels it won't matter, and that seems to be the main reason the SUPERs and Intel's Big Battlemage are nowhere to be seen right now.
 
I'm really having doubts about high-end cards moving forward due to prices and power needs. Unless they can bring both down, it is getting to the point where most of us won't be able to afford or operate one.

However, on the budget side of things, APUs and iGPUs are getting within arm's reach of discrete cards, so budget cards are almost pointless. I think the middle tier will be here for a while, but if DRAM prices force them to idiotic levels it won't matter, and that seems to be the main reason the SUPERs and Intel's Big Battlemage are nowhere to be seen right now.
I don't have faith in Intel's ability to compete at a higher level in the dGPU market, not when Koduri had his hands in it. Everything that guy touched turned to hot dookie. I don't know who's running the GPU show for Intel now, but they haven't really done much over the past year and a half.
 
Whether Intel pulls their collective heads out in terms of innovation is directly related to whether they can get their fabs running profitably.

They're subsidized by the USG now, so that's actually more in question than it usually is.

Otherwise, they would just brute-force their way into decent market share.
 