AMD RDNA 4 GPUs Are Rumored to Feature a Reworked Ray-Tracing Hardware Solution

Peter_Brosdahl

Moderator
Staff member
Joined
May 28, 2019
Messages
8,924
Points
113
A new rumor suggests that AMD RDNA 4 GPUs will introduce a different ray-tracing solution than the one used in RDNA 2 and 3. Multiple media outlets have interpreted a couple of tweets from well-known hardware leaker Kepler_L2 as meaning that AMD has reworked its approach to ray tracing for its upcoming RDNA 4-based GPUs. While this is still a rumor, it does seem plausible given AMD's continued weaker ray-tracing performance compared to its competitors.

See full article...
 
I've seen these rumors about - they're still very much just rumors, but it's clear that AMD needs to take a sincere swipe at a real RT performance improvement if they're to stand a chance at competing with Nvidia going forward.

I'm betting that the improvement was at least partly (if not almost wholly) driven by console manufacturers' desire to have more comprehensive RT capability. RT along with HDR can produce absolutely stunning results!
 
Nvidia is expected to incorporate AI even more into RTX to further improve IQ/performance, so anything AMD can do to improve RT performance is more than welcome, but I really don't see them matching, much less overtaking, Nvidia. But hey, stranger things have happened.
 
I've seen these rumors about - they're still very much just rumors, but it's clear that AMD needs to take a sincere swipe at a real RT performance improvement if they're to stand a chance at competing with Nvidia going forward
Don't forget Intel. While their GPUs are still somewhat lacking, their RT performance is pretty decent.
 
Don't forget Intel. While their GPUs are still somewhat lacking, their RT performance is pretty decent.
If Intel manages to get its sh*t together, it could be a real contender. Both its AI and RT engines are quite capable; it's the overall performance that's lacking. Incidentally, Intel could very well fight AMD if neither can face Nvidia at the high end.
 
Don't forget Intel. While their GPUs are still somewhat lacking, their RT performance is pretty decent.
I think next-gen hardware will be a tough race across all three except at the very highest end, unless Intel wants to push that far (and they might).

Intel's weakness will remain pre-DX12 titles and anything too obscure or obtuse to be prioritized for optimization.

I will say one thing about their GPUs though - having working HDMI CEC makes them desirable for use with a TV over AMD and Nvidia. Of course, the TV folks could fix that with a true 'monitor mode' if they cared. Ideally, we'd get both on both ends.
 
I've seen these rumors about - they're still very much just rumors

Agreed. Fans always read too much into rumors. (Never was this worse than pre-Bulldozer launch in 2010-2011.) We will know the truth when the product launches and we get independent testing.


but it's clear that AMD needs to take a sincere swipe at a real RT performance improvement if they're to stand a chance at competing with Nvidia going forward.

Totally agree. And if true, this is good news.

They needed to do this.

I'm betting that the improvement was at least partly (if not almost wholly) driven by console manufacturers' desire to have more comprehensive RT capability. RT along with HDR can produce absolutely stunning results!

I sort of agree, but not 100% with this part.

I still tend to believe we didn't need RT, and that well-made games with good shadow maps and other tricks to improve raster rendering looked great. Almost as good as current RT.

Note how I used past tense.

I have noticed that since RT has become an option, game devs have been using it for higher graphics settings, and when you turn it off, raster-only render modes - at least from a lighting and reflection perspective - often look worse than they used to before the RTX craze.

I suspect it is easier for devs to do it in RT than to spend time with shadow maps and other hacks to make it look right in raster.

So in the end, most of RT is about developer convenience, not about improving things for the end user. We get to eat more computationally intense render paths so that devs don't have to spend time with raster workarounds.

All of that is neither here nor there though. RT is here now, and it is here to stay, and AMD has been behind on it ever since Nvidia's 20-series launched. The 7000 series is much better than the 6000 series was (my 6900 XT went from playable to slide show when I enabled RT in Cyberpunk, for instance), but they still have a long way to go.

If this rumor is accurate, I hope they succeed and close the gap some. It is so annoying that Nvidia always tries to pull another trick out of their hat to carve out a niche that only they occupy, and then push the entire industry to adopt it so they become the only good option, rather than competing on performance. It distorts the market.

If this rumor is true, I hope it closes the gap, making AMD products relevant again for those of us who crave high IQ and high resolution.
 
I suspect it is easier for devs to do it in RT than to spend time with shadow maps and other hacks to make it look right in raster. So in the end, most of RT is about developer convenience, not about improving things for the end user.


It IS easier to do raytracing than using every single magic trick with shaders. With Path Tracing you get pretty much every light related effect for "free" like global illumination, ambient occlusion, reflections, ray traced shadows, and a long etcetera. The problem is performance. But we are getting there.

BTW I'm trying RTX Remix, and even in its current state it is so much easier to use and more powerful than ReShade ever was.
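
To make the "for free" part concrete, here is a minimal toy sketch (Python, with a made-up two-sphere scene, light position, and power values; all the names here are mine for illustration, not anyone's engine code) of the core path-tracing loop. A single radiance() function traces rays, and shadows plus bounced global illumination both fall out of the same "what does this ray hit?" query instead of needing separate hand-tuned systems.

```python
# Toy path tracer sketch (assumed scene and values): the point is that shadows
# and global illumination both come out of the same trace step.
import math, random

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def add(a, b): return [a[i] + b[i] for i in range(3)]
def mul(a, s): return [a[i] * s for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def norm(a):
    l = math.sqrt(dot(a, a))
    return [x / l for x in a]

SPHERES = [  # (center, radius, albedo) -- made-up toy scene
    ([0.0, -100.5, -1.0], 100.0, [0.8, 0.8, 0.8]),  # huge sphere acting as the floor
    ([0.0, 0.0, -1.0], 0.5, [0.7, 0.2, 0.2]),       # red ball
]
LIGHT_POS, LIGHT_POWER = [2.0, 2.0, 0.0], 20.0       # assumed point light

def hit(origin, direction):
    """Closest sphere hit along a ray: returns (t, point, normal, albedo) or None."""
    best = None
    for center, radius, albedo in SPHERES:
        oc = sub(origin, center)
        b = dot(oc, direction)
        disc = b * b - (dot(oc, oc) - radius * radius)
        if disc < 0:
            continue
        t = -b - math.sqrt(disc)
        if t > 1e-4 and (best is None or t < best[0]):
            p = add(origin, mul(direction, t))
            best = (t, p, norm(sub(p, center)), albedo)
    return best

def radiance(origin, direction, depth=0):
    """One path-tracing step: direct light via a shadow ray plus one random GI bounce."""
    h = hit(origin, direction)
    if h is None:
        return [0.05, 0.05, 0.08]  # dim sky
    _, p, n, albedo = h
    # Shadows "for free": just ask whether anything blocks the path to the light.
    to_light = sub(LIGHT_POS, p)
    dist2 = dot(to_light, to_light)
    wi = norm(to_light)
    blocker = hit(p, wi)
    visible = 0.0 if (blocker and blocker[0] ** 2 < dist2) else 1.0
    direct = mul(albedo, visible * max(dot(n, wi), 0.0) * LIGHT_POWER / (math.pi * dist2))
    color = direct
    # Global illumination "for free": bounce once in a random hemisphere direction.
    if depth < 3:
        d = norm([random.gauss(0.0, 1.0) for _ in range(3)])
        if dot(d, n) < 0.0:
            d = mul(d, -1.0)
        bounced = radiance(p, d, depth + 1)
        cos_w = max(dot(n, d), 0.0)
        color = [direct[i] + 2.0 * albedo[i] * cos_w * bounced[i] for i in range(3)]
    return color

if __name__ == "__main__":
    # Average a handful of paths for one camera ray aimed at the red ball.
    samples = [radiance([0.0, 0.0, 1.0], [0.0, 0.0, -1.0]) for _ in range(64)]
    print([round(sum(s[i] for s in samples) / 64, 3) for i in range(3)])
```

Everything a real renderer adds on top (many samples per pixel, importance sampling, denoising) is about making that loop fast enough, which is exactly the performance problem mentioned above.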
 
But hey, stranger things have happened.
Not really.

Very rarely has AMD been able to match NVIDIA, much less surpass it. The one time it did so solidly, it had purchased another company to get the technology, and it really hasn't done so since.
 
Not really.

Very rarely has AMD been able to match NVIDIA, much less surpass it. The one time it did so solidly, it had purchased another company to get the technology, and it really hasn't done so since.

If memory serves me well, the Radeon 8500 put the first dent in Nvidia's GeForce3 (still loved my Ti200). The GeForce4 regained the crown but was obliterated by the Radeon 9700 (still loved my Ti4200, but not as much), and then the 9800 became the undisputed king; not even the 5900 (incidentally one of my favorites), much less the fiasco that was the 5800, could match it. It took the 6800 to finally set things straight. After that they traded blows for a few generations. It wasn't until the 8800 GTX that Nvidia clearly took the top spot, and it hasn't left it since; there have been a couple of bumps here and there, but Nvidia still reigns supreme.
 
If memory serves me well, the Radeon 8500 put the first dent in Nvidia's GeForce3 (still loved my Ti200). The GeForce4 regained the crown but was obliterated by the Radeon 9700 (still loved my Ti4200, but not as much), and then the 9800 became the undisputed king; not even the 5900 (incidentally one of my favorites), much less the fiasco that was the 5800, could match it. It took the 6800 to finally set things straight. After that they traded blows for a few generations. It wasn't until the 8800 GTX that Nvidia clearly took the top spot, and it hasn't left it since; there have been a couple of bumps here and there, but Nvidia still reigns supreme.
The 5800 and the 5900 were the same basic fiasco...

Didn't help that ATi had the reference implementation of DX9, of course. But as they were dying and being sold to AMD, Nvidia took over with DX10, and well, AMD has been catching up one way or another (raw performance, feature support, drivers...) ever since.
 
The 5800 and the 5900 were the same basic fiasco...

Didn't help that ATi had the reference implementation of DX9, of course. But as they were dying and being sold to AMD, Nvidia took over with DX10, and well, AMD has been catching up one way or another (raw performance, feature support, drivers...) ever since.

I'm sure everyone will agree that the 9800 was better than the FX5900, but the 5900 was actually a resounding success; it was a top seller for Nvidia.

I had mine flashed to an FX 5950 Ultra and it was great.
 
I'm sure everyone will agree that the 9800 was better than the FX5900, but the 5900 was actually a resounding success; it was a top seller for Nvidia.

I had mine flashed to an FX 5950 Ultra and it was great.
I'd compare it to the preceding and following generations to determine success, though I admit that I don't have the data to know one way or another.

Just that the stink of the FX5800 followed the generation through ;)
 
Not really.

Very rarely has AMD been able to match NVIDIA, much less surpass it. The one time it did so solidly, it had purchased another company to get the technology, and it really hasn't done so since.

It's kind of weird that AMD (ATI) bought ArtX and that technology became the basis for their next-generation cards from then on, while Nvidia bought 3dfx but incorporated pretty much nothing from it into later products.
 
I suspect it is easier for devs to do it in RT than to spend time with shadow maps and other hacks to make it look right in raster. So in the end, most of RT is about developer convenience, not about improving things for the end user.
For one example, see the following video, from 22:57 to 27:40:

With Path Tracing you get pretty much every light related effect for "free" like global illumination, ambient occlusion, reflections, ray traced shadows, and a long etcetera.
Yeah, all those things (and other stuff like light refraction and distortion through transparent objects) just happen naturally because of the realistic light physics simulation, and this produces far more accurate results than faking them, as has traditionally been done with rasterization. There are also other benefits, for example, light emanating from a light source actually matches the shape of the light source. With the traditional setup, a tube light bulb would still only have a point light applied to the center of it, and this light radiates out in a sphere from that center point. But with ray tracing, the entire length of the light bulb is the light source, and the light radiates out from that entire surface.
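
As a rough back-of-the-envelope illustration of that tube-light point (purely hypothetical positions, power, and units, with occlusion and surface orientation ignored), compare treating the tube as a single point at its center with spreading the same emitted power along its length, which is roughly what tracing rays against the real emitter geometry ends up doing:

```python
# Hypothetical numbers: a tube light treated as one point at its center
# vs. the same power spread over sample points along its length.
import math

def point_light(light_pos, p, power):
    # inverse-square falloff from a single emitting point
    d2 = sum((light_pos[i] - p[i]) ** 2 for i in range(3))
    return power / (4.0 * math.pi * d2)

def sampled_tube_light(a, b, p, power, samples=32):
    # spread the same power over points along the segment a -> b
    total = 0.0
    for k in range(samples):
        t = (k + 0.5) / samples
        s = [a[i] + t * (b[i] - a[i]) for i in range(3)]
        total += point_light(s, p, power / samples)
    return total

# a 2 m tube on the ceiling, and a shading point just below one of its ends
tube_a, tube_b = [-1.0, 2.0, 0.0], [1.0, 2.0, 0.0]
shading_point = [1.0, 1.5, 0.0]
print("point light at tube center:", round(point_light([0.0, 2.0, 0.0], shading_point, 100.0), 2))
print("light sampled along tube:  ", round(sampled_tube_light(tube_a, tube_b, shading_point, 100.0), 2))
```

For a point sitting just under one end of the tube, the sampled version comes out noticeably brighter than the center-point approximation, because the nearby end of the tube dominates; with ray tracing against the actual light geometry that behavior simply falls out.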

One thing I've been pretty happy about with RT is the lack of screen-space reflections. It was always kinda annoying and distracting how with SSR only objects that are shown on-screen would appear in reflections, and as the camera moves and those objects are no longer on the screen, their reflections disappear. But with RT, everything that exists in the environment is shown in reflective surfaces, regardless of whether they are visible on-screen. Just another thing that improves the immersion factor.
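
Here is a tiny sketch of why that happens (made-up UV values, and it skips the depth-buffer comparison a real SSR pass would do at every step): a screen-space reflection can only march through pixels that exist in the current frame, so once the march leaves the 0..1 screen range there is nothing left to sample and the reflection drops out.

```python
# Toy screen-space reflection march (made-up numbers): the ray is stepped in 2D
# screen coordinates, and the moment it leaves [0, 1] there is no pixel data
# left to sample -- which is exactly when an off-screen object's reflection vanishes.
def ssr_march(start_uv, dir_uv, steps=20):
    uv = list(start_uv)
    for _ in range(steps):
        uv[0] += dir_uv[0] / steps
        uv[1] += dir_uv[1] / steps
        if not (0.0 <= uv[0] <= 1.0 and 0.0 <= uv[1] <= 1.0):
            return None  # marched off-screen: no reflection data available
        # (a real SSR pass would compare against the depth buffer here)
    return tuple(uv)

print(ssr_march((0.5, 0.5), (0.3, 0.2)))  # stays on-screen -> usable "hit"
print(ssr_march((0.8, 0.5), (0.6, 0.0)))  # leaves the screen -> reflection lost
```

A ray traced in world space has no such boundary, which is why RT reflections keep showing things that have scrolled out of the frame.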

For me a lot of the wow factor with path tracing has come not from how things look, but from how light (and light-related stuff) behaves. It's been fun playing around with the light physics in games like Metro Exodus EE, Quake 2 RTX, Portal RTX, and CP2077. I can't wait to see the effect on immersion when games are designed with path tracing from the ground up, in the far-off future when GPUs no longer choke on RT workloads.

It's kind of weird that AMD (ATI) bought ArtX and that technology became the basis for their next-generation cards from then on.
And it was around that time that ATi's ArtX was used for the GPU in Nintendo's 6th-gen console, the GameCube.
 
If Intel manages to get its sh*t together, it could be a real contender. Both its AI and RT engines are quite capable; it's the overall performance that's lacking. Incidentally, Intel could very well fight AMD if neither can face Nvidia at the high end.
But their drivers are always 190% faster every time they release a new one!
 