Radeon RX 6800 XT Cyberpunk 2077 Ray Tracing Performance

It is obvious that the developers designed around RTX hardware and not AMD's for RT.
If there was any design target, it would have been RTX, because that's all anyone had for years. Which means the issue was AMD not designing their hardware and drivers to meet the needs of games, something both companies have failed to do on occasion when a new technology becomes available.

To be fair to AMD, their first-gen RT is probably better than Nvidia's 2000-series.
Unless you want to count the denoiser Nvidia wrote specifically for ray tracing. I'm not sure what magic it works behind the scenes in code, but it is Nvidia-specific, built to take advantage of hardware gated to their own cards. Not that I'm against that; just saying it isn't all roses. ;)

Noted here: https://www.realtimerendering.com/raytracing.html

And also on Nvidia's page. Here is the quote from the source:

"Denoising is critical for real-time DXR performance when using path tracing or other Monte Carlo techniques. Alain Galvan's summary posts on ray tracing denoising and machine-learning denoising are good places to start. Zwicker et al. give a state of the art report about this area; note that it is from 2015, however, so is not fully up to date. Intel provides free code in their Open Image Denoise filter collection. The Quake II RTX demo includes shader code for the A-SVGF filter for denoising. NVIDIA has a developer access program for their denoiser. "
If the blame is focused on the denoisers, then wouldn't that mean AMD's denoiser is less efficient?
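For anyone curious what a denoiser actually looks like from the application side, here is a minimal sketch using Intel's Open Image Denoise, the free filter collection the quote mentions. The resolution and zero-filled buffer are placeholder assumptions; a real renderer would hand over its noisy per-pixel radiance, which is noisy in the first place because Monte Carlo error only falls off as 1/sqrt(samples per pixel).

```cpp
#include <OpenImageDenoise/oidn.hpp>
#include <cstdio>
#include <vector>

int main() {
    const int width = 1920, height = 1080;
    // Noisy HDR color buffer; zero-filled here purely for the sketch. In a
    // real renderer this is the accumulated path-tracer output per pixel.
    std::vector<float> color(width * height * 3, 0.0f);
    std::vector<float> output(width * height * 3);

    oidn::DeviceRef device = oidn::newDevice(); // default (CPU) device
    device.commit();

    // "RT" is OIDN's generic ray-tracing denoise filter.
    oidn::FilterRef filter = device.newFilter("RT");
    filter.setImage("color",  color.data(),  oidn::Format::Float3, width, height);
    filter.setImage("output", output.data(), oidn::Format::Float3, width, height);
    filter.set("hdr", true); // input is HDR radiance, not tonemapped LDR
    filter.commit();
    filter.execute();

    const char* err = nullptr;
    if (device.getError(err) != oidn::Error::None)
        std::fprintf(stderr, "OIDN error: %s\n", err);
    return 0;
}
```

Optional albedo and normal inputs (set the same way as "color") generally improve edge preservation, which is why engines keep those G-buffer channels around for the denoise pass.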

I don't really see the big deal here. Nvidia had been working on ways to bring RT to real-time consumer graphics for nearly a decade and brought their solution to market a few years earlier, so it's very reasonable for games to appear to 'favor' the platform that was actually available during their development.

You might recall how painful the release of DX9 was for Nvidia :cool:
 
How are they designing it around RTX hardware when they're using the Microsoft DXR API?

There's no custom code differentiating between the red and green sides. I suppose the only thing could be that developers chose to implement an RT method that is known to perform poorly on AMD hardware, but if it gives them the visual effect they are looking for in the game, that's not necessarily a decision based on performance but rather artistic direction.
It has more to do with the enhancements of DXR 1.1, which AMD and Microsoft worked on together; Nvidia also contributed. Most likely the optimizations that work well on RDNA2 did not make it into Cyberpunk 2077, since DXR 1.1 came much later. AMD has a video on how to optimize for RDNA2, and it also briefly covers an optimized denoiser made by AMD.
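To make the DXR 1.0 vs. 1.1 distinction concrete, here is a minimal sketch of how an engine asks the Microsoft API what the GPU can do. QueryRaytracingTier is a hypothetical helper name and device creation is simplified for illustration, but the key point stands on its own: there is no vendor branch anywhere; the same query runs on GeForce and Radeon.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Hypothetical helper: returns the ray tracing tier of the default adapter.
D3D12_RAYTRACING_TIER QueryRaytracingTier()
{
    ComPtr<ID3D12Device5> device; // ID3D12Device5 is the first DXR-capable interface
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return D3D12_RAYTRACING_TIER_NOT_SUPPORTED;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return D3D12_RAYTRACING_TIER_NOT_SUPPORTED;

    // TIER_1_0 = the original DXR pipeline (TraceRay through shader tables);
    // TIER_1_1 = adds inline ray tracing (RayQuery in any shader stage).
    return options5.RaytracingTier;
}
```

Tier 1.1's inline ray tracing is the style of traversal usually cited as friendlier to RDNA2, which is why a title built before DXR 1.1 existed would miss those paths.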


On the PlayStation 5, Ratchet & Clank: Rift Apart pushes 60 FPS at 4K resolution with ray-traced reflections. To me the game is obviously well optimized to take advantage of RDNA2.


This second video has a developer talking about RT; if you can't watch the first due to time, this one is probably the best, as it covers many aspects of the game.


I just don't think Cyberpunk 2077 represents well what AMD can do with RT at this stage.
 
Personally, I see AMD turning a lot of things around in the design of their chips and chiplets, be they for CPU or GPU builds. They've clearly established a topology that is working for them; they have successfully iterated on it on the CPU front, and now they are working to bring that same iterative improvement to the GPU front.

No company today can safely sit on its laurels, EVEN with the enhanced demand that computer parts are currently under.

This is good for the market... and will be even BETTER once pricing turns the bend back to levels of sanity. Though we all know it won't completely return to 'pre-COVID' normal for a very long time.

And yes, each company has tips/tricks/code to best take advantage of its hardware, be that through driver plugins, denoisers, or flat-out building their hardware to code targets.

The advantage AMD has is that most games for the next 3-5 years will be specifically targeted to best take advantage of their hardware. As long as they can continue to iterate on and optimize those familiar paths, they will have success.

The advantage Nvidia has is they are first to the party and have the financial capital to fund development to best take advantage of their hardware, literally more than any other hardware developer today.

AMD is probably taking a short-term loss on the chips and design work they are doing for the Sony and Microsoft consoles, not to mention the Samsung partnership. But by the same token, Nvidia is more interested in big compute, where the larger profit margins are. Not that they aren't interested in the consumer market.

The likes of Nvidia should be worried about Apple entering desktop chip design. One ubiquitous API platform for all compute... that's dangerous for every player other than Apple, and Apple has the financial might to try to make it happen.
 
I just don't think Cyberpunk 2077 represents well what AMD can do with RT at this stage.
You do know that this is because AMD RT didn't exist when CDPR implemented ray tracing, right? And that Intel is going to have the same problem, as will Apple and anyone else trying to step up to the plate?

And this is why...
The advantage Nvidia has is they are first to the party and have the financial capital to fund development to best take advantage of their hardware, literally more than any other hardware developer today.

Though Intel and Apple are very likely to be nipping at their heels from a capital standpoint. AMD still has a significant advantage in terms of having a head start, but they're still a generation behind Nvidia too.

Personally, I see AMD turning a lot of things around in the design of their chips and chiplets, be they for CPU or GPU builds.
GPU chiplets should be easy enough after getting CPU chiplets working. Like RT, though, and many other things, they haven't done it till they've done it.

If (when?) they do pull it off, I'm pretty excited about the possibilities. Performance can rise pretty easily without cost skyrocketing.
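To put rough numbers on why chiplets keep cost down: under the classic Poisson yield model, die yield falls exponentially with area, so two half-size chiplets salvage far more working silicon per wafer than one big die. The defect density and die areas below are illustrative assumptions, not real foundry figures.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Poisson yield model: Y = exp(-A * D), with A = die area in cm^2
    // and D = defect density in defects/cm^2. Illustrative numbers only.
    const double D = 0.1;               // assumed defects per cm^2
    const double monolithicArea = 5.0;  // one 500 mm^2 die
    const double chipletArea    = 2.5;  // each of two 250 mm^2 dies

    const double yMonolithic = std::exp(-monolithicArea * D);
    const double yChiplet    = std::exp(-chipletArea * D);

    std::printf("monolithic die yield: %.1f%%\n", yMonolithic * 100.0); // ~60.7%
    std::printf("per-chiplet yield:    %.1f%%\n", yChiplet * 100.0);    // ~77.9%
    return 0;
}
```

Because chiplets are tested before packaging, a bad die gets discarded individually instead of killing a whole big chip, which is where the cost headroom comes from.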

The advantage AMD has is that most games for the next 3-5 years will be specifically targeted to best take advantage of their hardware. As long as they can continue to iterate on and optimize those familiar paths, they will have success.
This most recent console generation has been the least potato out of the box, so this generation might actually be different, but it should still be mentioned that this has yet to really result in an advantage. There's always just a bit of difference between the console stack and the desktop.
 