Radeon RX 6800 XT Cyberpunk 2077 Ray Tracing Performance

Brent_Justice





Introduction



It's here! It's here! It finally came! We now have Ray Tracing support in Cyberpunk 2077 from CD Projekt Red on AMD Radeon RX 6000 series video cards! It only took more than six months for that support, something that should have been ready at the game's launch. With the new patch 1.2, Ray Tracing support for the Radeon RX 6000 series has been added. This means you can turn on Reflection, Shadow, and Lighting Ray Tracing in Cyberpunk 2077 on the Radeon RX 6900 XT, Radeon RX 6800/XT, and Radeon RX 6700 XT.







This new
Continue reading...
 
So basically, AMD's RT performance is a swing and a miss for its first attempt. I'm glad they are there, but they need to take another stab at the way they go about it, because the first go-around ain't all that pretty with respect to performance.

Thanks for the look into it Brent. Job well done!
 
I'm still waiting to buy a GPU that has ray tracing... so it's really a moot point.

AMD needs help with this, it appears.
 
I'm not too concerned.

First, as cool as Raytracing is - I don't play a single game that supports it yet, and even of the ones I am looking at getting, yeah... it would be single digit numbers for the near term. So it's nice, but it's far and away not my primary concern. Same thing with DLSS honestly - nice feature, wouldn't turn it down - not paying extra. Traditional rasterizing performance is still my primary concern.

For the same amount of money - yeah, I'd rather have better RT than worse RT. But I'm not going to set out to spend more money just for the sake of better RT. The offerings, at MSRP, between nVidia and AMD break out a bit oddly for me. The 6700 doesn't make a lot of sense at all, the 6800 is a rock star, the 6900 not really, and the 6800 XT is on the fence... but that's at MSRP, all other things equal.

Second - yeah, can't buy any of these anyway, from any manufacturer. So ... like @magoo says, moot point. Not to say this isn't a good writeup. It's great info to have. It just won't influence any of my near purchases because... there's nothing out there to purchase.

Third - playing devil's advocate: it isn't charted, but I imagine it clocks in about the same as nVidia's first stab at hardware accelerated RT. Granted, that only goes so far - you can't really compare two different gens, otherwise we'd be talking about how much better the 11900 is than Bulldozer - just that I didn't have very high expectations and those expectations were not exceeded by AMD.

If you really, truly care about ray tracing, I guess your choice is clear cut. For the same price and availability, I'd rather have it than not, I fully admit. But, for instance, I wouldn't pay more for 20% better ray tracing performance but 20% worse rasterization performance - which is about how the line breaks down now, depending on what tier you're looking at.
 
So basically, AMD's RT performance is a swing and a miss for its first attempt. I'm glad they are there, but they need to take another stab at the way they go about it, because the first go-around ain't all that pretty with respect to performance.
I'm going to swing toward giving AMD credit for having gotten the hardware out there in the first place. Same credit I give Nvidia for their 2000-series, the release of which has made possible games with decent RT implementations today. Same credit I'll give Intel and ARM whenever they get there.

Having the hardware out there isn't going to help gamers today, but it's an install base that's expanding and it's a second vendor implementing the function in mass-market consumer hardware, and that helps fence sitters join the crowd of developers supporting RT. Think of it this way: anyone that has waited until there was a market consensus formed on 'how to do RT' that wasn't just 'The Nvidia Way' now has their answer. By the time Intel catches up, any deviation from how things are settling now will work against Intel's product, and so there is a real disincentive for Intel to stray- turning that back around, it becomes highly likely that Intel (and everyone else) is going to go with the flow. Which means that the water's now safe to jump into.

First, as cool as Raytracing is - I don't play a single game that supports it yet, and even of the ones I am looking at getting, yeah... it would be single digit numbers for the near term. So it's nice, but it's far and away not my primary concern.
The tech geek in me wants it to be a major concern for me, but the basic reality is that of the games I do play, ray tracing is not a killer feature. I'll note that our perspectives here are intensely personal, of course, and also likely temporary, but they're also worth consideration and shouldn't be discounted. Ray tracing is cool, but it hasn't hit 'critical mass' quite yet, in my opinion.

Second - yeah, can't buy any of these anyway, from any manufacturer. So ... like @magoo says, moot point. Not to say this isn't a good writeup. It's great info to have. It just won't influence any of my near purchases because... there's nothing out there to purchase.

And that's the kicker ;)

Many of us could drop a K on a GPU if one worth such an expense were even available, but they're simply not!
 
Plays fine for me at 1440p with RT on... I think on medium? I didn't really check. Not even sure if it's making a huge difference fidelity- or experience-wise. Maybe if I was taking screenshots to post on Instagram or something?
 
Plays fine for me at 1440p with RT on... I think on medium? I didn't really check. Not even sure if it's making a huge difference fidelity- or experience-wise. Maybe if I was taking screenshots to post on Instagram or something?
Ray tracing shouldn't be like turning on 'salesman mode' on a TV in your dark basement. Done right, you shouldn't notice it. It's the absence of immersion-breaking shadow and color artifacts that defines ray tracing, not some in-your-face effect!
 
Ray tracing shouldn't be like turning on 'salesman mode' on a TV in your dark basement. Done right, you shouldn't notice it. It's the absence of immersion-breaking shadow and color artifacts that defines ray tracing, not some in-your-face effect!

It takes so much more GPU power so you don't notice it. Sounds like a salesman's line trying to convince a customer on the brink to get the XBR model TV, back in the days of WEGA TVs and such. ;)
 
It takes so much more GPU power so you don't notice it. Sounds like a salesman's line trying to convince a customer on the brink to get the XBR model TV, back in the days of WEGA TVs and such. ;)
You notice when things are out of place. RT is designed to put them into place. So yeah, the absence of wacky rasterization artifacts is what you should notice.
 
You notice when things are out of place. RT is designed to put them into place. So yeah, the absence of wacky rasterization artifacts is what you should notice.
Well, there have always been two schools of thought with regard to things like this. The TV analogy @Grimlakin made is pretty good. Speakers/audio systems would be another good case.

Some people think the screen (and in this case, by extension, the GPU) is just there to translate the media. You shouldn't notice anything imparted by the technology, unless it's deficient somehow in showing the media. You want as accurate and precise a translation of the media as you can get, and the tech is just there to serve it to the end user. This would be like Cinema mode on the TV, where it tries to duplicate the theater experience, or my HT receiver's Pure Direct option, where it does no post-processing and tries to present the sound exactly as it's presented by the media.

Another camp thinks of the tech as part of presentation - it's not just there as a method to serve the media, it's part of the experience itself. Crank the bass up, saturate the colors - take the original media and make it bigger and bolder.

I can see both, and I'm often guilty of both. Right now, I think Raytracing tends to fall into the latter category - not everyone can do it, so you can't really make a mainstream game rely on RT for anything other than putting in over-the-top effects. Eventually it may fall into the former - maybe as soon as the tail end of this generation of consoles, as RT hardware becomes more ubiquitous and alternative methods of implementing RT work around any lack of hardware.
 
I think we are quickly going to hit a moment when CPUs, and current and even last-gen GPUs, can step up to the plate with a new way to think about the reflections and shadows in a game and render them as part of the scene without additional passes for proper reflections. Doing the reflections in code before passing to the GPU to render, as an example. Just making that process ultra efficient and bundling it into fewer, fatter data streams that the CPU is better at parsing and handling than the GPU would be with its millions of parallel processes.
 
Some people think the screen (and in this case, by extension, the GPU) is just there to translate the media. You shouldn't notice anything imparted by the technology, unless it's deficient somehow in showing the media. You want as accurate and precise a translation of the media as you can get, and the tech is just there to serve it to the end user.
I think perhaps a better way for me to put it would be... you notice these things when you go back. That's when their absence is felt. It's the 'we didn't know any better' answer to the future 'how did we ever even live with that?!?' question.

I think we are quickly going to hit a moment when CPUs, and current and even last-gen GPUs, can step up to the plate with a new way to think about the reflections and shadows in a game and render them as part of the scene without additional passes for proper reflections. Doing the reflections in code before passing to the GPU to render, as an example. Just making that process ultra efficient and bundling it into fewer, fatter data streams that the CPU is better at parsing and handling than the GPU would be with its millions of parallel processes.
I'll say that CPUs are definitely not the way forward. Ray tracing, like many other data processing types, is best done on dedicated hardware. I'm not saying that GPUs are that, as they are also definitely not dedicated to that purpose (and can't be for the foreseeable future, technically speaking), but they're still several orders of magnitude better than CPUs.

CPUs do branching code well. Anything that's not branching, or that doesn't rely on massive in-order instruction streams (thus single-threaded), is better done on dedicated hardware. CPUs have this already in various forms of SIMD like SSE, AVX, and the old MMX, but ray tracing is on a completely different scale.

As far as alternative means of lighting and refining the process in general, well, that has to happen regardless. Especially if any decent ray tracing will ever happen on the current console generation!
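
Just to put the scale gap in rough numbers (my own toy sketch, not anything from the article): with SSE a CPU chews through rays four at a time, while a single 4K frame at one primary ray per pixel is already about 8.3 million rays, before any shadow or reflection bounces.

```cpp
#include <immintrin.h>   // SSE/SSE2 intrinsics
#include <cstdio>

// Toy sketch: intersect 4 rays at once against a sphere with 4-wide SSE.
// Ray origins are all at the camera position (ox, oy, oz); directions are
// given per lane and need not be normalized. Returns a lane mask that is
// all-ones where the quadratic discriminant is >= 0 (a potential hit);
// the t >= 0 check is skipped for brevity.
static __m128 hit_sphere4(__m128 dx, __m128 dy, __m128 dz,
                          float ox, float oy, float oz,
                          float cx, float cy, float cz, float r)
{
    // oc = origin - center (same for all 4 lanes)
    float ocx = ox - cx, ocy = oy - cy, ocz = oz - cz;

    // b = dot(dir, oc), a = dot(dir, dir), c = dot(oc, oc) - r*r (per lane)
    __m128 b = _mm_add_ps(_mm_add_ps(_mm_mul_ps(dx, _mm_set1_ps(ocx)),
                                     _mm_mul_ps(dy, _mm_set1_ps(ocy))),
                          _mm_mul_ps(dz, _mm_set1_ps(ocz)));
    __m128 a = _mm_add_ps(_mm_add_ps(_mm_mul_ps(dx, dx), _mm_mul_ps(dy, dy)),
                          _mm_mul_ps(dz, dz));
    __m128 c = _mm_set1_ps(ocx * ocx + ocy * ocy + ocz * ocz - r * r);

    // discriminant = b*b - a*c ; hit if >= 0
    __m128 disc = _mm_sub_ps(_mm_mul_ps(b, b), _mm_mul_ps(a, c));
    return _mm_cmpge_ps(disc, _mm_setzero_ps());
}

int main()
{
    // 4 rays fanning out from a camera at z = -3 toward a unit sphere at the origin.
    __m128 dx = _mm_set_ps(0.3f, 0.1f, -0.1f, 0.0f);
    __m128 dy = _mm_setzero_ps();
    __m128 dz = _mm_set1_ps(1.0f);
    __m128 mask = hit_sphere4(dx, dy, dz, 0.0f, 0.0f, -3.0f,
                              0.0f, 0.0f, 0.0f, 1.0f);

    // 4 rays per iteration means a 3840x2160 frame at one primary ray per
    // pixel still needs ~2 million iterations -- per frame, per bounce.
    std::printf("hit mask: 0x%x\n", _mm_movemask_ps(mask));
    return 0;
}
```

Even fully vectorized, that's the kind of arithmetic a GPU's thousands of lanes (or dedicated BVH-walking hardware) are built for, which is the scale difference I mean.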
 
I went to Microcenter on launch day to buy a 6900 XT since stock on NVIDIA cards was bad. I figured the RTX 3090 would be the faster option overall, but if we could get 90% of the performance for $500 less, it would be a great option. I'm glad I didn't get the 6900 XT. No DLSS and **** ray tracing performance would have disappointed me. It's a non-starter for Cyberpunk 2077 at 4K, a game I have hundreds of hours in.
 
I think we are quickly going to hit a moment when CPUs, and current and even last-gen GPUs, can step up to the plate with a new way to think about the reflections and shadows in a game and render them as part of the scene without additional passes for proper reflections. Doing the reflections in code before passing to the GPU to render, as an example. Just making that process ultra efficient and bundling it into fewer, fatter data streams that the CPU is better at parsing and handling than the GPU would be with its millions of parallel processes.
That makes no sense. You're just adding latency. You can't do reflections without rendering the scene in the first place, and you can't render them as "part of the scene" unless you are fine with static reflections and shadows. Technically, reflections and shadows are already "in code," but they need to be projected based on the perspective of the view frustum, which can't be done without direct access to the video card's memory.
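
To illustrate the view-dependence with a toy example (nothing from any actual engine, just basic vector math): the direction a reflection "looks" in changes every time the camera moves, so it can only be resolved once the frame's camera is known.

```cpp
#include <cstdio>
#include <cmath>

// Toy example (not engine code): the reflected view direction depends on the
// camera position each frame, which is why mirror-like reflections can't be
// baked ahead of time for a free-moving camera.
struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)    { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  scale(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }
static Vec3  normalize(Vec3 a)      { return scale(a, 1.0f / std::sqrt(dot(a, a))); }

// Standard reflection formula: r = i - 2 * dot(n, i) * n
static Vec3 reflect(Vec3 i, Vec3 n) { return sub(i, scale(n, 2.0f * dot(n, i))); }

int main()
{
    Vec3 surfacePoint  = { 0.0f, 0.0f, 0.0f };  // a point on a shiny floor
    Vec3 surfaceNormal = { 0.0f, 1.0f, 0.0f };  // floor normal, straight up

    // Same surface point, two camera positions: the reflection ray points in
    // two completely different directions, so what it "sees" differs per frame.
    Vec3 cameras[2] = { { -2.0f, 1.0f, 0.0f }, { 3.0f, 2.0f, 1.0f } };
    for (Vec3 cam : cameras) {
        Vec3 viewDir = normalize(sub(surfacePoint, cam));  // camera -> surface
        Vec3 r = reflect(viewDir, surfaceNormal);
        std::printf("camera (%.0f, %.0f, %.0f) -> reflection dir (%.2f, %.2f, %.2f)\n",
                    cam.x, cam.y, cam.z, r.x, r.y, r.z);
    }
    return 0;
}
```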
 
That makes no sense. You're just adding latency. You can't do reflections without rendering the scene in the first place, and you can't render them as "part of the scene" unless you are fine with static reflections and shadows. Technically, reflections and shadows are already "in code," but they need to be projected based on the perspective of the view frustum, which can't be done without direct access to the video card's memory.

I'm not disagreeing with you. I obviously don't know the backend code. It's just a gut feeling. May be way off but there it is.
 
It is obvious that the developers designed around RTX hardware and not AMD's for RT. Newer games designed around AMD hardware (consoles) are doing well with AMD RT. FSR, now available, should make some inroads for using RT with AMD hardware.

AMD's design with Infinity Cache does better with multiple shader/compute operations combined into a mega shader, where the cache will be much more efficient.

We'll have to see how newer games like Far Cry 6 perform with RT and FSR.
 
It is obvious that the developers designed around RTX hardware and not AMD's for RT. Newer games designed around AMD hardware (consoles) are doing well with AMD RT. FSR, now available, should make some inroads for using RT with AMD hardware.

AMD's design with Infinity Cache does better with multiple shader/compute operations combined into a mega shader, where the cache will be much more efficient.

We'll have to see how newer games like Far Cry 6 perform with RT and FSR.

How are they designing it around RTX hardware when they're using the Microsoft DXR API?

There's no custom code that's done to differentiate between the red/green side. I suppose the only thing could be that developers chose to implement an RT method that is known to perform poorly on AMD hardware, but if it gives them the visual effect they are looking for within the game, that's not necessarily a decision based upon performance but rather artistic direction.
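
For what it's worth, the vendor-neutral part shows up right at the API surface. Here's a rough sketch (assuming you already have an ID3D12Device*) of the one capability query a DXR title makes; it's identical whether the adapter underneath is GeForce or Radeon, and the driver decides how to map the work onto RT cores, Ray Accelerators, or whatever comes next.

```cpp
#include <d3d12.h>
#include <cstdio>

// Sketch only: 'device' is assumed to be an already-created ID3D12Device*.
// DXR exposes a single tier query for every vendor; there is no per-vendor
// code path at this level.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    if (options5.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED) {
        std::printf("No DXR support on this adapter/driver.\n");
        return false;
    }

    // D3D12_RAYTRACING_TIER_1_0 (Turing, RDNA 2, ...) or
    // D3D12_RAYTRACING_TIER_1_1 (adds inline ray tracing, among other things).
    std::printf("DXR raytracing tier: %d\n",
                static_cast<int>(options5.RaytracingTier));
    return true;
}
```

From there it's the same acceleration structures, state objects, and DispatchRays call on either vendor's card.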
 
How are they designing it around RTX hardware when they're using the Microsoft DXR API?

There's no custom code that's done to differentiate between the red/green side. I suppose the only thing could be that developers chose to implement an RT method that is known to perform poorly on AMD hardware, but if it gives them the visual effect they are looking for within the game, that's not necessarily a decision based upon performance but rather artistic direction.

Unless you want to count the specifically written NVIDIA denoiser for ray tracing that they have created. Not sure what magic it works behind the scenes in code, but it is NVIDIA-specific, taking advantage of specifically gated hardware. Not that I'm against that, just saying it isn't all roses. ;)

Noted here: https://www.realtimerendering.com/raytracing.html

And also on NVIDIA's page. Here is the quote from the source.

"Denoising is critical for real-time DXR performance when using path tracing or other Monte Carlo techniques. Alain Galvan's summary posts on ray tracing denoising and machine-learning denoising are good places to start. Zwicker et al. give a state of the art report about this area; note that it is from 2015, however, so is not fully up to date. Intel provides free code in their Open Image Denoise filter collection. The Quake II RTX demo includes shader code for the A-SVGF filter for denoising. NVIDIA has a developer access program for their denoiser. "
 
Unless you want to count the specifically written NVIDIA denoiser for ray tracing that they have created. Not sure what magic it works behind the scenes in code, but it is NVIDIA-specific, taking advantage of specifically gated hardware. Not that I'm against that, just saying it isn't all roses. ;)

Microsoft's DXR does not have a specific hardware code path for developers to use, and DXR is what a vast majority of the ray tracing games are using.

NVIDIA *has* written custom extensions for Vulkan (prior to Vulkan RT being released), however, those only appear in Quake II RTX and Wolfenstein Youngblood. I don't see how what you linked contradicts that...
 