Godfall RX 6900 XT vs RTX 3090 FSR Ray Tracing Performance

Brent_Justice





Introduction



This is a quick and dirty performance-oriented article looking at Godfall's FSR and Ray Tracing performance on a Radeon RX 6900 XT and a GeForce RTX 3090. We are not going to look at or discuss image quality. We are going to focus on the performance benefit that FSR (AMD FidelityFX Super Resolution) provides in Godfall on these two GPUs. We are also going to enable Ray Tracing to see how performance compares with and without FSR.
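As a quick illustration of where that benefit comes from, here is a minimal sketch (not AMD's API; the scale factors below are the per-dimension values AMD published for FSR 1.0): each quality mode renders internally at a lower resolution and upscales to the output resolution, and rendering fewer pixels is where the frame rate gain comes from.

```python
# Assumed FSR 1.0 per-dimension scale factors (as published by AMD).
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w, out_h, mode):
    """Approximate internal render resolution for a given FSR quality mode."""
    scale = FSR_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

for mode, scale in FSR_SCALE.items():
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode:>13}: {w}x{h} internal for a 4K output "
          f"(~{100 / scale**2:.0f}% of the native pixel count)")
```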



Godfall is an action role-playing game developed by Counterplay Games, published by Gearbox Publishing, and released on November 12th, 2020. The game runs on Unreal Engine 4. On November 18th, 2020, the 2.0.95 patch was released, which brought Ray Tracing support to Radeon RX 6000 series GPUs. It wasn’t until...

Continue reading...
 
An IQ comparison would've been nice.

Follow-up article, maybe?
 

Never say never.

But this article was focused on the performance aspect. Naturally, anything that isn't the native resolution will be of lower quality; it just depends on how much of a downgrade it is, and whether it's noticeable. You go into FSR knowing there is a potential downgrade in image quality. The point and goal of enabling FSR is to increase performance. That's the only reason you enable it.
 

I agree for the most part; the thing is, all upscaling technologies promise high performance gains versus a "minimal" IQ drop. So it would be nice to know to what extent that promise is delivered.

For better or for worse, upscaling is here to stay; everyone and their mother-in-law is coming up with their own version, be it hardware-accelerated (DLSS, XeSS) or software-based (FSR, UE5). It will be interesting to see how they all stack up.
 
The 6900 XT did better than expected. This title might just be an anomaly in that regard, though. Who knows.

The standout to me is that at 4K with RT and high quality FSR/DLSS settings, both cards achieve completely playable framerates.

Never say never.

But this article was focused on the performance aspect. Naturally, anything that isn't the native resolution will be of lower quality; it just depends on how much of a downgrade it is, and whether it's noticeable. You go into FSR knowing there is a potential downgrade in image quality. The point and goal of enabling FSR is to increase performance. That's the only reason you enable it.

I'm interested in seeing this as well.

I've never seen either FSR or DLSS in person, but when DLSS 2.0 was launched, some reviews were suggesting that it was a tradeoff in image quality, and that in some ways it could actually look better/sharper than native.
 
That's never been my experience with it. Granted, Cyberpunk 2077 is the only place I need DLSS 2.0 with my RTX 3090. It's also the only game I play with ray tracing implemented to a degree that hurts performance enough that DLSS 2.0 is required to get desirable frame rates.

In that game, image quality definitely suffers compared to native resolution. I've seen a few screenshots of other games where DLSS 2.0 arguably improved some aspect of image quality, but it tends to negatively impact it somewhere else.
 
Cyberpunk 2077 is a very bad example of DLSS. DLSS relies heavily on the game's TAA implementation, and Cyberpunk 2077's TAA implementation is notoriously bad. Because DLSS is tied to it, the result falls short of what DLSS is capable of.

There are other games that demonstrate DLSS better.

However, that is one thing to note about DLSS: it relies on motion vectors and temporal data, so how a game implements those will affect its quality. But at least it can use that data, whereas FSR is spatial-only at the moment.
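To illustrate that distinction, here is a rough sketch (not NVIDIA's or AMD's actual code, and the function names are hypothetical): a spatial upscaler only ever sees the current frame, while a temporal one also needs per-pixel motion vectors and a history buffer, the same data a game's TAA pipeline produces, which is why a poor TAA/motion-vector setup drags DLSS below its potential.

```python
import numpy as np

def spatial_upscale(frame: np.ndarray, scale: int) -> np.ndarray:
    # Current frame only: trivial nearest-neighbour enlargement stands in
    # for FSR 1.0's edge-adaptive spatial filter.
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

def temporal_upscale(frame: np.ndarray, history: np.ndarray,
                     motion_vectors: np.ndarray, scale: int,
                     blend: float = 0.9) -> np.ndarray:
    # Reproject last frame's accumulated result using the motion vectors,
    # then blend it with the upscaled current frame. Bad motion vectors
    # mean bad reprojection, which is the point being made above.
    up = spatial_upscale(frame, scale)
    h, w = history.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(ys - motion_vectors[..., 1].astype(int), 0, h - 1)
    src_x = np.clip(xs - motion_vectors[..., 0].astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    return blend * reprojected + (1.0 - blend) * up
```

The spatial path needs nothing from the engine beyond the finished frame, which is also why FSR 1.0 is so easy to integrate; the temporal path is only as good as the motion data the game feeds it.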
 
Thanks for the explanation of the differences between the two. Most of my gaming in the last couple of years has focused on games with DLSS, and I've noticed the anomalies throughout the generations. I've recently been replaying RE Village and taking closer note of FSR and its behavior, since it wasn't around when I played it the first time.
 
Excellent take, and the 6900 XT's RT performance is very surprising. I think as newer games with RT come out, the performance disparity between AMD and Nvidia will drift closer together. Still, I think Nvidia will remain on top due to higher-bandwidth memory as well as more compute units. AMD's high clock speeds help compensate, as does Infinity Cache (the fact that a 256-bit memory bus with slower memory is even in the ballpark speaks highly of AMD's design, even if it is slower at this time).
 

I disagree.
Godfall only uses DXR for ray-traced shadows, meaning it is a "DXR-lite" game, where AMD does better than in games using multiple DXR effects.
If you look at titles doing a LOT more DXR...the pattern is very clear.
NVIDIA has a major performance advantage over AMD when multiple DXR effects are used at the same time (shadows, reflections, lighting, GI).
Games like Cyberpunk 2077, Metro Exodus, and Control, e.g.

If anything, this shows AMD's performance tanks when a game goes beyond "DXR-lite".
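To make the "DXR-lite" versus stacked-effects framing concrete, here is a purely illustrative settings sketch (hypothetical toggle names, not any game's actual config): Godfall enables a single ray-traced effect, while the heavier titles above stack several, and every enabled effect adds its own ray dispatches to the frame.

```python
from dataclasses import dataclass

@dataclass
class RTSettings:
    # Hypothetical toggle set; real games name and group these differently.
    shadows: bool = False
    reflections: bool = False
    global_illumination: bool = False
    ambient_occlusion: bool = False

    def enabled_effects(self):
        return [name for name, on in vars(self).items() if on]

dxr_lite = RTSettings(shadows=True)                   # Godfall-style: one effect
dxr_heavy = RTSettings(shadows=True, reflections=True,
                       global_illumination=True,
                       ambient_occlusion=True)        # Cyberpunk/Metro/Control-style

print(dxr_lite.enabled_effects())     # ['shadows']
print(dxr_heavy.enabled_effects())    # several effects, several ray passes per frame
```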
 
I will say that later titles are using a more focused implementation.
 
You need to elaborate on "focused"?
The trend is quite clear: the more DXR effects, the more NVIDIA and AMD performance separates.
 
What I am saying is that games are being made to look as good as they can on console, and it's eroding the advantage that Nvidia has, simply because there isn't the same depth of use.
 
I feel that this is true, and that it seems to follow the collective recoil the industry had after the first more 'fully-featured' ray tracing implementations, such as BF:V, which were deemed a bit excessive relative to the appreciable visual benefits.

However, I would stop short of couching this in a positive light. There's no real doubt that AMD's ray tracing implementation in the RX 6000 series is less potent than Nvidia's in the RTX 3000 series.

Whether that matters is absolutely game, system, and user dependent, but we should still be clear about this distinction, because we have no reason to suspect that the use of ray tracing will trend down over time. Yes, we're seeing more limited use in current console games due to the limited ray tracing hardware present in those consoles, but as developers learn how to build games with variable levels of ray tracing, we can almost certainly expect 'more involved' versions that allow for higher settings on current and future hardware that can handle them.
 
I think, speaking strictly technically, Nvidia clearly has a stronger hardware solution. But AMD has ~a~ solution now, and it's prominently featured in both major consoles (which seem to be selling like hotcakes, even though no one can get their hands on them).

Speaking personally, RT just doesn't excite me. It's one of those technologies where I can see a difference if I really look at static screenshots, but when I'm playing a game, right now the only appreciable difference between RT on and off is that I can notice the framerate hit.

Better ray tracing simply isn't a consideration in whether I buy a piece of hardware, or a game. Well, let me take a step back, because that's assuming you even have a choice in hardware. Right now, it's take what you can get with respect to hardware.

Until we get more titles where ray tracing is either required or makes a significant impact, I'd expect it to remain a parlor trick. And even if/when it does catch on, I expect it to become something ubiquitously supported in the background, like physics libraries are today. No one cares which hardware runs physics better now, but 20 years ago we sure geeked out about it.
 
I'm still excited about the potential of RT, as it's the step needed to get closer to realistic shadows, lighting, and color, which are all really different aspects of the same thing (light intensity at particular frequencies, or the lack thereof).

It's also something that expanded color and luminosity spaces, meaning stuff like Rec.2020 and actual HDR, really need in order to be truly useful in games; otherwise, color banding and shadow inaccuracies will continue to inhibit immersion.


That all said, even the best hardware solutions running in optimal conditions aren't there yet. Where I would personally have a preference, if having a preference were even an option, would be for the other technologies that Nvidia has delivered to market: DLSS and "RTX"-backed audio processing capabilities. DLSS alone is enough of a differentiator, IMO, especially if one is compromising on performance due to price and availability.
 
What's happening now is that developers are trying various types of RT. The next generation or so will see more features combined artistically.
 
What I am saying is that games are being made to look as good as they can on console, and it's eroding the advantage that Nvidia has, simply because there isn't the same depth of use.
That is a very simplistic view of a very complex problem.

Ray Tracing can happen at different hardware levels:

Level 0 – Legacy Solutions (CPUs)
Level 1 – Software on Traditional GPUs (NVIDIA Pascal)
Level 2 – Ray/Box and Ray/Tri Testers in Hardware (AMD console GPUs/RDNA2)
Level 3 – Bounding Volume Hierarchy (BVH) Processing in Hardware (NVIDIA Turing/Ampere)
Level 4 – BVH Processing with Coherency Sorting in Hardware
Level 5 – Coherent BVH Processing with Scene Hierarchy Generator in Hardware

The next level (Level 4) is regarding this:
[Attached image illustrating Level 4: BVH processing with coherency sorting in hardware]

That is where the focus will be.
Not on "if we only do one DXR effect, the consoles can kinda keep up."
Several games have already shown a willingness to FAR exceed the DXR features and image quality available on the consoles.
If you think consoles will "hold back" DXR implementation, I suspect you have misunderstood the benefits/scalability of implementing DXR in games.
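For anyone wondering what Level 4's "coherency sorting" actually involves, here is a hedged software sketch of the idea (not any vendor's implementation; the names are made up): rays are binned by a coarse origin cell and direction octant so that a batch traversing the BVH touches similar nodes and memory access stays coherent. Level 4 hardware would do this sorting in silicon.

```python
from collections import defaultdict

def direction_octant(d):
    # Three sign bits map a ray direction to one of 8 coarse buckets.
    return ((d[0] < 0) << 2) | ((d[1] < 0) << 1) | (d[2] < 0)

def bin_rays(rays, cell_size=4.0):
    """rays: list of (origin_xyz, direction_xyz) tuples."""
    bins = defaultdict(list)
    for origin, direction in rays:
        cell = tuple(int(c // cell_size) for c in origin)  # coarse spatial cell
        bins[(cell, direction_octant(direction))].append((origin, direction))
    return bins  # each bucket can be traced together for coherent BVH traversal

rays = [((0.5, 1.0, 2.0), (1.0, 0.2, -0.3)),
        ((0.7, 1.1, 2.2), (0.9, 0.1, -0.2)),
        ((9.0, 0.0, 0.0), (-1.0, 0.5, 0.0))]
for key, group in bin_rays(rays).items():
    print(key, len(group), "ray(s)")
```

The first two rays land in the same bucket and the third in another, which is the whole point: incoherent rays (reflections, GI bounces) scatter across buckets, and sorting them back together is what that next hardware level is chasing.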
 