Ray Tracing Performance Deep Dive: AMD Radeon RX 6800 XT vs Nvidia GeForce RTX 3080

I'd like to care, since I'm running a 2080 Ti and I'm always interested in the comparison, but these cards are so rare and impossible to obtain that I've already resigned myself to just waiting for whatever releases in 2022.
 
So I've flip-flopped on this a few times. I've been rocking a GTX 980 for a good while now, on 4K. I'm due for an upgrade, but it isn't a Machine Down type situation. I have had really good luck with AMD drivers, contrary to a lot of folks, and have had good and bad cards on both sides of the fence, so I have no real brand loyalty here. I had been holding out for HDMI 2.1/DP 2.0, and it seems they've finally arrived.

Early on, I basically said I didn't care about ray tracing at all; I would get whatever card had the best price/performance for rasterization that fit my budget. And the 6800 didn't look bad, but it (at MSRP, anyway) is right at the top of what I'm comfortable spending on a GPU.

Then I got to thinking... well, ray tracing... and nVidia has a sizeable lead there. Maybe it's worth it if the price premium isn't that big? A 3070 wouldn't be horrible and (again, at MSRP) is a few dollars less, but it isn't nearly as good an overall performer (at least, without DLSS running, in titles that support it). The 3080 is a bit beyond what I want to spend on a card, but... but...

Then, one day, I looked at my gaming library. I own zero titles that support ray tracing or DLSS, and I don't exactly have a deep wish list either. So yeah... I wouldn't really use it much, at least in the near term. And I wonder if I was drinking the green Kool-Aid in thinking I'd consider paying extra for faster ray tracing.

Then... you know, it's not like I can buy either right now - nothing's available. So really it would come down to whichever one I could get my hands on. I don't really care about ray tracing, but I think if all else were equal, I'd take it... but AMD and nVidia have done a really good job of making sure things aren't equal. On the surface, at MSRP prices, the 6800 looks great, and it's fast enough to make the 3070 look lackluster in the things I play... but nVidia has made it so the 3070 still looks compelling at that price point. I don't know if I would feel like I'm settling for "just" a 3070 over a 6800; it depends on whether I could convince myself to feel good about having faster RT and DLSS available, even if I don't actually use them.

Realistically, since I'm rocking 4K on what everyone would (rightfully) call an inferior system now, and am not looking at upgrading to what everyone would (debatably) call a GPU strong enough for 4K - maybe it doesn't matter that much. Whatever I buy now is a stopgap for when I do get a card that is "4K capable"... but then again, even a stopgap card for me could conceivably end up lasting 2-3 years.

So yeah... congrats to crappy availability for making the entire ray tracing debate a moot point for me.
 
I'll be honest Brain_B, I don't understand you at all. At 4K, you want the fastest card money can buy. For the life of me I can't see why anyone would step up to 4K if they weren't ready to spend a grand (or more) on video cards from time to time. High-resolution gaming has always demanded two cards in SLI until recently, when that stopped being an option. The RTX 2080 Ti was never an amazing 4K gaming card, but it was the best we had until the 30 series launched. The RTX 3080 is a great budget option for 4K, even if it isn't ideal. (It really should have more vRAM.)

Really, the waiting game makes no sense unless you're somewhere between a few months (less than six) and a few weeks out from a major architecture's release. The RTX 3080, 3090, 6800XT and 6900XT are all "real 4K" capable cards. As graphics cards advance, so do games. Even if there aren't titles you want right now that support ray tracing or DLSS 2.0, that's not to say there won't be during the service life of the card. Also, if you keep a card 2-3 years, then going as big as you can will get you the best longevity out of it.

Really, a GTX 980 is pretty awful for 4K and always was. The Maxwell architecture actually did SLI pretty well, and even a pair of Maxwell Titan X's or GTX 980 Ti's wasn't exactly good at 4K back then. A 3070 is at least roughly an RTX 2080 Ti equivalent, so there is that. Again, do what you want, but 4K isn't, and never was, the realm of budget-conscious gaming. I've been at 4K for over 4 years. Before that, I was at 7680x1600 for several years; that's even more pixels than 4K. Pretty much since the first 2560x1600 monitors came out, I've been doing high-resolution gaming, and I had to buy GPUs in pairs to do it until the RTX 2080 Ti. It wasn't that the RTX 2080 Ti was good enough to be on its own, but rather that my GTX 1080 Ti's rarely worked in SLI and the RTX 2080 Ti's didn't either. SLI was supported in so few titles that going with the most powerful (and newest) single GPU was the only way forward.

On a budget, the RTX 3070 isn't too bad. It's roughly equivalent to an RTX 2080 Ti. It will struggle to hit 60 FPS in some titles, but it can easily do it in others; turn one or two settings down in modern games and you'll be good to go. Even Cyberpunk 2077 is playable using DLSS 2.0 and ray tracing on an RTX 2080 Ti at 4K, albeit with DLSS set to Performance or Balanced. The RTX 3080 and AMD's 6800XT/6900XT are better choices by a mile, but they are considerably more expensive. The RTX 3090 is a relatively poor value compared to the other cards mentioned, but it's certainly the best 4K card on the market, bar none. I considered the RTX 3090 a bargain, not because of what else is out there, but because of what I'm used to paying for GPUs. I have been paying $1,500+ for years buying two cards, and I've paid as much as $3,000 for a pair of Titan X's in SLI. By that metric, an RTX 3090 is a no-brainer for the best 4K gaming available.
 
I'll be honest Brain_B, I don't understand you at all. At 4K, you want the fastest card money can buy.
Sharp text and pics mean more to me than framerates, and IPS 4K delivers on that. I use the computer more for research (web browsing) and spreadsheets than I do for gaming, and what gaming I do tends to be older (MMO/ARPG) games that aren't exactly taxing on any system.

But thanks for the advice. You are sorta right - I already had the GTX 980 when I upgraded to the 4K monitors -- I didn't go out and buy a GTX 980 to play in 4K. I bought this 980 Strix when they first came out -- Nov '14, I think, so it's been around the block a few times now. I was originally driving a pair of 1920x1200 monitors, and maybe three years ago my ancient 12-year-old Dell finally started to bother me with yellowing color.

The plan was to see how 4K did with the system. Framerates were better on the older monitors, certainly, but in everything I play I've been able to find some compromise of in-game settings that works out acceptably. I used to get the occasional title that was thrown off by the quirky 16:10 ratio; now I get the occasional title that's thrown off by the high resolution - so I traded one headache for another. Overall, though, for applications that support Windows scaling (which, oddly, apart from games seems to be pretty much everything but older Windows components), I'm happy with the monitor upgrade, and I can't really see going back down in resolution just to add framerates.

And then I set my priorities for finally upgrading: if/when I upgraded again, I wanted to keep 4K, and I wanted the option of HDR, 120Hz, and VRR. Those are only now available in monitors/TV sets, and it wasn't until HDMI 2.1/DP 2.0 that it was really a possibility with GPUs -- which didn't arrive until this most recent generation. So I waited on the GPU upgrade until now.
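To put rough numbers on why the older standards couldn't carry that combination, here's a quick back-of-the-envelope sketch (Python; it ignores blanking intervals and DSC compression, so the real link requirement is somewhat higher):

```python
# Rough uncompressed bandwidth for 4K @ 120 Hz with 10-bit HDR color.
# Blanking intervals and DSC are ignored, so the true requirement is higher.
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 3 * 10  # RGB at 10 bits per channel for HDR

gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"Active pixel data: {gbps:.1f} Gbit/s")  # ~29.9 Gbit/s

# Approximate effective data rates of the links, for comparison:
print("HDMI 2.0 effective: ~14.4 Gbit/s")  # 18 Gbit/s raw, 8b/10b encoding
print("HDMI 2.1 effective: ~42.6 Gbit/s")  # 48 Gbit/s raw, 16b/18b encoding
```

Even this optimistic estimate is roughly double what HDMI 2.0 can move, which is why 4K/120Hz/HDR only became practical with HDMI 2.1-era cards and displays.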

I don't really recommend 4K gaming, and I think the commonly given advice to go for refresh rate over resolution isn't bad at all. But when it comes to text and static image quality, it's really hard to beat raw PPI - which is why I like my smaller (27") 4K monitors.
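The density gap is easy to work out, if anyone's curious (just a quick sketch):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
```

That's roughly 50% more pixel density at the same panel size, which is exactly what you notice in text sharpness.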

As far as waiting goes - right now I have little choice: either I get lucky and happen to catch something in stock (I peek occasionally, but I'm not camping out pounding F5), or I pay a scalper (not going to happen), or I wait. I would have bought an RX 6800 on the spot had they been available, and failing that, I would have bought a 3070 had they been available. I still would jump on either, and honestly, would probably jump on a 3080 if I could find a lower-tier one for under $800 - but I think that ship has sailed, and I've pretty well resigned myself to putting up with what I've got until it doesn't turn on anymore...

I have fewer qualms about spending a good deal of money on a good monitor; I tend to get 8-12 years out of them, which is typically a few computer builds. My current build I put together at the end of 2014, and then family life kinda hit - a home move, other priorities, less gaming, etc. So I haven't been chasing hardware for a while. That, and while new computers are certainly faster, I haven't hit anything that I want to play that I can't - stuff has just kinda leveled off with regard to performance, while prices have continued to inflate disproportionately. None of it has really made me too excited to build anything, at least until recently.
 
Here is something interesting. I got a 6800 XT, as you can see in my signature; that's my system. I'm playing WoW Shadowlands again, and everywhere but the Shadowlands zones I hit the 120Hz cap I set. But in the Shadowlands zones it drops down to the low 80s at times. That's with everything at max, 1440p.
 
I have an RTX 2080 Super now for my home system, and I had to dial my settings down from max to 5 on the slider. I had FPS values in the 80s when soloing, but in raids it drops down to 30-40 FPS... :( I am wondering if it's the game being poorly optimized in general. These values are also with RTX off for the game (no ray tracing).
 
Yeah, mine are with RTX on. But I suspect that's an issue with WoW's engine updates; let's see if they get it fixed in short order. With ray tracing turned on, I can't really tell a substantive difference in performance.
 
The RTX 3080 is a great budget option for 4K, even if it isn't ideal. (It really should have more vRAM.)

Why do people keep stating that the RTX 3080 has too little memory? Where's the evidence to support that? I mean, even Cyberpunk or Watch Dogs: Legion doesn't seem to be affected by "only" having 10GB of VRAM.

The RTX 2080 Ti and 3070 perform pretty much the same at 4K in spite of the VRAM difference.

Oh I know, future games...
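For anyone who wants actual numbers rather than vibes, it's easy to log real VRAM usage while playing. Here's a minimal sketch using NVIDIA's NVML Python bindings (assumes an NVIDIA card and the pynvml package; AMD users would need a different tool):

```python
# Minimal VRAM usage logger using NVIDIA's NVML bindings (pip install pynvml).
# Leave it running in the background while gaming to see how close you get to 10GB.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # byte counts
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```

One caveat: allocated VRAM isn't the same as required VRAM. Many engines cache aggressively when memory is available, so a nearly full meter doesn't prove a smaller card would actually stutter.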
 