AMD Radeon RX 6800 XT’s Ray-Tracing Performance Falls Significantly Short of NVIDIA RTX, According to Early Tests

Tsing
The FPS Review Staff member
[Image: AMD Radeon RX 6800 XT front-angle render. Credit: AMD]



AMD shocked the graphics world today by unveiling its Radeon RX 6900 XT, Radeon RX 6800 XT, and Radeon RX 6800 cards, which – according to the red team’s first-party benchmarks – can trade blows with NVIDIA’s entire GeForce RTX 30 Series lineup (GeForce RTX 3090 included).



Something that CEO Dr. Lisa Su and Scott Herkelman (CVP & GM) seemed awfully secretive about, however, is the...

Continue reading...
 
First I see the reviews... then I look at prices... then I buy. Ray tracing performance is not a big deciding factor right now... shoot, I might even skip this generation, as I only really play one game and I only play at 1440p... for that, my blower-style 5700 XT is working great.
 
It is too early to tell. This could be due to an immature driver and/or firmware.

It'd be nice if AMD had something to say on the topic. I don't know that RT will be a big issue for me. Not much of what I play uses it.
 
I agree that it's too early to tell. Optimizations, learning the ins and outs of the architecture, drivers, and such are all going to play a part in performance over time. Raytracing in games is still in its infancy for both AMD and nVidia, which is a major factor.

That said, this is AMD's first generation of hardware built specifically for RT, and that could be a disadvantage, but not necessarily. Being first doesn't automatically mean the competition is better, even with second-generation hardware. At minimum it will have an effect right now, because the only RT hardware out there has been nVidia's and developers' efforts have been focused on that. This still doesn't mean AMD will manage to be equivalent to nVidia even after everything has been worked out.

The fact that AMD is also in the consoles plays a part, and again this could be good or bad. First-generation hardware in consoles is not necessarily going to be good; there are a lot of tradeoffs in console hardware, and this could come back to bite AMD. On the upside, AMD being in the consoles will likely mean more in-depth optimization and work toward getting the most out of console hardware, and some of that should bleed over into the PC space.

This all said, it's not going to surprise me if AMD falls short of nVidia on RT this round. I actually expect this to be the case. The upcoming Radeons look **** good, but the performance/feature deficit AMD has been running in the video card space for a while isn't something you can easily overcome in every facet in one leap.

Besides that, I don't really know how much effect it will really have. RT is still in its infancy and raster is still king. The number of games with any RT features is paltry even after more than two years of promised RT games. I'm of the opinion it's going to be at least one more new hardware generation before RT sees any real traction as a required feature.
 
"The number of games with any RT features is paltry even after more than two years of promised RT games."

Except for AC: Valhalla, every major AAA game releasing at the end of this year will support ray-tracing.

Far Cry 6, Dirt 5, CP 2077, GodFall, WoW, Watch Dogs: Legion, Spiderman: MM, COD: BO, etc.

So, RT is moving forward.
 
"The number of games with any RT features is paltry even after more than two years of promised RT games."

Except for AC: Valhalla, every major AAA game releasing at the end of this year will support ray-tracing.

Far Cry 6, Dirt 5, CP 2077, GodFall, WoW, Watch Dogs: Legion, Spiderman: MM, COD: BO, etc.

So, RT is moving forward.
Dude, go back to NVIDIA and stop posting this stupid crap, we all know you're just an nvidia shill. A snail moves forward too, but nobody calls it a cheetah. Yes, RT is moving forward; it's been doing that since before I first started working with it in the '90s... if it was moving backwards I'd be worried. Seriously, you're either completely obsessed with nvidia or someone is paying you to post your nonsense here. You've only got 6 posts and every single one is nonsense about AMD being crap and NVIDIA being the best thing since sliced bread. Seems this release has NVIDIA worried enough to start getting all of their paid posters back into the action; I guess that's a good sign for AMD :).
 
Wow! You're disrespecting me because the number of RT games disagrees with that misinformation?

You must be a lot of fun at your cheese parties, huh?!
 
Let's be honest here. These games that have RT are still 95% raster with a sprinkling of shiny RT added. There isn't enough power in any GPU on the market to implement any higher levels of RT, which is why we're years and generations away from meaningful RT implementations.
 
Let's be honest here. These games that have RT are still 95% raster with a sprinkling of shiny RT added. There isn't enough power in any GPU on the market to implement any higher levels of RT, which is why we're years and generations away from meaningful RT implementations.
'Meaningful' RT implementations are subjective, which I point out only because we don't have a good consensus on what actually would be meaningful. That's an entire series of discussions unto itself :).

But with respect to mixed workloads between raster and RT, that's more or less how it has to be. Raster is just plain orders of magnitude more efficient at drawing pixels themselves, so it makes sense to gradually shift work to RT hardware as it becomes available.

We're also not likely to see rasterization disappear completely, for similar reasons. Even cinematic rendering suites use a hybrid approach, as rasterization, even in a limited implementation backed up by healthy RT input, can significantly reduce rendering times while producing the same final output.
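To put some very rough numbers behind that (everything in the little Python sketch below is an illustrative assumption, not a benchmark of any real card), a couple of rays per pixel fits comfortably in a 60 FPS frame budget, while a fully path-traced frame does not:

# Back-of-the-envelope sketch of why games stay mostly raster and only
# sprinkle in RT effects. The ray throughput figure is a made-up,
# order-of-magnitude assumption, not a measured number for any GPU.

WIDTH, HEIGHT = 2560, 1440
pixels = WIDTH * HEIGHT
frame_budget_ms = 1000 / 60          # ~16.7 ms per frame at 60 FPS

rays_per_sec = 5e9                   # assumed effective traced-ray rate

def rt_time_ms(rays_per_pixel):
    """Milliseconds spent tracing rays at a given ray count per pixel."""
    return pixels * rays_per_pixel / rays_per_sec * 1000

# Hybrid approach: raster handles primary visibility, RT adds a couple of
# rays per pixel for reflections/shadows.
print(f"hybrid (2 rays/px):        {rt_time_ms(2):6.1f} ms")

# Fully path-traced frame: dozens to hundreds of rays per pixel before the
# image stops being a noisy mess.
print(f"path traced (128 rays/px): {rt_time_ms(128):6.1f} ms")

print(f"60 FPS frame budget:       {frame_budget_ms:6.1f} ms")

With those (generous) assumptions the hybrid passes cost a millisecond or two, while going fully path-traced blows the frame budget several times over, which is why the "95% raster with a sprinkling of RT" split is likely to stick around for a while.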
 
Big giant headline based on what?
Some guy on Reddit?
Yeah, I'm not buying anything because of what some random guy says.
It does stand to reason that AMD is behind in this one aspect.
But the simple good news is they are finally, once again, stepping up to the top tier. Good, bad, or indifferent, that's good for everything. RT is still early and only a small part of any game to date... it has obviously great promise... for that I am anxious.
But for now, congrats AMD... but let's see the reviews.
 
"The number of games with any RT features is paltry even after more than two years of promised RT games."

Except for AC: Valhalla, every major AAA game releasing at the end of this year will support ray-tracing.

Far Cry 6, Dirt 5, CP 2077, GodFall, WoW, Watch Dogs: Legion, Spiderman: MM, COD: BO, etc.

So, RT is moving forward.
I don't have the list and don't particularly care, but would you care to look up the list of games nVidia promised RT in with the release of the 2xxx series and see how many of them actually have any RT features, much less usable or noticeable ones? I saw it posted recently and it didn't look pretty. It was over two years ago that these games were promised RT features, and that list isn't complete.

As for future games, I'll believe it when they deliver it. Even better, I'll believe it when they deliver it and turning on something other than the most basic settings doesn't kill performance. New hardware power is going to get eaten up by the normal increases in game resource usage for performance and IQ. There are still plenty of games out now which struggle with higher IQ settings, and some aren't even playable at the highest. The new cards are playing catch-up on many of these.

This doesn't even take into account that RT performance metrics are almost always based off the most powerful halo card, which very few people own. What happens once you start moving down the product stack? We already saw with nVidia's 2xxx series that basically anything under a 2080 Ti was marginal at best with RT and effectively useless on 2060s and 2070s. RT adoption is going to require a top-to-bottom hardware product stack that can make at least minimal use of the features. Neither company is anywhere near that point yet. It's going to be a minimum of one more architecture advancement before that has a chance of happening, and likely at least two. Maybe by the time the consoles do their mid-life refresh with new AMD hardware we might finally see something.
 
Benchmarks leaked earlier by VideoCardz showed ray tracing on the 6800XT to be about on par with a 2080 Ti. Seems AMD are doing the bare minimum at the moment. I'll see what real reviews show, but I'm leaning back toward NVIDIA.
It is too early to tell. This could be due to an immature driver and/or firmware.

It'd be nice if AMD had something to say on the topic. I don't know that RT will be a big issue for me. Not much of what I play uses it.
I can't tell you how many times I've heard "Just wait for the drivers" over the years of AMD releases. The HD 2900 XT release was a really fun time.
 
But RT now has its big DADDY to lead the next frontier.

The next-gen consoles.

Like it or not, the consoles are the foundation for today's and tomorrow's games, and Sony's and MS's future games are 150% behind RT.

It is what it is.
 
I'd like to know at what level these consoles are implementing ray tracing, considering they are both using AMD hardware.

PS5: 36 CUs
XB: 52 CUs
6800XT: 72 CUs
6900XT: 80 CUs

If these consoles are what ray tracing will be based on, then I can't imagine either will perform well at all, or ray tracing will be so barely implemented that it won't be that noticeable.
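One rough way to frame that: RDNA 2 pairs one Ray Accelerator with each CU, so ignoring clock-speed and memory differences, the CU counts alone give a ballpark comparison (just arithmetic, not a performance claim):

# Relative RDNA 2 ray-tracing hardware by CU count alone (one Ray
# Accelerator per CU). Clocks differ between parts, so these are only
# rough upper bounds on relative RT throughput.

cus = {
    "PS5": 36,
    "Xbox Series X": 52,
    "RX 6800 XT": 72,
    "RX 6900 XT": 80,
}

baseline = cus["RX 6800 XT"]
for name, count in cus.items():
    print(f"{name:14s} {count:3d} CUs -> {count / baseline:4.0%} of a 6800 XT")

So if the leaked numbers put the 6800 XT only around 2080 Ti level in RT, the consoles, with roughly half to three-quarters of its ray hardware, would presumably land well below that, which fits the expectation of fairly light RT use this generation.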
 
If these consoles are what ray tracing will be based on, then I can't imagine either will perform well at all, or ray tracing will be so barely implemented that it won't be that noticeable.

I'm not really expecting a big push for raytracing this generation (except for a few standout attempts by AAA studios). There was an interview with Spencer downplaying the benefit of raytracing (I believe he prioritized FPS over RT, iirc).
 
Couldn't care any less about ray-tracing technology. Really reminds me of the "PhysX" technology from years ago. Sure, it made the games that supported it seem more realistic, but it was never a make-or-break decision for me. The only way I used it was by keeping my old GPU to handle the PhysX processing after I upgraded to a more powerful GPU.

Let me know how the cards handle normal graphics maxed out at 1080P and 4K.
 
Couldn't care any less about ray-tracing technology. Really reminds me of the "PhysX" technology from years ago. Sure, it made the games that supported it seem more realistic, but it was never a make-or-break decision for me. The only way I used it was by keeping my old GPU to handle the PhysX processing after I upgraded to a more powerful GPU.

Let me know how the cards handle normal graphics maxed out at 1080P and 4K.
Same way I feel about it
 
I remember a time when gamers used to whine that 720p@60 (or even a stable 30 FPS) was more than enough for games. Screw 1080p+!

Now, 1440p@120 is the sweet spot and 40+ FPS isn't even considered playable anymore.

Interesting, huh?
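For what it's worth, the jump in raw pixel throughput between those two targets works out to roughly 8x (quick arithmetic below, ignoring everything except resolution and refresh rate):

# Raw pixels-per-second needed for the two targets mentioned above.
res_720p = 1280 * 720
res_1440p = 2560 * 1440

old_target = res_720p * 60      # 720p at 60 FPS
new_target = res_1440p * 120    # 1440p at 120 FPS

print(f"720p@60:   {old_target / 1e6:6.1f} Mpix/s")
print(f"1440p@120: {new_target / 1e6:6.1f} Mpix/s")
print(f"ratio:     {new_target / old_target:.0f}x")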
 