AMD Radeon RX 6800 XT’s Ray-Tracing Performance Falls Significantly Short of NVIDIA RTX, According to Early Tests

Couldn't care any less about ray-tracing technology. It really reminds me of the "PhysX" technology from years ago. Sure, it made the games that supported it seem more realistic, but it was never a make-or-break decision for me. The only way I ever used it was by keeping my old GPU around for PhysX processing after I upgraded to a more powerful card.

Let me know how the cards handle normal graphics maxed out at 1080p and 4K.
Except PhysX is now used nearly everywhere. You just don't explicitly see it anymore because it is turned on implicitly. More games today use PhysX for their physics simulation than any other middleware out there. Ray tracing will eventually evolve the same way once we get past the initial hump of it being a marketing point.
 
Except PhysX is now used nearly everywhere. You just don't explicitly see it anymore because it is turned on implicitly. More games today use PhysX for their physics simulation than any other middleware out there. Ray tracing will eventually evolve the same way once we get past the initial hump of it being a marketing point.
It's also a great example of not worrying about a feature in Gen 1. As I remember it, by the time PhysX was actually used in many games as something you might say "hey, look at this, it's cool" about, the standalone PhysX cards were not fast enough to actually do the job. You were much better off not buying a PhysX card and just waiting for it to actually show up in games first.

I think the same trend holds for RT. By the time we actually see games really using RT as a serious feature, neither the 20X0 nor the 30X0 cards will be fast enough to actually run it. For this gen, and maybe even next gen, go for the better raster card and take whatever RT comes with it as a bonus, not a critical feature.
 
Except PhysX is now used nearly everywhere. You just don't explicitly see it anymore because it is turned on implicitly. More games today use PhysX for their physics simulation than any other middleware out there. Ray tracing will eventually evolve the same way once we get past the initial hump of it being a marketing point.
This is true, but almost nothing uses the proprietary GPU-accelerated PhysX anymore. So why buy into proprietary GPU RTX now instead of just waiting for the ubiquitous multiplatform implementation to proliferate?
 
Not a surprise.

All the titles that use ray tracing today were designed with the Nvidia implementation in mind. More general-purpose implementations will likely improve things, but there's no guarantee.

Either way, I don't think it is a big deal. I think we are several generations away from ray tracing being more than a minor game option you can switch on and off and barely notice. My priority is still raster performance this gen. Maybe in a few years I'll be more concerned with RT.
 
I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080p+!

Now, 1440p@120 is the sweet spot and 40+ FPS isn't even considered playable anymore.

Interesting, huh?

I don't know about you, but I've been at this for a while. I always remember 60+ being the minimum target.

Maybe not during the Voodoo 1 era, but certainly since the GeForce 256 launched in 1999.

I remember that in college, from 1999 to 2003, when I was a HUGE Counter-Strike addict, 60fps was universally considered the bare minimum to play. I was one of the lucky ones. The combination of my GeForce 2 GTS and later GeForce 3 Ti 500, my 22" Iiyama Vision Master Pro, and my highly overclocked Socket A Athlons allowed me to vsync at 100Hz and pretty much get a stable 100fps at 1600x1200, unheard of at the time.
 
I won't care about RT for another year or 2. I'll take a 6800XT with a full coverage water block.
 
Except PhysX is now used nearly everywhere. You just don't explicitly see it anymore because it is turned on implicitly. More games today use PhysX for their physics simulation than any other middleware out there. Ray tracing will eventually evolve the same way once we get past the initial hump of it being a marketing point.

It's not the same thing as the standalone "PhysX" cards, nor the same as when GPUs first started doing the job on their own (as a second card). As you said, everything has PhysX now, just as everything will eventually have ray tracing without the need to buy specific hardware for it.

This is nothing but a first-on-the-block tax. Therefore, I still couldn't care any less about it.
 
It's not the same thing as the standalone "PhysX" cards, nor the same as when GPUs first started doing the job on their own (as a second card). As you said, everything has PhysX now, just as everything will eventually have ray tracing without the need to buy specific hardware for it.

This is nothing but a first-on-the-block tax. Therefore, I still couldn't care any less about it.

Yeah, it's compute-based physics now, no longer "PhysX", right? It accomplishes the same goal, but PhysX was acquired by Nvidia and kept proprietary.

There was some problem with how the "PhysX" code was written using x87 instructions, resulting in it not being easily or efficiently ported. Can't remember the details.

And then - of course - Nvidia being Nvidia, they were *******s and blocked dedicated Nvidia PhysX GPUs from working if the primary GPU was AMD...
 
Yeah, it's compute-based physics now, no longer "PhysX", right? It accomplishes the same goal, but PhysX was acquired by Nvidia and kept proprietary.

There was some problem with how the "PhysX" code was written using x87 instructions, resulting in it not being easily or efficiently ported. Can't remember the details.

And then - of course - Nvidia being Nvidia, they were *******s and blocked dedicated Nvidia PhysX GPUs from working if the primary GPU was AMD...

I don't know what the technology is now, but I am pretty sure it's available on both AMD and Nvidia cards. While I think you can still use "PhysX" on older titles that had it with Nvidia cards, I am pretty sure newer titles use the new standard, whatever it is. Someone with more GPU knowledge than me would probably know more about it. I don't keep up with all that.
 
Yeah, it's compute-based physics now, no longer "PhysX", right? It accomplishes the same goal, but PhysX was acquired by Nvidia and kept proprietary.

There was some problem with how the "PhysX" code was written using x87 instructions, resulting in it not being easily or efficiently ported. Can't remember the details.

And then - of course - Nvidia being Nvidia, they were *******s and blocked dedicated Nvidia PhysX GPUs from working if the primary GPU was AMD...
No, I mean PhysX. It is a middleware solution for physics that is primarily CPU-based these days. Even console games are using it. It is integrated into Unreal Engine and Unity, notably, and runs on every system (PC, Switch, PlayStation 4, Xbox One). "Hardware-accelerated" PhysX that runs on a GPU is rare these days, but still used occasionally.
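For reference, this is roughly what that CPU-only path looks like against the current public SDK. A minimal sketch assuming the PhysX 3.x/4.x API, with error handling and dispatcher cleanup omitted:

```cpp
// Minimal CPU-only PhysX setup (sketch; assumes the PhysX 3.x/4.x SDK headers).
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    // Foundation + physics objects; no GPU involvement anywhere here.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // The scene is stepped entirely on CPU worker threads.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);   // 2 worker threads
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // One 60 Hz simulation step.
    scene->simulate(1.0f / 60.0f);
    scene->fetchResults(true);

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```

That whole path runs on any CPU, which is why the same middleware can ship on Switch, PlayStation 4 and Xbox One; the GPU-accelerated effects (cloth, debris, particles) were always an optional layer on top of that.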
 
No, I mean PhysX. It is a middleware solution for physics that is primarily CPU-based these days. Even console games are using it. It is integrated into Unreal Engine and Unity, notably, and runs on every system (PC, Switch, PlayStation 4, Xbox One). "Hardware-accelerated" PhysX that runs on a GPU is rare these days, but still used occasionally.

Interesting. I wonder how they finally got around the legacy x87 problem.
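As I understand it (going from memory of the write-ups at the time), it was mostly a rebuild of the CPU solver: the 2.x CPU path was reportedly compiled as scalar x87 floating-point code, and the 3.x SDK reworked it around SSE and proper multithreading. To give a feel for how much of that is just a build decision, here's a toy loop (my own example, not PhysX source) that a compiler will emit as x87 or SSE depending only on flags:

```cpp
// toy_dot.cpp - illustrative only, not PhysX code.
// 32-bit scalar x87 build:  g++ -O2 -m32 -mfpmath=387 toy_dot.cpp -c
// SSE2 build:               g++ -O2 -m32 -mfpmath=sse -msse2 toy_dot.cpp -c
// (MSVC's rough equivalents for 32-bit targets are /arch:IA32 vs. /arch:SSE2.)
#include <cstddef>

float dot(const float* a, const float* b, std::size_t n) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        sum += a[i] * b[i];   // x87 pushes this through the 80-bit FP stack;
                              // SSE keeps it in XMM registers and can vectorize it.
    return sum;
}
```

Same source, very different machine code, so once the solver was rewritten for the newer SDK the x87 baggage basically went away.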
 
I'd like to know at what level these consoles are implementing ray tracing, considering they are both using AMD hardware.

PS5: 36 CUs
Xbox Series X: 52 CUs
6800 XT: 72 CUs
6900 XT: 80 CUs

If these consoles are what ray tracing will be targeted at, then I can't imagine either will perform well at all, or ray tracing will be so lightly implemented that it won't be very noticeable.

My guess is it'll be used in very small quantities. My hope is that it'll be used for lighting (ambient occlusion of sorts) and not for every single shiny puddle in the street, since that brings nothing to the game. You don't need a ton of rays (comparatively) for ambient occlusion, and it gives the most return for the cost (in my opinion). This should allow some sort of quality setting (more/fewer rays for measuring light at specific points) which can scale with the GPU.

Reflections are different: the only knob is the number of times a ray can bounce, but you can't, for example, only do a reflection once every 10 pixels and expect it to look right. It's mostly an all-or-nothing approach that you can only scale by turning it on or off for specific objects. Screen-space reflections already work pretty well, and ray-traced reflections don't add as much realism to the game as proper lighting can.
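To make the "scale the ray count" point concrete, here's a rough sketch of what an RT ambient-occlusion quality slider boils down to: the per-pixel sample count is just a parameter, so halving it degrades gracefully instead of breaking the effect the way skipping reflection rays would. (Toy CPU-style code; the sceneHit() placeholder stands in for whatever BVH/RT-hardware query the engine actually uses.)

```cpp
// Toy ray-traced ambient-occlusion estimate (illustrative sketch, not engine code).
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Placeholder: returns true if a ray from 'origin' along 'dir' hits geometry
// within 'maxDist'. In a real renderer this is the BVH / RT-hardware query.
bool sceneHit(Vec3 origin, Vec3 dir, float maxDist);

// Ambient occlusion at surface point 'p' with normal 'n': the fraction of short
// hemisphere rays that escape. 'samples' is the knob a quality setting exposes.
float ambientOcclusion(Vec3 p, Vec3 n, int samples, float radius, std::mt19937& rng) {
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);
    int unoccluded = 0;
    for (int i = 0; i < samples; ++i) {
        // Crude hemisphere sample: random direction flipped onto the normal's side.
        Vec3 d = normalize({uni(rng), uni(rng), uni(rng)});
        if (d.x * n.x + d.y * n.y + d.z * n.z < 0.0f) d = {-d.x, -d.y, -d.z};
        if (!sceneHit(p, d, radius)) ++unoccluded;
    }
    return static_cast<float>(unoccluded) / static_cast<float>(samples);
}
```

Drop 'samples' from 16 to 4 and the image just gets a bit noisier (which a denoiser can hide); there's no equivalent knob for mirror-like reflections, which is why those tend to be on/off per object.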
 
I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080p+!

Now, 1440p@120 is the sweet spot and 40+ FPS isn't even considered playable anymore.

Interesting, huh?

3DFX set the 60 FPS "sweet spot" standard back in the 90s. It hasn't changed much since then other than going higher. The standard was set to match the display's own refresh rate, which back then was 60Hz on most displays, and most are still 60Hz today.
 
I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080p+!

Now, 1440p@120 is the sweet spot and 40+ FPS isn't even considered playable anymore.

Interesting, huh?
I remember playing and programming 320x200 and 320x240 (mode 13h unchained, aka X-mode or Mode X depending on who you talked to). We've come a long way, but pixel density is getting to the point that increases aren't making as large a visual difference as they used to. Going from 320x200 to 640x480 was a huge difference to my eyes. Going up to 1080p also increased the perceived detail a lot. 1080p to 1440p is a smaller increase but still easily noticeable by most.

Whether 4K is noticeable depends on the screen size for me. On smaller screens I can't really tell the difference besides my icons being smaller if I don't set scaling. 8K seems like a solution looking for a problem at this point. I find things like better contrast, brightness and HDR bring a lot more to a screen than pushing more pixels. I would take a 1440p monitor with HDR over a 4K monitor without it, as the visual fidelity is much better. Of course, if I can have both, great, but my point is simply that pushing more pixels doesn't always improve the visuals as much as other methods.

As you said, it's a moving target, though: 60fps used to be the gold standard, and now 120, 144, etc. are becoming the norm. But just as pushing higher resolutions has diminishing returns, so does pushing higher frame rates... while most people would notice the difference between 30Hz and 60Hz, not as many would notice 60Hz to 120Hz, and even fewer 144Hz to 200+Hz. I'm not saying nobody would notice, just that the number of people who would benefit decreases significantly. Those are also typically the players who are turning off most of the visuals to get higher frame rates, so visual fidelity isn't their top priority; fast response times are.
 
Whether 4K is noticeable depends on the screen size for me. On smaller screens I can't really tell the difference besides my icons being smaller if I don't set scaling.

4K for me is all about immersive screen size, not an increase in visual fidelity. I sit at normal desktop computer distance (~2 to 2.5 ft) away from my 43" 4K screen, and I love it.
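For what it's worth, whether 4K actually resolves extra detail at a given size and distance is just angular-resolution arithmetic. A quick back-of-the-envelope (my own numbers, assuming a 16:9 panel and the commonly cited ~60 pixels per degree for 20/20 vision):

```cpp
// Back-of-the-envelope pixels-per-degree calculator (illustrative, my own assumptions).
#include <cmath>
#include <cstdio>

int main() {
    const double pi = 3.14159265358979;
    struct Setup { const char* name; double diagonalIn, horizontalPx, viewDistIn; };
    const Setup setups[] = {
        {"43\" 4K at ~2.5 ft", 43.0, 3840.0, 30.0},
        {"32\" 4K at ~2.5 ft", 32.0, 3840.0, 30.0},
    };
    for (const Setup& s : setups) {
        // Panel width for a 16:9 diagonal, then linear pixel density.
        double widthIn = s.diagonalIn * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
        double ppi     = s.horizontalPx / widthIn;
        // Screen length subtended by one degree of visual angle at the viewing distance.
        double inchPerDeg = 2.0 * s.viewDistIn * std::tan(0.5 * pi / 180.0);
        std::printf("%-20s ~%.0f PPI, ~%.0f pixels per degree\n", s.name, ppi, ppi * inchPerDeg);
    }
    return 0;
}
```

By that crude yardstick a 32" 4K panel at desk distance is already past the ~60 pixels-per-degree mark, while 43" sits a little under it, which lines up with the "immersion rather than extra sharpness" experience.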
 
32" is the sweet spot for me with 4K. Nice and sharp. Also great size for actual work without snapping your neck at desk distances.

Anyhow, looking forward to how AMD tackles RT, maybe even turn some of the Raster Elite Purists on to some fun. :)
 
32" is the sweet spot for me with 4K. Nice and sharp. Also great size for actual work without snapping your neck at desk distances.

Anyhow, looking forward to how AMD tackles RT, maybe even turn some of the Raster Elite Purists on to some fun. :)
I just can't go back from my 50" 4K TV; having four 1080p RDP screens is a blessing for my job.
 
There are rumors of AMD's RT implementation being inferior/lower-IQ/slower than Nvidia's. I don't really expect them to do better than the RTX 2080 Ti on their first try, but maybe lightning can strike twice.
 
I remember a time when gamers used to whine that 720p@60 (and a stable 30 FPS) was more than enough for games. Screw 1080p+!

Now, 1440p@120 is the sweet spot and 40+ FPS isn't even considered playable anymore.

Interesting, huh?
And now console gamers will be able to play at 60fps and even 120fps at 1080p.
 
There are rumors of AMD's RT implementation being inferior/lower-IQ/slower than Nvidia's. I don't really expect them to do better than the RTX 2080 Ti on their first try, but maybe lightning can strike twice.
The leaks show it outperforming the 2080 Ti but falling short of the 3080... If that's indicative of actual performance, it's about in line with my expectations... OK for a first try, but not as good as Nvidia's second try. We'll have to see how it matures a bit and how real games test. I don't see how it would lower IQ for a given setting... if you're casting out the same rays using the same API, you should be getting the same results. If you meant you have to lower quality settings to get the same speeds, that would make sense, but it's more of an either/or, not both.
I'm mostly excited to see how it does in games that use it for lighting, as that's where RT makes the biggest difference to me.
 