NVIDIA RTX Muddies AMD Radeon RX 6000 Series’ Level of Ray-Tracing Support

Tsing (The FPS Review staff)
[Image: AMD Radeon RX 6900 XT. Images: AMD]



It’s too early to tell how AMD’s Radeon RX 6000 Series graphics cards will cope with ray tracing, but what we can definitely say is that they’ll include pervasive support for the demanding rendering technique.



AMD’s marketing department made that abundantly clear in a new statement today, confirming that the Radeon RX 6900 XT, Radeon RX 6800 XT, and Radeon RX 6800 will support most of today’s ray-traced titles, many of which leverage Microsoft’s DirectX API.



“AMD will support all ray tracing titles using industry-based standards, including the Microsoft DXR API and the upcoming Vulkan raytracing API,” the...

Continue reading...
 
I am all for open standards and history has proven that those win out over time.
 
AMD in it for the long haul ... marathon, not a sprint
 
NVIDIA is using DXR and Vulkan Ray Tracing just like AMD. "RTX" is just branding for their middleware that takes advantage of NVIDIA-specific hardware.
 
NVIDIA is using DXR and Vulkan Ray Tracing just like AMD. "RTX" is just branding for their middleware that takes advantage of NVIDIA-specific hardware.
I always wondered about that.
 
In most cases, yes, but for Cyberpunk and the other listed titles they must be using GameWorks, because they knew AMD would have an RT card around its launch. Too bad for them; I'm upgrading to Ryzen very soon, and I'm going to swap my 2080 Ti for a 6900 XT. That combo performance boost can't be ignored and neither can the price difference.

CDPR will likely update the game with support for AMD using DX12; they're already adding it to the consoles, so it has to work on AMD hardware.
 
CDPR will definitely update the game to use regular DX12 RT functions. They have to make an AMD-compatible version for the consoles, so... yeah.

Still a dick move by NVIDIA (and I own a 2080 Ti). They must be using some special GameWorks API they developed just for themselves, which means NVIDIA paid the developers to include it and the developers don't have access to the code, so they can't optimize it... just like the tessellation issue in The Witcher 3.

Guess I won't be doing my first Cyberpunk run with ray tracing, because (if I can get one) I'm buying a 6900 XT to take advantage of the performance boost when paired with 3rd-gen Ryzen.
 
That combo performance boost can't be ignored
What combo performance boost? Even the stickers aren't the same color. Instead you should get them color-coordinated, with a green GPU and a green CPU ;).
CDPR will likely update the game with support for AMD using DX12; they're already adding it to the consoles, so it has to work on AMD hardware.
Hopefully it will be that straightforward for them, but since AMD's equivalent of DLSS is still on the drawing board, said port is likely to underperform in terms of visuals or framerate, if not both.
 
CDPR will definitely update the game to use regular DX12 RT functions. They have to make an AMD-compatible version for the consoles, so... yeah.

Still a dick move by NVIDIA (and I own a 2080 Ti). They must be using some special GameWorks API they developed just for themselves, which means NVIDIA paid the developers to include it and the developers don't have access to the code, so they can't optimize it... just like the tessellation issue in The Witcher 3.

Guess I won't be doing my first Cyberpunk run with ray tracing, because (if I can get one) I'm buying a 6900 XT to take advantage of the performance boost when paired with 3rd-gen Ryzen.
The game is already using "regular DX12 RT functions." You misunderstand what middleware is.
 
The game is already using "regular DX12 RT functions." You misunderstand what middleware is.

Please explain it. Because honestly at this point when it comes to RT I have no effing clue.
 
Please explain it. Because honestly at this point when it comes to RT I have no effing clue.
Being middleware means it's a set of optimized functions and code paths built on top of standard APIs to achieve a certain desired outcome. The code was developed by NVIDIA engineers to use DirectX to achieve optimal ray-traced effects on NVIDIA hardware. It's why another middleware NVIDIA develops, GameWorks, can run on AMD hardware but is slower in comparison. In other words, games that use RTX for ray-tracing effects can already support ray tracing on other hardware, because the underlying API is hardware-agnostic.
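To make the "hardware-agnostic" part concrete, here's a minimal sketch (my own illustration, not from the article or from RTX itself) of how an engine asks the standard Direct3D 12 API whether ray tracing is available. The same query works against any vendor's driver:

```cpp
// Sketch: query DXR support through the standard D3D12 API (link d3d12.lib).
// Any vendor's driver reports its ray-tracing tier here; the API is vendor-neutral.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter (NVIDIA, AMD, or otherwise).
    ComPtr<ID3D12Device5> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No suitable D3D12 device available.\n");
        return 1;
    }

    // Ask the driver which DXR tier it exposes, if any.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("DXR supported (tier %d).\n",
                    static_cast<int>(opts5.RaytracingTier));
    }
    else
    {
        std::printf("DXR not supported on this adapter/driver.\n");
    }
    return 0;
}
```

The vendor-specific work (RTX on NVIDIA, the ray accelerators on RDNA 2) lives in the driver underneath that call, which is why DXR titles can, in principle, run on both.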
 
I am all for open standards and history has proven that those win out over time.
yeah, like opengl, cl, xl, lol

No, wait...

I'm still waiting for Vulkan to take over the world (and it should...)
 
yeah, like opengl, cl, xl, lol

No, wait...

I'm still waiting for Vulkan to take over the world (and it should...)
Same. Unfortunately, Vulkan Ray Tracing isn't out yet.

Microsoft sabotaged OpenGL, which is why it didn't gain traction in games despite being superior to Direct3D in many ways. My professor for OpenGL at university was Richard Wright, who represented Real 3D on the ARB back in the day, and he always told his incoming classes the story about Microsoft vs. the ARB and OpenGL.
 
NVIDIA is using DXR and Vulkan Ray Tracing just like AMD. "RTX" is just branding for their middleware that takes advantage of NVIDIA-specific hardware.
I think Vulkan RT isn't finished yet, which is why NVIDIA had to create its own extensions, like they did with OpenGL for other technologies. AMD could do their own or wait until Vulkan RT is finalized.
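You can see that split directly in Vulkan: NVIDIA shipped its own VK_NV_ray_tracing extension ahead of the standard, while the cross-vendor VK_KHR_ray_tracing_pipeline extension is what the finalized spec is expected to use. A rough sketch (mine, assuming only a stock Vulkan loader and headers) of checking which one a driver advertises:

```cpp
// Sketch: list which ray-tracing extensions each Vulkan device exposes.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main()
{
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS)
        return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus)
    {
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool hasNv = false, hasKhr = false;
        for (const VkExtensionProperties& e : exts)
        {
            // Vendor-specific extension vs. the cross-vendor one.
            if (std::strcmp(e.extensionName, "VK_NV_ray_tracing") == 0)           hasNv = true;
            if (std::strcmp(e.extensionName, "VK_KHR_ray_tracing_pipeline") == 0) hasKhr = true;
        }
        std::printf("NV extension: %s, KHR extension: %s\n",
                    hasNv ? "yes" : "no", hasKhr ? "yes" : "no");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

Once drivers expose the finalized KHR extensions, engines can target them regardless of vendor, the same way they already target DXR on Windows.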
 
Same. Unfortunately, Vulkan Ray Tracing isn't out yet.

Microsoft sabotaged OpenGL, which is why it didn't gain traction in games despite being superior to Direct3D in many ways. My professor for OpenGL at university was Richard Wright, who represented Real 3D on the ARB back in the day, and he always told his incoming classes the story about Microsoft vs. the ARB and OpenGL.
How did MS sabotage OpenGL? I'm genuinely asking.

I think OpenGL was pretty good but failed to adopt new technologies quickly, hence the need for extensions. It was better and faster than DX, but again, the lack of support and features made it quickly fade from the PC space.
 
I'm still waiting for Vulkan to take over the world (and it should...)
At least in terms of having support for it, since it's so close to DX12, and forms a basis for Android and desktop Linux support.
I think OpenGL was pretty good but failed to adopt new technologies quickly, hence the need for extensions. It was better and faster than DX, but again, the lack of support and features made it quickly fade from the PC space.
OpenGL was still hugely focused on commercial work, IIRC, to the point of neglecting the gaming / real-time graphics side of things. And I'm not sure it would have mattered. Once Microsoft went feet-first into gaming, there was little reason for vendors to try to maintain an alternative.
I am all for open standards and history has proven that those win out over time.
The 'open' standards usually arise after proprietary implementations 'show the way'. And they don't always win out, and aren't always better.

Sometimes 'open' just means unfocused, with a variety of implementations that don't necessarily work together, and plenty of 'poor' examples: FreeSync, which is now effectively being standardized by Nvidia; OpenGL, which was never a good fit for gaming and has been abandoned (but was also best supported by Nvidia, especially on Linux!); and, speaking of Linux, every attempt at a Linux 'desktop' so far.

Sometimes people just want something that will actually get the job done. Sometimes they actually, god forbid, want the best tool for the job!

Many times it takes a leader to get that done. Not all the time; Linux, for example, has taken over the world, and I think it's only a matter of time before Microsoft ports their desktop to the Linux kernel. But it would take someone like Microsoft to actually do that and force some standardization in the stack before it will really be useful across the broad range of end-user applications.
 
How did MS sabotage OpenGL? I'm genuinely asking.

I think OpenGL was pretty good but failed to adopt new technologies quickly, hence the need for extensions. It was better and faster than DX, but again, the lack of support and features made it quickly fade from the PC space.
Microsoft threatened not to support OpenGL at all in their operating systems if the ARB started advertising and marketing it to game companies, because they wanted their own API to become the de facto standard in 3D-accelerated real-time graphics. It's why OpenGL was always associated with CAD and similar productivity software during that time.

The only time OpenGL really lagged was when the pixel shader pipeline was developed. Microsoft was first to market with a viable model in DirectX 8, while OpenGL only had hardware-specific extensions for pixel shaders until the release of GLSL in 2004. Even then, most implementations were using NVIDIA-specific extensions, which is why games like Doom 3 ran horribly on ATi hardware that fell back to the generic fixed-function path. By the time OpenGL 4.0 was released in 2010, it had achieved near-feature parity with Direct3D.
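As a side note on what "hardware-specific extensions" looked like in practice, here's a rough sketch (my own illustration, not code from Doom 3 or from the post) of the old-school extension probing engines did before GLSL was in core; GL_ARB_fragment_shader is the cross-vendor path, GL_NV_fragment_program the NVIDIA-specific one:

```cpp
// Sketch: legacy OpenGL extension probing (requires a current GL context,
// created elsewhere via SDL/GLFW/etc.).
#ifdef _WIN32
#include <windows.h>
#endif
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

bool hasExtension(const char* name)
{
    // Pre-GL 3.0 query: one space-separated string of extension names.
    const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return exts != nullptr && std::strstr(exts, name) != nullptr;
}

int main()
{
    if (hasExtension("GL_ARB_fragment_shader"))
        std::printf("Cross-vendor ARB fragment shaders available.\n");
    else if (hasExtension("GL_NV_fragment_program"))
        std::printf("Falling back to the NVIDIA-specific path.\n");
    else
        std::printf("Fixed-function fallback.\n");
    return 0;
}
```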
At least in terms of having support for it, since it's so close to DX12, and forms a basis for Android and desktop Linux support.

OpenGL was still hugely focused on commercial work, IIRC, to the point of neglecting the gaming / real-time graphics side of things. And I'm not sure it would have mattered. Once Microsoft went feet-first into gaming, there was little reason for vendors to try to maintain an alternative.

The 'open' standards usually arise after proprietary implementations 'show the way'. And they don't always win out, and aren't always better.

Sometimes 'open' just means unfocused, with a variety of implementations that don't necessarily work together, and plenty of 'poor' examples: FreeSync, which is now effectively being standardized by Nvidia; OpenGL, which was never a good fit for gaming and has been abandoned (but was also best supported by Nvidia, especially on Linux!); and, speaking of Linux, every attempt at a Linux 'desktop' so far.

Sometimes people just want something that will actually get the job done. Sometimes they actually, god forbid, want the best tool for the job!

Many times it takes a leader to get that done. Not all the time; Linux, for example, has taken over the world, and I think it's only a matter of time before Microsoft ports their desktop to the Linux kernel. But it would take someone like Microsoft to actually do that and force some standardization in the stack before it will really be useful across the broad range of end-user applications.
The perception of OpenGL not being good for gaming was created by Microsoft. As I said above, it really wasn't until pixel shaders were developed that it lagged behind Direct3D. I think one of the best examples of the contrast between a game using OpenGL and Direct3D was the original Half-Life release. The difference in how that game looked and ran under DirectX 7 versus OpenGL was stark, in OpenGL's favor.
 
Microsoft threatened not to support OpenGL at all in their operating systems if the ARB started advertising and marketing it to game companies, because they wanted their own API to become the de facto standard in 3D-accelerated real-time graphics. It's why OpenGL was always associated with CAD and similar productivity software during that time.

The only time OpenGL really lagged was when the pixel shader pipeline was developed. Microsoft was first to market with a viable model in DirectX 8, while OpenGL only had hardware-specific extensions for pixel shaders until the release of GLSL in 2004. Even then, most implementations were using NVIDIA-specific extensions, which is why games like Doom 3 ran horribly on ATi hardware that fell back to the generic fixed-function path. By the time OpenGL 4.0 was released in 2010, it had achieved near-feature parity with Direct3D.

The perception of OpenGL not being good for gaming was created by Microsoft. As I said above, it really wasn't until pixel shaders were developed that it lagged behind Direct3D. I think one of the best examples of the contrast between a game using OpenGL and Direct3D was the original Half-Life release. The difference in how that game looked and ran under DirectX 7 versus OpenGL was stark, in OpenGL's favor.

Hadn't heard about that one. I do vaguely recall that MS was going to drop OpenGL support in Vista or Win7.

Thing is, for years OpenGL lagged even on Linux. AMD's terrible drivers didn't help either. Even Android support was lagging; there's a reason NVIDIA stayed on top in performance for so long (using extensions, nonetheless...).

Back in the Quake days, OpenGL was the renderer of choice. id, Valve, and Epic were big supporters. I would always choose OpenGL with the Quake, Unreal, and Half-Life series. But a few years later only id was still using it (probably when shaders came out).
 
Hadn't heard about that one. I do vaguely recall that MS was going to drop OpenGL support in Vista or Win7.

Thing is, for years OpenGL lagged even on Linux. AMD's terrible drivers didn't help either. Even Android support was lagging; there's a reason NVIDIA stayed on top in performance for so long (using extensions, nonetheless...).

Back in the Quake days, OpenGL was the renderer of choice. id, Valve, and Epic were big supporters. I would always choose OpenGL with the Quake, Unreal, and Half-Life series. But a few years later only id was still using it (probably when shaders came out).
Good info in this thread. It puts into more detail what I was trying to relate from what was shared with us at university. There was a lot of apparent animosity when Richard Wright talked about it, but there is truth in it. Microsoft left the ARB in 2003 because they were no longer interested in collaborating with the board. They would take their own initiative to further develop the DirectX API by working with the industry on their own terms.

https://www.overclockers.co.uk/forums/threads/a-brief-history-of-opengl.18573678/
 