LazyGamer
FPS Junkie
- Joined
- Sep 5, 2020
- Messages
- 2,674
- Points
- 113
If there was any design target, it would have been RTX, because that's all anyone had for years. That means the issue was AMD not designing their hardware and drivers to meet the needs of games, something both companies have failed to do on occasion when a new technology becomes available. It's obvious that the developers designed around RTX hardware, not AMD's, for RT.
To be fair to AMD, their first-gen RT is probably better than Nvidia's 2000-series.
If the blame is focused on the denoisers, then wouldn't that mean AMD's denoiser is less efficient? Unless you want to count the Nvidia-specific denoiser they wrote for ray tracing. I'm not sure what magic it works behind the scenes in code, but it's Nvidia-specific, built to take advantage of specifically gated hardware. Not that I'm against that, just saying it isn't all roses.
Noted here: https://www.realtimerendering.com/raytracing.html
And also on Nvidia's page. Here's the quote from the source:
"Denoising is critical for real-time DXR performance when using path tracing or other Monte Carlo techniques. Alain Galvan's summary posts on ray tracing denoising and machine-learning denoising are good places to start. Zwicker et al. give a state of the art report about this area; note that it is from 2015, however, so is not fully up to date. Intel provides free code in their Open Image Denoise filter collection. The Quake II RTX demo includes shader code for the A-SVGF filter for denoising. NVIDIA has a developer access program for their denoiser. "
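For anyone wondering why denoising matters so much for path tracing in the first place: a one-sample-per-pixel Monte Carlo estimate is unbiased but extremely noisy, and a filter trades a little bias for a big variance reduction. Here's a deliberately crude sketch (a plain box filter on a flat 1D patch, nothing like A-SVGF or Nvidia's denoiser, and the Gaussian noise just stands in for path-tracing variance) showing the principle:

```python
import random
import statistics

random.seed(42)
TRUE_RADIANCE = 0.5   # flat patch: every pixel has the same true value
WIDTH = 64

# One noisy Monte Carlo sample per pixel.
noisy = [TRUE_RADIANCE + random.gauss(0.0, 0.2) for _ in range(WIDTH)]

def box_filter(pixels, radius=2):
    """Average each pixel with its neighbors (window clamped at the borders)."""
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

denoised = box_filter(noisy)

noisy_var = statistics.pvariance(noisy)
denoised_var = statistics.pvariance(denoised)
print(f"variance before: {noisy_var:.4f}, after: {denoised_var:.4f}")
```

Averaging a 5-pixel window cuts the per-pixel variance by roughly 5x on a flat region; the hard part, and where the real denoisers differ, is doing that without smearing edges and detail, which is why the A-SVGF and ML approaches in the quote above exist.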
I don't really see the big deal here. Nvidia had been working for nearly a decade to find a way to bring RT to real-time consumer graphics, and they brought their solution to market a few years earlier; it's entirely reasonable that games appear to 'favor' the platform that was actually available during their development.
You might recall how painful the release of DX9 was for Nvidia.