The next battleground for video cards... Ray Tracing and other next-gen features.

Grimlakin

Now that the market is warming up to RT in games, I expect to see a real race on RT performance. Just curious what that will look like. If AMD can swing taking the lead in the RT market AND the CPU market at the same time, that will put them in an AMAZING position for gaming systems.

A full AMD-driven gaming system. Something Nvidia has never been able to achieve. What do you all think?
 
I’d rather they push on traditional raster performance than RT, as faster RT doesn’t do much for overall performance right now.

For me, I don’t care if my PC is all AMD or not, I want the best components for my money - whatever that may be.

I ~am~ very excited for AMD right now, not because I am Team Red, but because the PC hardware space has been stagnant for too long and AMD is finally bringing about some meaningful competition, at least in the CPU space.

They have been more competitive GPU-wise, although they haven’t taken the performance crown any time recently. I do hope to see them gain more ground on the GPU front - but again, not because I want an all-AMD system, but because I think nVidia is taking advantage of the lack of competition right now and we are seeing that front stagnate as well.
 
I agree with you completely, Brian_B. Competition is healthy for all of us. Once we get some of that going in the video card space, maybe we will see the kind of price competition we are seeing in the CPU market.

Before the last five years, video card performance was mostly the domain of home users, studios, and CAD systems. Now, with the virtualized slicing of video cards, the big bread and butter for companies is in hypervisor-hosted video card solutions (we're talking 10k video cards, easily).

With the embrace of cloud gaming, and of cloud systems via VDI, we will see this become more and more the norm. I just want them to keep making good consumer cards too!
 
A full AMD-driven gaming system. Something Nvidia has never been able to achieve.
I'd love to do one just for the sake of it. I've been NV/Intel for so long I really want a change for the better, especially now that I've got a 3700X of my own and am really loving it. Just not sure how long it will take for AMD to compete at the 2080 Ti level, or even the 2080 Super level. If they can beat or equal a 2080 Ti then I'd probably get one for sure, just to see the PCIe 4.0 metrics at 4K/120Hz/HDMI 2.1.

It is interesting, though, how much coverage ray tracing got this year. Not all new features get that kind of attention. I got on board with PhysX back in the day and that fizzled. Sure, it's still around, but ever since they got rid of hardware acceleration it's about as useful as HairWorks. There's a good thread on it on the [H] forums. I do, however, think RT is a great way to go. A shame it's so hardware demanding. After picking up a number of games that have it, I've grown to know what to look for in games that don't. Some are indistinguishable but just as demanding, which speaks to how much horsepower is needed regardless of implementation. I've done extensive testing with RDR2 and am now doing a playthrough of Mass Effect Andromeda along with another of ROTTR. None of the three features RT, yet all three can punish even a 2080 Ti at max settings in 4K while still delivering impressive IQ.
 
RT is great when there's time available to compute the full simulated lighting model, but for real-time gaming it's still too weak for much beyond enhancing rasterization calculations where needed.

A low-quality RT image needs around 100-200 samples (rays) per pixel and still has to be denoised. The 2080 Ti can do roughly 20 samples per pixel per frame at 60 fps at 4K - impressive! - but nowhere near enough for even low-quality real-time ray-traced lighting. Add ray bounces into the calculation: 1 bounce gives you half of the above, 2 bounces a quarter. For any kind of real-time ray-traced image, 2 bounces is about the minimum for color bleed and global illumination, and to really do that justice you want 1,000+ samples.
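To put rough numbers on that budget, here's a quick back-of-the-envelope sketch in Python. The ~10 gigarays/sec figure is the peak number Nvidia quoted for the 2080 Ti, and the halving-per-bounce rule is the same rough approximation as above - treat it as an illustration, not a benchmark:

```python
# Rough ray-budget sketch (illustrative numbers, not a benchmark).
GIGARAYS_PER_SEC = 10e9               # peak figure Nvidia quoted for the 2080 Ti
WIDTH, HEIGHT, FPS = 3840, 2160, 60   # 4K at 60 fps

pixels_per_second = WIDTH * HEIGHT * FPS
samples_per_pixel = GIGARAYS_PER_SEC / pixels_per_second
print(f"primary samples per pixel per frame: {samples_per_pixel:.1f}")  # ~20

# Rough rule of thumb from above: each bounce halves the effective budget.
for bounces in range(3):
    effective = samples_per_pixel / 2 ** bounces
    print(f"{bounces} bounce(s): ~{effective:.1f} samples/pixel "
          "(vs. the 100-200+ needed for even a low-quality path trace)")
```

Even at zero bounces the budget comes up an order of magnitude short of the 100-200 samples mentioned above, which is why denoising and hybrid rasterization end up doing most of the work.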

So even if Nvidia increased the RT capability of their GPUs tenfold, rasterization would still be needed for the most part.

Now, what is very interesting - and Nvidia has been doing a lot of R&D on this as well, including open-source code - is AI rendering. To me that holds much more promising results. Using RT together with effective AI rendering could solve the problem, giving us more realism and easier development of future games that would just blow us away. I don't think the different parts of Turing's RT pipeline came together effectively; we'll have to see what Ampere brings as well as what AMD can do. It looks like, for the near future, RT will be used to help build a better model for the rasterization calculations. Further on, it could be radically different than what we have now.
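As a purely illustrative sketch of why reconstruction matters so much: the whole trick is getting a clean image out of very few samples per pixel. Below, a crude box filter stands in for the trained neural denoisers Nvidia has published - real AI reconstruction is far smarter, and every number here is made up - but it shows the principle:

```python
# Toy illustration of the "few rays + denoise" idea: recover quality by
# averaging noisy 1-sample estimates spatially instead of shooting more rays.
# A trained neural denoiser would replace the box filter in a real renderer;
# all values here are made up for the sake of the demonstration.
import numpy as np

rng = np.random.default_rng(0)
truth = np.full((64, 64), 0.5)                     # "ground truth" radiance
noisy = truth + rng.normal(0.0, 0.3, truth.shape)  # 1-sample Monte Carlo estimate

def box_filter(img, radius=2):
    """The crudest possible denoiser: a plain spatial average."""
    out = np.zeros_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

denoised = box_filter(noisy)
print("noisy RMSE:   ", np.sqrt(np.mean((noisy - truth) ** 2)))
print("denoised RMSE:", np.sqrt(np.mean((denoised - truth) ** 2)))
```

In this toy case the 5x5 average behaves like ~25 extra samples per pixel, which is roughly the gap between what the hardware can shoot in real time and what a clean image needs.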
 
Today's consumer-grade RTRT seems like it's just an "oooh, shiny (and reflecty)" tech demo, rather than a feature set that stands out on its own.

My gut tells me that real-time ray tracing hardware for the consumer segment is in such a stage of infancy that it will be at least another 5 years until it's matured enough to *start* to be worthwhile. It's also going to take game engine devs a lot more time to modify existing or create new game engines that will render RTRT efficiently and realistically... and time on top of that for game devs to effectively implement the capabilities of those game engines.

Reminds me of when the first SM3.0 (DX9.0c) games emerged, which resulted in the biggest complaint being that all it did was make things look wet and shiny. But then it matured over the years while game devs really started to utilize it properly, and the results were great. Similar to DX12 and SM 6.x implementations over the years.

I'm going to anxiously follow the growth of Vulkan 1.2 (and beyond) to see how the very first games utilize it compared to games 2-5 years down the road, after the learning curve is climbed.

That being said, I say bring it on, nVidia, AMD, and Intel! The more competing wrenches thrown into the continued development of it, the better it will become in less time.
 
I mostly agree. I bought around 3-6 games that support RT. Shadow of the Tomb Raider really is something to experience. Metro Exodus is my 2nd favorite, and then Control. Quake, of course, is amazing but pounds even the fastest cards right now, which also totally supports what you said. I will say that having the right display goes hand in hand with it. As much as I love my 27" 1440p/144Hz/G-Sync monitor, it's virtually useless when I want to appreciate RT in games. Put something on either the Z9D or C3 and the wow factor reaches another level.

I mainly bought my 2080 Ti because I was escaping SLI and not backing down on 4K gaming. It's mostly delivered on that, and I've been happy to get a taste of what RT is. I recently got the 2080 Super because I wanted to enjoy G-Sync on the C3 while not breaking the bank for my latest build, and I've been really happy with it. For top dollar value a 2070 Super is still the best bet, but the 2080 Super, when overclocked, creeps into reference 2080 Ti performance territory. Doesn't hurt that I got it for a good $750 BF deal either. Pretty sure I'll be holding out for a 5950 for my next card though, to take that rig to a true 4K performance level.

I totally agree. Bring it on everyone! PC gaming seems to be awesome lately and I think we're all loving it.
 