Overclocked GeForce RTX 4070 SUPER vs Overclocked Radeon RX 7900 GRE Performance Comparison

Brent_Justice

Administrator
Staff member
Joined
Apr 23, 2019
Introduction

In this overclocking performance review, we are going to compare the NVIDIA GeForce RTX 4070 SUPER versus the AMD Radeon RX 7900 GRE in gaming performance benchmarks, with both cards overclocked to the max to show their maximum performance potential. It will be interesting to find out, when both video cards are overclocked, how they compare in […]

See full article...
 
Interesting review, glad to see the cards trading blows more or less. The only standout is RT.

But... what does it say that a mid-tier card is 600 bucks! Oof.
 
Interesting review, glad to see the cards trading blows more or less. The only standout is RT.

But... what does it say that a mid-tier card is 600 bucks! Oof.
RDNA 4 will comfortably beat the 4070 SUPER. Hopefully priced around the same as the 7900 GRE. Not sure when AMD wants to launch it, though.
 
RDNA 4 will comfortably beat the 4070 SUPER. Hopefully priced around the same as the 7900 GRE. Not sure when AMD wants to launch it, though.
...There'll be a '5070' of some sort coming too. The game of one-upping never ends. Hopefully AMD will have their RT up to par with Intel and Nvidia at least.

But... what does it say that a mid tier card is 600 bucks! OOf.
That people are willing to buy them!
 
...There'll be a '5070' of some sort coming too. The game of one-upping never ends. Hopefully AMD will have their RT up to par with Intel and Nvidia at least.


That people are willing to buy them!
Willing, or have no choice? I think the latter.
 
Hopefully AMD will have their RT up to par with Intel and Nvidia at least.
Will RT continue to be a thing now that AI is the latest “it”?

Seems like RT was really only there for Nvidia to have a marketing point over AMD and a potential Intel rivalry, and a good excuse to raise prices. But even today, Nvidia's talk is mostly all about DLSS and AI; they have totally shifted gears. Sure, they still mention RT because they beat the pants off everyone else with it, but it's mostly just become a marketing bullet point, not the item that gets slides and slides devoted to it.
 
Willing, or have no choice? I think the latter.
You're not forced to buy GPUs, so, 'willing' :)

Will RT continue to be a thing now that AI is the latest “it”?
Absolutely. 'AI' has almost no practical meaning for desktop users in general (until it does). Whereas we know that AMD's weak RT performance thus far has been holding back RT implementations in games, particularly due to the console effect.

Will RT continue to be a thing now that AI is the latest “it”?

Seems like RT was really only there for Nvidia to have a marketing point over AMD and a potential Intel rivalry, and a good excuse to raise prices. But even today, Nvidia's talk is mostly all about DLSS and AI; they have totally shifted gears. Sure, they still mention RT because they beat the pants off everyone else with it, but it's mostly just become a marketing bullet point, not the item that gets slides and slides devoted to it.
DLSS is still huge, and represents a significant advantage in overall refinement as well as having broad support. I see the level of support evening out over time, of course, but it's still something that Nvidia has simply developed further; anything else represents a compromise, in either performance (XeSS either not running on an Intel GPU, or having to run on an Intel GPU...) or FSR just being inferior.

For AI... marketing buzzwords and SEO are going to dominate marketing, reason be darned. Local 'AI' hardware is likely to be useful in augmenting LLMs for those who use them, but we're still waiting for those 'killer apps' to make a case for general consumer investment. Right now, it seems to me that these tech giants are mostly just capitalizing on fervor and not wanting to be seen as being behind on the new hotness.

I really cannot see local AI hardware today being of the same importance to gaming as RT and smart upscaling and frame generation technologies. Maybe tomorrow?
 
You're not forced to buy GPUs, so, 'willing' :)
You're not forced to buy a GPU, but if you want a GPU, and you want a mid-tier card, you're forced to spend $500+ for it. Unless you settle for a much less powerful card in the $200-$300 range.
 
You're not forced to buy a GPU, but if you want a GPU, and you want a mid-tier card, you're forced to spend $500+ for it. Unless you settle for a much less powerful card in the $200-$300 range.
You can always make compromises elsewhere, like your monitor and/or resolution, to get the same performance on a lower-end/cheaper card. But I think you can still play most games at 1440p/60 with high or near-maxed settings on a $300-$400 card these days. The RT numbers in this review are completely maxed out, and that doesn't make sense for most people anyway; if you care about optimization, you can get 95% of the image quality with 150% of the performance by tweaking the right settings.
 
RDNA 4 will comfortably beat the 4070 SUPER. Hopefully priced around the same as the 7900 GRE. Not sure when AMD wants to launch it, though.
I thought RDNA 4 was supposed to match the RTX 4080; either way, that's not what it's going to compete with.

But if it's priced right, it will force Nvidia to price their cards accordingly. Consumer win!
 
For AI... marketing buzzwords and SEO are going to dominate marketing, reason be darned. Local 'AI' hardware is likely to be useful in augmenting LLMs for those who use them, but we're still waiting for those 'killer apps' to make a case for general consumer investment. Right now, it seems to me that these tech giants are mostly just capitalizing on fervor and not wanting to be seen as being behind on the new hotness.

I really cannot see local AI hardware today being of the same importance to gaming as RT and smart upscaling and frame generation technologies. Maybe tomorrow?

Well, MS is betting big on AI on the desktop, so Nvidia will surely try to grab a large chunk of that market.
 