Alleged AMD RX 7900 XTX Geekbench Scores Show It As Fifteen Percent Faster than the GeForce RTX 4080 in Vulkan

Peter_Brosdahl

While we are still a week away from the official launch of AMD's latest flagship graphics card, PC enthusiasts are anxiously awaiting reviews and verified head-to-head comparisons with NVIDIA's GeForce RTX 4080, and this leak could be a taste of things to come.

See full article...
 
We already know that the RTX 40 series is stupidly overpriced, but if AMD is going to price its flagship (at $1,000) close to NVIDIA's already gouged price tag, it doesn't just need to trade blows with the 4080. It needs to OUTPERFORM the 4080 across the board (not just in a handful of cherry-picked titles) by at least 15% to earn its place as the second most powerful GPU on the market.
 
I don't think the 7900 XTX will completely beat the RTX 4080 on all fronts, especially if you consider DLSS 3.0, but I'm sure it will beat it in rasterization and compute.

But even if it's not an RTX 4080 killer, it will still force NVIDIA to lower its price.
 
Those are exactly my thoughts as well. I did a story not too long ago about the guy who convinced AMD to move to a chiplet design with these cards.

https://www.thefpsreview.com/2022/1...-explains-advantages-of-rdna3-chiplet-design/

If these XT and XTX cards do well, it will be a good sign of things to come, with them being the first out of the gate. It could be the answer AMD needs to counter DLSS without creating its own proprietary solution and having to get developers on board. In turn, that might not just get around DLSS hardware but also sink it.

It seems like the best they can ever do each generation is second or third place in GPUs, but I admit there are other things to consider as well, like price, power draw, size, and heat, since NVIDIA has gone full-on bonkers with all of the above (though I'll give team green props for making a heatsink the size of a house to tame the Ada beast).
 
I have my qualms about the so-called chiplet design. Everyone was expecting a multi-GPU chiplet; instead we basically got a single GPU with separate cache modules.

By the same token, Fury would have been the first chiplet design, maybe even more deservedly so.

On the DLSS front, it's here to stay. Even FSR 2.1 doesn't match the image quality of DLSS 2.x. It's also officially supported in far fewer games than DLSS, despite being open source. And no, hacks and mods don't count. To be fair, most new games support both, but still...
 