NVIDIA GeForce RTX 5080 Founders Edition Video Card Review

Brent_Justice (Administrator, Staff member; joined Apr 23, 2019)
Introduction

The NVIDIA GeForce RTX 5080 is NVIDIA's next-generation graphics card, featuring the new Blackwell architecture at an MSRP of $999. The GeForce RTX 5080 is the second GPU in NVIDIA's new RTX 50 Series lineup, a tier down from the flagship enthusiast GeForce RTX 5090, aimed at providing improved gaming performance with new technologies like DLSS 4, Neural Shaders, Mega Geometry, and Multi Frame Generation. Today we are going to review the NVIDIA GeForce RTX 5080 Founders Edition with a focus on the gameplay experience, covering 4K raster performance, 1440p Ray Tracing performance, and DLSS Upscaling performance. The […]

See full article...
 
Long story short: get an RTX 4080 while you can.

No surprise here. Given the specs, I didn't expect the RTX 5080 to do any better.

Hopefully the RTX 5070 Ti will be a much better option, as it's supposed to land right behind the RTX 4080.

I'm very disappointed. I can't remember the last time a next-gen x80 card didn't beat, or at least match, the previous generation's next tier up.
 
I remember when I got my 1070, it was supposed to be on par with the 980 Ti.
 
Looks like the 5080 is a dud. I hope people read reviews before buying one on eBay for hundreds more. Maybe holding on to the 4090 is a good idea. But then, the 4090 just increased in resale value!
 
So why is the 5090 not anywhere near twice as fast as the 5080? What's holding it back?
 
So why is the 5090 not anywhere near twice as fast as the 5080? What's holding it back?
Honestly, I'd love to see a deep dive on whether the 5090 is being constrained by the CPU. I've seen some side mentions in a couple of sources saying it seems that way, but no deep dive.
 
Reminds me of the 1000 series to 2000 series jump, only without the DLSS carrot there to give you a reason to bite.
 
On a side note, I've seen a few posts claiming that the new transformer DLSS model is a big quality improvement over the old one. So much so that DLSS Balanced mode is said to be indistinguishable from Quality mode, and even Performance mode is much better than before.
 
Honestly, I'd love to see a deep dive on whether the 5090 is being constrained by the CPU. I've seen some side mentions in a couple of sources saying it seems that way, but no deep dive.
That should be pretty easy to test with some time available.

System 1: Ryzen 9700X downclocked, or even just set to 65 W
System 2: Ryzen 9800X3D pushed as high on clocks as you can go

Run both systems at 4K and see if there is any significant difference.
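The two-system comparison above can be sketched as a small analysis script. Everything here is illustrative: the FPS samples are made up, and the 3% threshold is a judgment call rather than an established cutoff.

```python
import statistics

# Hypothetical average-FPS samples at 4K, one value per benchmark pass;
# real numbers would come from whatever capture tool you use.
fps_9700x_eco = [98.2, 97.8, 98.5, 97.9]   # System 1: downclocked/65 W 9700X
fps_9800x3d_oc = [99.1, 98.6, 99.4, 98.8]  # System 2: 9800X3D pushed on clocks

def summarize(samples):
    """Return (mean, stdev) so run-to-run noise is visible next to the delta."""
    return statistics.mean(samples), statistics.stdev(samples)

mean1, sd1 = summarize(fps_9700x_eco)
mean2, sd2 = summarize(fps_9800x3d_oc)
delta_pct = (mean2 - mean1) / mean1 * 100

print(f"System 1: {mean1:.1f} fps (+/-{sd1:.2f})")
print(f"System 2: {mean2:.1f} fps (+/-{sd2:.2f})")
print(f"Delta: {delta_pct:.1f}%")

# Rule of thumb: if the delta sits within run-to-run noise, the GPU,
# not the CPU, is the limiter at this resolution.
cpu_limited = delta_pct > 3.0  # threshold is a judgment call
print("Likely CPU-limited at 4K" if cpu_limited else "Likely GPU-limited at 4K")
```

With these made-up numbers the delta is under 1%, so the sketch would call it GPU-limited; a big CPU swing at 4K would argue the opposite.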
 
I mean, it's a BIT more than that. You might want to use Intel's monitoring tools to see how things are working and better pinpoint where the bottleneck lies between the platforms.
 
Honestly, I'd love to see a deep dive on whether the 5090 is being constrained by the CPU. I've seen some side mentions in a couple of sources saying it seems that way, but no deep dive.

That's an interesting idea; give me some ideas (examples) of what to test to accomplish this. I also think it is true that the 5090 is being held back by the CPU; I said as much in my review.
 
That should be pretty easy to test with some time available.

System 1: Ryzen 9700X downclocked, or even just set to 65 W
System 2: Ryzen 9800X3D pushed as high on clocks as you can go

Run both systems at 4K and see if there is any significant difference.

Thanks for that suggestion; as I said above, I personally think that is a great idea to test. Right now, something like that won't happen soon, but I've got it written down in my potential-reviews doc.
 
That's an interesting idea, give me some ideas (examples) of what to test to accomplish this. I also think it is true that the 5090 is being held back by the CPU, I said as much in my review.

It would need to be basic: at 4K, at whatever refresh rate, is the GPU at 100% while the CPU is running at max GHz with high utilization across cores? That's how I would logically expect a held-back device to look.


But it would need to be a slim test, perhaps raster only... I'm not sure on that.

Jay has a video on how to use the Intel tool to track whether the CPU or GPU is throttling. *shrugs* It seems kind of esoteric to me. All the info I've seen calling out CPU throttling has been at higher resolutions, where NORMALLY that would happen at lower resolutions... so it's interesting.


PresentMon is the tool you can use... just an idea. Jay's video has some info on how to use it, but there are better tutorials online.
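For the PresentMon route, the idea is to compare how long the GPU is actually busy each frame against the total frame time. This is only a sketch: the CSV snippet is invented, and the column names ("msBetweenPresents", "msGPUActive") follow one PresentMon version; newer releases name the columns differently, so check your own capture's header first.

```python
import csv
import io

# Tiny inline sample standing in for a real PresentMon capture.
# Column names are assumptions for one PresentMon version.
sample_csv = """msBetweenPresents,msGPUActive
8.40,6.10
8.35,6.05
8.50,6.20
8.45,6.15
"""

frames = list(csv.DictReader(io.StringIO(sample_csv)))
frame_ms = [float(r["msBetweenPresents"]) for r in frames]
gpu_ms = [float(r["msGPUActive"]) for r in frames]

# Fraction of each frame interval the GPU spent doing work.
gpu_busy_ratio = sum(gpu_ms) / sum(frame_ms)
print(f"Average frame time: {sum(frame_ms) / len(frame_ms):.2f} ms")
print(f"GPU busy ratio: {gpu_busy_ratio:.0%}")

# If the GPU sits idle for a meaningful slice of each frame, something
# upstream (usually the CPU) is feeding it too slowly.
if gpu_busy_ratio < 0.95:
    print("GPU has idle time each frame -> possible CPU bottleneck")
else:
    print("GPU is saturated -> GPU-bound")
```

In this toy data the GPU is busy only about 73% of each frame, which is the signature people describe when they suspect the 5090 is CPU-constrained.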
 
Gotta wonder if a cutdown GB202 die is on the table for a $1499 5080 Ti w/ 24 GB VRAM.
 
It appears the 5080 is a good overclocker. If that is consistent across a bunch of samples then it's a great candidate for water cooling to open up some possible additional headroom.
 
Good point. A lot of folks are complaining about how close this is to a 4080 SUPER, but things could get really interesting when OC'd and liquid-cooled, since you don't have to worry about hitting the 600 W power limit of the cable.
 
Good point. A lot of folks are complaining about how close this is to a 4080 SUPER, but things could get really interesting when OC'd and liquid-cooled, since you don't have to worry about hitting the 600 W power limit of the cable.
The J2C video on it shows another 12-ish percent gain when overclocked. That would mean roughly a 24% generational improvement, which is what we'd typically expect.
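One nit on that math: the stock generational gain and the overclocking gain multiply rather than add. A quick check, assuming an ~11% stock uplift (an illustrative figure, not a measured one) on top of the reported 12% OC headroom:

```python
# Gains compound multiplicatively, not additively.
stock_gen_gain = 0.11   # assumed 5080-over-4080 uplift at stock, for illustration
oc_gain = 0.12          # the ~12% overclocking headroom reported in the J2C video

combined = (1 + stock_gen_gain) * (1 + oc_gain) - 1
print(f"Combined uplift: {combined:.1%}")  # roughly 24%, in line with the post
```

So "roughly 24%" holds, but only if the stock uplift really is in that low-double-digit range, and only when comparing an overclocked 5080 against a stock 4080.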
 
The J2C video on it shows another 12-ish percent gain when overclocked. That would mean roughly a 24% generational improvement, which is what we'd typically expect.
Sorta....

A 15-25% generational improvement is what I have more or less come to expect, before overclocking, though.

Overclocking, I've always seen as icing on the cake. If I can get it, great. If I don't, well, it wasn't guaranteed.

The 5080 is in a bad spot right now. I mean, at least it doesn't cost ~more~ than a 4080 did. And it is a valid upgrade path for folks on older cards. But it doesn't make a compelling case to upgrade: "Here's this new card; it's the same as the old card, only it supports more proprietary stuff that is cool in a handful of games, with fake frames."

Ideally, they would have just rebadged it as a 4080 Ti Super Ultra, and everyone would have been thrilled! 40 W lower TDP and support for the latest DLSS, for the same price! Or shift the name and call this the 5070 Ti (but then you have to figure out pricing; although, they shifted the 5090 up, so why not slot this into the $1k slot and slide a different 5080 bin in between? I mean, why not, NVIDIA buyers aren't real sensitive to price brackets).

But you name it 5080, and people are looking for generational improvements. And it's ... slim there.
 
Sorta....

A 15-25% generational improvement is what I have more or less come to expect, before overclocking, though.

Overclocking, I've always seen as icing on the cake. If I can get it, great. If I don't, well, it wasn't guaranteed.

The 5080 is in a bad spot right now. I mean, at least it doesn't cost ~more~ than a 4080 did. And it is a valid upgrade path for folks on older cards. But it doesn't make a compelling case to upgrade: "Here's this new card; it's the same as the old card, only it supports more proprietary stuff that is cool in a handful of games, with fake frames."

Ideally, they would have just rebadged it as a 4080 Ti Super Ultra, and everyone would have been thrilled! 40 W lower TDP and support for the latest DLSS, for the same price! Or shift the name and call this the 5070 Ti (but then you have to figure out pricing; although, they shifted the 5090 up, so why not slot this into the $1k slot and slide a different 5080 bin in between? I mean, why not, NVIDIA buyers aren't real sensitive to price brackets).

But you name it 5080, and people are looking for generational improvements. And it's ... slim there.
I'm grabbing one tomorrow. Any way I slice it, it'll be a massive improvement over my 2080 Ti. The latest-gen NVENC encoders are icing on the cake for me. I'm also going to put it on water and, if needed, overclock it.

I wish reviewers would include encoder performance from gen to gen. I know it's a hard metric to track, but it would be nice.
 