GeForce RTX 4080 (16 GB) Delivers Up to 30% Faster Gaming Performance Than 12 GB Model, According to NVIDIA Benchmarks

Tsing

The FPS Review
Staff member
NVIDIA has released a set of benchmarks that provide an early idea of how its GeForce RTX 4080 graphics cards might differ from one another.

 
Ok so the graph is ... interesting.

I'm going to go out on a limb and say the lesser 4080 roughly matches the current 3090 (non-Ti).

I can find a 3090 right now for less than the $899 MSRP of the 4080 Lesser Edition, and I can get that right now. Sure, you don't get DLSS 3 (or will you?), and I bet the Ada will probably still raytrace a bit better, but you do get double the VRAM...

Trying to decide how I feel about that. On one hand, it's nice that the performance has jumped as much as it has. On the other... the prices jumped right along with it... so I don't feel like we really gained anything on the price/performance curve this generation - just performance.

And sure, that doesn't sound like such a bad tradeoff, especially if you agree that Moore's Law is dead, that we should expect prices to go up, and that price vs. performance is a meaningless metric. But I would ask: what about the mid- and low-tier cards that just didn't exist last generation? If the new 4070 starts out at $899, and that used to be a $350 tier relative to performance in a given generation, have we just priced out a large part of our already niche marketplace?

[Attachment: NVIDIA's RTX 4080 benchmark chart]
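Rough back-of-the-envelope math on that price/performance point, in case anyone wants to play with it. Every number here is an illustrative assumption (a ~$870 3090 street price and roughly 3090-level performance for the 12 GB card), not a benchmark result:

# Hypothetical price/performance check; all numbers are assumptions for illustration.
cards = [
    {"name": "RTX 3090 (street price)", "price_usd": 870, "relative_perf": 1.00},
    {"name": "RTX 4080 12GB (MSRP)",    "price_usd": 899, "relative_perf": 1.00},  # assuming ~3090-class
]

for card in cards:
    perf_per_dollar = card["relative_perf"] / card["price_usd"]
    print(f"{card['name']}: {perf_per_dollar * 1000:.2f} relative-perf per $1,000")

# If performance and price go up by roughly the same factor, this number barely moves,
# which is the "more performance, same value" complaint above.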
 
If you want to stream, or if that enters the value proposition for you, the full AV1 encoders are nice on the NVIDIA cards.
 
If you want to stream, or if that enters the value proposition for you, the full AV1 encoders are nice on the NVIDIA cards.
Yeah, but you can get one of those on an Arc for... considerably less, and just use it as a second card dedicated to compression. If you trust Intel drivers to work, that is.
 
If I were to dare (well, I used to heli-ski and bungee jump IRL before the family 😂), I would say that Zotac is the next one forced out, EVGA-style. 😱
 
If you want to stream, or if that enters the value proposition for you, the full AV1 encoders are nice on the NVIDIA cards.
Might be cheaper to get an AMD AM5 CPU... I'd say wait for 13th-gen, but that isn't going to get new IGP tech. Maybe Intel will have it for 14th-gen.

Yeah, but you can get one of those on an Arc for... considerably less, and just use it as a second card dedicated to compression. If you trust Intel drivers to work, that is.
Pretty much coming to this conclusion as well. I do think Intel's transcoding drivers work, at least if Arc's hold up as well as the ones for their IGPs do.
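For anyone pricing out the second-card route, here's a minimal sketch of handing the stream encode to an Arc by driving ffmpeg from Python. The av1_qsv encoder exists in recent ffmpeg builds, but the input file name is hypothetical and the bitrate is a placeholder, so verify what your build supports with ffmpeg -encoders first:

import subprocess

# Sketch: hardware AV1 encode on an Intel Arc via ffmpeg's Quick Sync (QSV) encoder.
# Assumes an ffmpeg build with QSV/AV1 support; check with: ffmpeg -encoders | grep av1
cmd = [
    "ffmpeg",
    "-i", "gameplay_capture.mkv",  # hypothetical capture file
    "-c:v", "av1_qsv",             # AV1 encode on the Arc's media engine
    "-b:v", "6M",                  # target bitrate; tune for your platform
    "-c:a", "copy",                # pass the audio through untouched
    "stream_upload.mkv",
]
subprocess.run(cmd, check=True)

On a 40-series card the same pipeline should work with av1_nvenc instead, which is the "no second card needed" argument for going NVIDIA.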
 
Ok so the graph is ... interesting. [...] I can find a 3090 right now for less than the $899 MSRP of the 4080 Lesser Edition, and I can get that right now. [...]
I was actually looking into buying a 3090 just now, but decided I don't really need it. For $870 I'd probably change my mind, but it still costs significantly more here in Europe.
 
Ok so the graph is ... interesting. [...] I can find a 3090 right now for less than the $899 MSRP of the 4080 Lesser Edition, and I can get that right now. [...]
Yep, I posted about this one in another thread on the first day of Prime Early Access. Yesterday I even saw a Zotac AMP Extreme 3090 Ti for $999.
 
The lesser model really should have been a 4070.

It's moronic of them to muddy the waters with these stupid, confusing model names.
I disagree; the lesser model is a 4060 Ti and the top one is a 4070 in disguise.

Seems they left a lot of room for better 4080s, so as to be able to react to RDNA3 or something.
 
4070 in disguise, well that's what you are...
A rasterization pie with your brand new card
Reflections processors but not as many as you should
4070 in disguise, and overpriced...
 
But then they'd be selling an x070-series card for US$899, so... :)
Another part of that is:
I don't know that nVidia wanted to position an x70 as roughly equivalent to the previous generation's x90 (or what was formerly known as Titan).

It was a big jump in performance, and nVidia wants the naming to reflect the relative pricing, not the performance tier within the generation. Unfortunately, that means the generation-over-generation price/performance improvement we are used to seeing just got thrown out the window. My fear is that the lower end has already been entirely eliminated from the RTX lineup, and if they are shifting all the naming up, we may see the middle tier completely dissolve as well.
 
Yea... I didn't pounce on a 4090, but I don't expect availability to be as tight as it was during the COVID years. Plus, I want to wait to see the competition, though given the leap... yeah, I don't have high hopes. I just know that if I get a 4090, it will necessitate a 4K monitor to actually put that performance to use.
 
You beat me to it by seconds! :D

"The RTX 4080 12GB is a fantastic graphics card, but it’s not named right. Having two GPUs with the 4080 designation is confusing."

All I can think to say about the news from NVIDIA is "priceless".
Better they do it than the entire community ridicule them for it.
 
Better they do it than the entire community ridicule them for it.
The entire community was already ridiculing them for it, which I assume played a role in their decision to "unlaunch" the card.

The question open to speculation is how they choose to rebrand the card. It should be an RTX 4060 Ti at best, in my opinion, but I don't really care what they call it. I didn't even care about the original name, to be honest.
 
I disagree; the lesser model is a 4060 Ti and the top one is a 4070 in disguise.

Seems they left a lot of room for better 4080s, so as to be able to react to RDNA3 or something.

There is nothing objective that slots a card into the xx60 vs. xx70 (etc.) tier.

People make a lot out of the GX-1xx chip numbers, but that is really irrelevant; the mapping has moved all over the place over the generations.

There really is no objective measure as to what chip should slot into what marketing name each generation.

- GF104 was the GTX 460
- GK104 was the GTX 660 Ti through 690; everything from the 660 Ti up was a 104 die that gen. And then GK104 came back as the 760 and 770 the next gen...
- There was no GM104, but GM204 (same thing, just the second go at GM) was the 970 and 980 that gen
- GP104 was in many models that gen: some 1060 variants, the 1070, the 1070 Ti, and the 1080
- TU104 was the 2070 Super, 2080, and 2080 Super that gen.

It's best to think of the different generations as completely separate from each other. The model numbers are qualitative, not quantitative: a higher number means faster than a lower number, but by how much? No guarantees. There are also no guarantees about how they relate to similarly named cards from previous generations.
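To make that concrete, here's the list above as a rough lookup table; it's a sketch limited to the models mentioned in this thread, not an exhaustive mapping:

# The same "x04" die class landing in different marketing tiers across generations.
die_to_models = {
    "GF104": ["GTX 460"],
    "GK104": ["GTX 660 Ti", "GTX 670", "GTX 680", "GTX 690", "GTX 760", "GTX 770"],
    "GM204": ["GTX 970", "GTX 980"],
    "GP104": ["GTX 1060 (some variants)", "GTX 1070", "GTX 1070 Ti", "GTX 1080"],
    "TU104": ["RTX 2070 Super", "RTX 2080", "RTX 2080 Super"],
}

for die, models in die_to_models.items():
    print(f"{die}: {', '.join(models)}")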

My only complaint is that it is confusing to release two different chips under the same GPU name.

It's not the first time they've done it, though usually it's on mid- to low-end cards. There were something like three different versions of the GTX 460, plus a fourth SE variant just for fun. The 1030 came in two different versions with different types of RAM. Even just last gen there were two different versions of the 3080.

They need to stop this **** and have clearly delineated products that don't confuse the marketplace.
 