NVIDIA Announces GeForce RTX 4090 ($1,599) and GeForce RTX 4080 16/12 GB ($1,199/$899) Graphics Cards: Up to 4X Faster, Powered by 3rd Gen RTX Architecture

Tsing

NVIDIA has announced the GeForce RTX 4090 and GeForce RTX 4080, the first graphics cards in the new Ada Lovelace-powered GeForce RTX 40 Series.

 
I think what everyone is waiting to see isn't features or performance; it's the power draw. How much juice are these things gonna suck outta the wall? Prices are no better than last-gen, but thankfully no worse either. No mention of a 4070 yet, I see. And only the 4090 has a release date, from what I've seen thus far.
 

GeForce RTX 4080 (16 GB)

  • RTX 4080 (16 GB) Starting at $1,199.00
  • Twice as fast as the GeForce RTX 3080 Ti, using nearly 10% less power
  • 9,728 CUDA Cores, 780 Tensor-TFLOPs, 113 RT-TFLOPs, 49 Shader-TFLOPs of power, and GDDR6X memory
 
The pricing keeps going up much faster than inflation each generation.

This industry is all messed up: a completely distorted market without real competition, and it has been for 10 years.
Prices are no better than last-gen, but thankfully no worse either.

I'd argue that the prices are worse. The market conditions (extreme mining demand combined with supply shortages) aren't there anymore to drive up pricing the way they did. These prices should be coming down. IMHO, we ought to be at 900-series levels, but adjusted for inflation.

In other words, an xx80 GPU for $699, an xx80 Ti for about $799, and an xx90/Titan-level card for about $1,200.
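As a back-of-envelope check on those figures, here's a minimal sketch: the launch MSRPs are the historical ones, but the ~25% cumulative 2014-2022 inflation factor is my own rough assumption.

```python
# Rough inflation adjustment of Maxwell-era launch MSRPs.
# The ~25% cumulative 2014 -> 2022 CPI factor is an assumption for illustration.
CUMULATIVE_INFLATION = 1.25

launch_msrps = {
    "GTX 980 (xx80)": 549,        # Sept 2014
    "GTX 980 Ti (xx80 Ti)": 649,  # June 2015
    "GTX Titan X (Titan)": 999,   # March 2015
}

for card, msrp in launch_msrps.items():
    print(f"{card}: ${msrp} at launch ~= ${msrp * CUMULATIVE_INFLATION:,.0f} in 2022 dollars")
```

Which lands pretty close to the $699/$799/$1,200 tiers suggested above.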

I think they are doing this because they gobbled up most of the available production capacity from TSMC for this gen, leaving AMD with lower volumes than they'd hoped to work with, so if people are pissed and want to shop elsewhere, AMD may not be able to provide an alternative.

This is why a market needs 3-5 players to be truly competitive.

Last time, the prices were made extreme by everyone in the supply chain (AIBs, retailers, scalpers) taking their slice of the pie. This time Nvidia has decided they want it for themselves, and **** the fact that the market fundamentals don't support those prices anymore.

I keep hoping that at some point customers will be pissed enough to just not buy, but thus far the undisciplined ADHD hordes have never ceased to amaze me with just how undisciplined they can be, and how they keep sustaining this bad behavior from the industry by throwing money at it. If there is a single reason we can point to, it is that pandemic and crypto pricing showed the industry that the kids are willing to pay **** near anything to get their hands on the latest GPUs, so it can just keep raising prices, and customers will keep coming back and paying more and more for the same thing (generationally adjusted).

This is typically what economists call "price inelasticity of demand": demand, unlike what we usually expect, doesn't drop off when prices go up. Usually we only see it for absolute necessities (food, water, fuel), not for toys.
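For reference, the textbook measure is the ratio of the percent change in quantity demanded to the percent change in price; |E| < 1 means inelastic. A minimal worked example with invented GPU numbers, just to show the arithmetic:

```python
def arc_elasticity(q1, q2, p1, p2):
    """Arc (midpoint) price elasticity of demand: %dQ / %dP."""
    pct_dq = (q2 - q1) / ((q1 + q2) / 2)
    pct_dp = (p2 - p1) / ((p1 + p2) / 2)
    return pct_dq / pct_dp

# Hypothetical: the xx80 tier jumps from $699 to $1,199 and unit sales
# only dip 10% -- invented figures purely for illustration.
e = arc_elasticity(q1=100, q2=90, p1=699, p2=1199)
print(f"elasticity = {e:.2f}")  # ~ -0.20, i.e. |E| < 1: demand is inelastic
```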
 
So, realistically, we can expect the 4080 12 GB to be selling for $1,500, the 16 GB version to be $1,900, and the 4090 to be somewhere around $2,500.

Have to remember, Nvidia already announced that they are going to limit the supply of these cards initially, which means any supply that does hit the market will get snatched up by scalpers or will already be marked up at retail.
 
I wonder if DLSS 3 works on Ampere/Turing, even if only with a reduced performance uplift.
Seeing that FSR 2 brought parity with DLSS 2 on IQ/performance (with Nvidia still slightly edging it on both fronts), I expected Nvidia to bring DLSS Performance mode up to Quality-mode image quality. That seems to be the case.
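For context on what that trade would mean: the commonly cited per-axis render scale factors for the DLSS 2 modes are sketched below (treat the Balanced figure as approximate), so "Performance mode at Quality-mode IQ" would mean getting 1440p-upscale image quality out of a 1080p internal render at 4K output.

```python
# Commonly cited per-axis render scale factors for the DLSS 2 modes.
DLSS_MODES = {
    "Quality": 1.5,            # 4K output renders internally at 2560x1440
    "Balanced": 1.724,         # approximate
    "Performance": 2.0,        # 4K output renders internally at 1920x1080
    "Ultra Performance": 3.0,  # 4K output renders internally at 1280x720
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_MODES[mode]
    return round(out_w / s), round(out_h / s)

for mode in DLSS_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: internal render {w}x{h}")
```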

The performance numbers on FS are awesome; can't wait to try it.

Now let's see what AMD brings to the table.

BTW nvidia is releasing the card just in time for my Bday ;);):p:p
 
This is typically what economists call "price inelasticity of demand": demand, unlike what we usually expect, doesn't drop off when prices go up.
A good portion of this was mining - where you could always count on generating a profit, and the more you had the more you earned.

Hopefully those days are done. I think we will return to seeing a ceiling on what gamers will accept - there will always be a few big spenders out there, but I'm hoping it revitalizes the <$200 market and we see the $200-350 market pick back up to being mid-range rather than low-range like it has been for the last couple of years.
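To illustrate why mining put a floor under basically any price, here's a toy payback model; every figure in it is an invented, boom-era ballpark, not real data.

```python
# Toy mining payback model -- all numbers below are made up for illustration.
def daily_profit(hashrate_mh, usd_per_mh_day, power_w, usd_per_kwh):
    revenue = hashrate_mh * usd_per_mh_day
    electricity = power_w / 1000 * 24 * usd_per_kwh
    return revenue - electricity

# Hypothetical xx80-class card during the boom:
p = daily_profit(hashrate_mh=100, usd_per_mh_day=0.06, power_w=230, usd_per_kwh=0.12)
print(f"${p:.2f}/day, ${p * 30:.0f}/month per card")  # scales linearly with card count
# At ~$160/month, even a $1,200 card pays for itself in well under a year.
```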
 

I have to concur with Zarathustra: Nvidia realized during the last couple of years just how much gamers are willing to spend (not me, but still). Why let scalpers have all the fun? Besides, since demand will be much lower now, Nvidia has to compensate with higher prices. I don't like it, of course, but that's just the way things are now.

BTW, the days of sub-$200 cards are long gone. If you really want a card capable of decent gaming, expect to pay at least $350. Maybe Intel will find a place there.
 
I'm somewhat expecting, if there is no other market disruption, that prices will subside once 3000-series stock reaches end-of-life levels.

Profit can go up through reduction of costs, higher volume, or higher pricing. Production costs aren't going to go down significantly, so if Nvidia wants to keep up profits, they'll likely need to lower prices as the demand that drove up prices fades - and the volume simply won't be there if they don't.

Or, they could just give that marketshare to AMD and Intel.
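To put toy numbers on that volume/price trade-off (all figures below are invented; the unit cost especially is a pure assumption):

```python
# Gross profit = (price - unit cost) * units sold. Invented figures only.
def gross_profit(price, unit_cost, units):
    return (price - unit_cost) * units

# Hypothetical: hold the high price at thin volume vs. cut price for volume.
hold = gross_profit(price=1199, unit_cost=600, units=50_000)   # $29.95M
cut  = gross_profit(price=899,  unit_cost=600, units=110_000)  # $32.89M
print(f"hold price: ${hold:,}; cut price: ${cut:,}")
```

If post-mining demand really is that price-sensitive, the lower price wins; if it isn't, Nvidia can leave the tag where it is.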
 
If this is what the 4090 FEs are going for I can only imagine what the AIBs will be. I'm so glad that I'm skipping this gen.

It looks impressive on paper, but I'm happy to wait a bit and focus on other projects, like a PCIe 5.0/DDR5 build with a CPU that's going to be a mainstay for a long time to come. Even if real-world testing shows slightly lower numbers, it is still impressive. I'm also not getting suckered into buying what appears to be the top-tier card at launch again, only for the one-better to come out later. I've made that mistake one time too many (back in the day I got my 970s by accident because I forgot I had planned to hold out for the 980 Ti, and then got the 3090, and had fun with it, because I thought it would be the Ti/Titan for that gen). Better to wait and see how things look a year or so after.

Wake me when the 4090 Ti/Titan comes out and maybe, just maybe, I'll be interested - if the prices have dropped. NV is obviously going to be promoting DLSS 3.0 and RT Overdrive to sell these, but I'm happy with what I've got for now. My goal was to get to 4K gaming at 60-120 FPS, and with DLSS 2.0 that has most certainly been achieved in all but one or two games; even turning down a setting here and there gets it done. So I'm not worried about needing to upgrade, and I'm really not looking to drop another $1,500-$2,000+ on another GPU.
 
I wonder if Best Buy is going to be the only retailer for the FE cards again.
 
I wonder if DLSS 3 works on Ampere/Turing, even if only with a reduced performance uplift.

No, the DLSS 3.0 frame generation tech is 40-series only. The rest of the DLSS tech is backwards compatible with older RTX cards.

 
The reason is that DLSS 3.0 uses new logic in the ASIC that previous generations don't have; it relies on hardware that simply isn't present on older cards. Sucks, yes, but that's the reason.
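As a conceptual illustration only (this is not NVIDIA's actual SDK; the helper below is hypothetical, though the compute-capability values are the public ones): frame generation depends on Ada's upgraded optical flow hardware, so an app would gate it on GPU generation roughly like this.

```python
# Hypothetical sketch -- not NVIDIA's real API. Frame generation depends on
# hardware introduced with Ada, so it gets gated by GPU generation, while the
# DLSS super-resolution path keeps working on older RTX cards.

# Public CUDA compute capabilities per architecture:
ADA, AMPERE, TURING = (8, 9), (8, 6), (7, 5)

def frame_generation_supported(cc):
    """Hypothetical check: DLSS 3 frame generation requires Ada (8.9+)."""
    return cc >= ADA

for name, cc in [("RTX 4090", ADA), ("RTX 3080", AMPERE), ("RTX 2080", TURING)]:
    fg = "frame generation" if frame_generation_supported(cc) else "super resolution only"
    print(f"{name} (CC {cc[0]}.{cc[1]}): {fg}")
```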
 
2-4x faster - in ray tracing with DLSS 3.
“Up to” 25% faster on anything else.

Can’t wait to see real benches

25% is nothing to sneeze at, but that's a heck of a price jump for what is otherwise a typical generational increase in performance. I railed against the 20 series for that price jump. I wasn't any happier with the announced 30-series prices, but the joke ended up on all of us, given how that played out. This, I guess, is just the continuing evolution of the milking of enthusiasts.
 
I *think* they meant that Shader Execution Reordering improves performance ~25% on everything else, i.e., on shader programs in contrast to ray tracing; it's how SER specifically improves performance, just worded weirdly. Better wording would be: SER improves ray tracing performance 2-4x, and shader programs by 25%. The point they were trying to get across is that SER improves performance on things besides ray tracing. In essence, it's similar to out-of-order execution on CPUs.
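A loose analogy for what SER does (my own toy illustration, not NVIDIA's implementation): rays return from traversal in an incoherent order, and reordering them so neighboring threads run the same hit shader keeps the hardware from grinding through divergent code paths.

```python
# Toy model of shader execution reordering: group ray hits by material so
# each "wave" runs one coherent shader instead of interleaving all of them.
from itertools import groupby

# Hypothetical ray hits, in the incoherent order traversal returns them.
hits = ["glass", "metal", "glass", "skin", "metal", "glass", "skin", "metal"]

def shade_batch(material, rays):
    # Stand-in for launching one coherent wave per hit shader.
    print(f"shading {len(rays)} '{material}' hits in one coherent batch")

# With reordering: sort hits by shader, then shade each coherent group.
order = sorted(range(len(hits)), key=lambda i: hits[i])
for material, group in groupby(order, key=lambda i: hits[i]):
    shade_batch(material, list(group))
```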
 

They weren't really clear on this stuff; typical vague corpo graphs, etc. The one-minute demo of DLSS 3 in Cyberpunk with the 21 fps vs. 92 fps comparison showed RT on but DLSS off, except the lower corner says RTX is off. The right side is DLSS 3 with RT on and RTX on in the lower-right corner. Which is it? Who runs with RT on and no DLSS? What are the test specs if that is the case? All just pretty light shows in the end.
 
Couldn't be happier, HAPPY-HAPPY, I TELL YA, with our Strix 3090 Tis, especially after today's sick news and EVGA's demise! 😂
 