NVIDIA Announces GeForce RTX 4070 Ti for $799, “Faster than 3090 Ti”

I believe the answer is... "It Depends."
Indeed, "it depends" is the main answer. The main thing I've been buying the past few years has been GPUs, as they simply were not getting sampled except to the really big outlets.

We have been sampled every AMD/Nvidia launch GPU except the 3090 and 3090 Ti, both of which had a much smaller press distribution.

We have some sample 40-series/7900-series GPUs in the review pipeline. I'm hoping not to buy any from this generation.
 
So comparing the 4070ti to the 3090ti:

On average, the 4070ti appears to use about 250w less than the 3090ti
On average, in ray tracing, the 4070ti appears to be about the same performance as the 3090ti with some variation based on game
On average, at 4k, the 4070ti appears to be about 5% slower than the 3090ti with some variation based on game
On average, at 1440p, the 4070ti appears to be about 5% faster than the 3090ti with some variation based on game

So, more or less, the 4070ti is roughly a 3090ti, for 60% less cost at MSRP and about half the power draw.

Sound about right?
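For a rough sanity check, the claims above can be turned into simple perf-per-dollar and perf-per-watt arithmetic. The MSRPs ($799 vs. $1,999) and board-power figures (roughly 285 W vs. 450 W) below are my assumptions based on launch specs, not measurements from this thread:

```python
# Rough value comparison between the 4070 Ti and 3090 Ti using the
# thread's ballpark numbers. MSRPs and board-power figures are
# assumptions for illustration, not measurements from this thread.
msrp = {"4070 Ti": 799, "3090 Ti": 1999}      # launch MSRP, USD (assumed)
power = {"4070 Ti": 285, "3090 Ti": 450}      # typical board power, W (assumed)
perf_4k = {"4070 Ti": 0.95, "3090 Ti": 1.00}  # ~5% slower at 4K, per the post

for card in msrp:
    perf_per_dollar = perf_4k[card] / msrp[card]
    perf_per_watt = perf_4k[card] / power[card]
    print(f"{card}: {perf_per_dollar:.5f} perf/$, {perf_per_watt:.5f} perf/W")

# Relative cost: the 4070 Ti lists for about 60% less than the 3090 Ti MSRP.
discount = 1 - msrp["4070 Ti"] / msrp["3090 Ti"]
print(f"MSRP discount: {discount:.0%}")
```

Even being ~5% slower at 4K, the 4070 Ti comes out well ahead on both metrics under these assumptions.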
 
I'd say spot on more or less but good luck on that MSRP. Already seeing prices upwards of $1500 but I did just see this:
[image attachment]
 
Got a bunch at $799 at microcenter fwiw
I think that'll be the big trick: get them off the shelves and avoid @$$hat scalpers online. It sure seemed, from the stories I read last week and before, that some retailers did have inventory on hand to put out, so there's hope for some folks.
 
I would caveat this heavily:

I agree.

But I don't go so far as to agree that it makes the 4070 Ti an overall good value. The 3090 Ti was also ridiculously priced compared to previous generations. When you are comparing bad to something worse, that doesn't really make the bad any better.
 
Exactly. I was thinking that DLSS 3 frame generation is just nVidia's version of Samsung AutoMotion/LG TruMotion/whatever-the-f*ck names other HDTV manufacturers have for motion interpolation.
I'm not aware of it being anything else, really. There's presumably some tuning for real-time rendering as opposed to prerecorded media, but the basis is the same, as is the general effect.
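As a toy illustration of the concept being compared here, this sketch synthesizes a middle frame by naively blending two frames. To be clear, real motion interpolators (TV "motion smoothing" and DLSS 3 frame generation alike) estimate per-pixel motion rather than blending, so this only shows the basic idea of inserting a generated frame between two real ones:

```python
# Naive frame "interpolation": blend two frames to synthesize a middle one.
# Real interpolators use motion vectors / optical flow instead of a plain
# blend; this is only a conceptual sketch, not how DLSS 3 actually works.

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Linearly blend two frames (lists of pixel rows) at time t in [0, 1]."""
    return [
        [(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

frame0 = [[0, 0], [0, 0]]          # dark frame
frame1 = [[100, 100], [100, 100]]  # bright frame
mid = interpolate_frame(frame0, frame1)
print(mid)  # → [[50.0, 50.0], [50.0, 50.0]]
```

The tradeoff is the same in both cases: the synthesized frame adds smoothness but not new input responsiveness, since it is derived from frames that were already produced.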
 
What are you looking for, value-wise? Surely it isn't 3090 Ti performance at $499?
 

Yeah, I would say I agree in a pragmatic sense, if you need to buy something now.

I do not agree that this means there shouldn't be some level of moral outrage over pricing and Nvidia's market manipulation.
 
 
Well, sure, I'd love to see them price the top-end performance of the previous generation at $199 for the new generation, but we have never seen something like a new-gen xx50 card match the performance of the last-gen Titan.

What I'm getting at is that this isn't the old Intel days, where Ivy Bridge was a minimal improvement over Sandy Bridge.
 
Why shouldn't it be? A 3090 Ti may be a stretch, but looking back historically, die size has been a decent rough indicator of what a card costs to make, and of what it will be named and sold for.

Before RTX, price correlated pretty well with die size (as well as naming convention). There were some outliers (Fermi, mainly), but all the way until RTX it was fairly consistent. I show Ti editions where available; the slope between price and die area looks a bit more consistent if you look only at base editions, but since they named this one Ti, I decided to stick with Ti nomenclature where it was available.

4070 Ti : 295mm², $899
3070 Ti : 392mm², $599
2070 : 445mm², $499
1070 Ti : 314mm², $399
970 : 398mm², $329
770 : 294mm², $399
670 : 294mm², $399
570 : 520mm², $349
470 : 529mm², $349
(data from TechPowerUp Database)
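One way to read that list is as price per square millimeter of die. A quick sketch, taking the numbers above at face value, makes the jump at the top obvious:

```python
# Price per mm^2 of die for each card listed above (die areas in mm^2,
# launch prices in USD, taken as given in the post).
cards = [
    ("4070 Ti", 295, 899),
    ("3070 Ti", 392, 599),
    ("2070",    445, 499),
    ("1070 Ti", 314, 399),
    ("970",     398, 329),
    ("770",     294, 399),
    ("670",     294, 399),
    ("570",     520, 349),
    ("470",     529, 349),
]

for name, area, price in cards:
    print(f"{name:8s} {price / area:5.2f} $/mm^2")
```

By this metric the 4070 Ti sits above $3/mm² while everything from the 470 through the 1070 Ti stayed well under $1.50/mm², which is the consistency (and then the break from it) the post is describing.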

Now, die size isn't the same thing as transistor count: for a given die size, transistor count goes up as the process node improves. And a wafer on an improved process node costs more than one on a lesser node. And inflation happens, sure. And there are other components and costs: coolers, VRAM, etc., and nVidia does need to make a return on R&D. But if you pick any other die name/size and chart it over the generations, you see it's about the same, fairly in line: die size has been a decent indicator. That holds for both AMD and nVidia, and it's similarly applicable in the CPU realm as well (although CPU dies don't vary nearly as much as GPU dies do).

The GTX 470 was released in March 2010. I don't think we've seen a 157% increase in costs, even if you aggregate all those factors together, over the past 13 years. The price held reasonably steady for a long time, even accounting for inflation. And that is even discounting that Fermi was a massive chip and they were still able to keep the price that low, which I see as an indicator of how far nVidia can afford to shave the margin when they are willing to.
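The inflation comparison is easy to check numerically. The ~38% cumulative US CPI figure for March 2010 through early 2023 is my assumption for illustration, not a number from the post; the card prices come from the list above:

```python
# Compare the x70-class price increase against cumulative inflation.
# The 38% CPI growth figure for Mar 2010 -> early 2023 is an assumed
# ballpark used for illustration; card prices come from the post above.
price_470 = 349      # GTX 470 launch price, March 2010
price_4070ti = 899   # 4070 Ti price as listed in the post
cumulative_inflation = 0.38  # assumed US CPI growth over the period

nominal_increase = price_4070ti / price_470 - 1
inflation_adjusted_470 = price_470 * (1 + cumulative_inflation)

print(f"Nominal increase: {nominal_increase:.0%}")                      # ~158%
print(f"GTX 470 price in 2023 dollars: ${inflation_adjusted_470:.0f}")  # ~$482
```

Under that assumption, inflation alone would put a 470-class card around $482, nowhere near the actual asking price, which is the gap the post is pointing at.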

So, to me: either nVidia is getting really, really greedy with their pricing, or their yields are so abysmal they have to mark this stuff up by a crazy amount, or the price of "everything else" apart from the die has absolutely skyrocketed. And I'm betting on the first.
 