ASUS TUF Gaming GeForce RTX 4070 Ti 12GB OC Edition Video Card Review

Brent_Justice

The GeForce RTX 4070 Ti is Here

In today’s video card review, we take a look at the brand-new ASUS TUF Gaming GeForce RTX 4070 Ti 12GB GDDR6X OC Edition video card, based on NVIDIA’s new GeForce RTX 4070 Ti. The GeForce RTX 40 Series from NVIDIA was announced on September 20th, 2022, but the […]

See full article...
 
Sort of want, but will wait until it's under $500. The 1080 Ti still gets the job done.
 
Ok, so this is only tangential to the topic... but at what point do you get to call it either a monopoly or price fixing?

Now, I do not think that AMD and Nvidia are colluding to raise consumer prices. I think Nvidia is doing it because they can get away with it, and AMD is just playing along with "just slightly less than" pricing because their shareholders would oust them outright if they didn't.

There is no government limit on the margin a company can ask, or the profit they can pull in. That's kinda how capitalism works. But I sure hate this feeling that we are just getting utterly taken advantage of. I guess it's just a hobby, I don't have to play along - I just hate seeing it happen before my eyes.
 
Nvidia has enormous pricing power because of RTX & DLSS3

At premium price points ($500-plus), people most likely go by added features rather than outright raster performance.

What is surprising is that even below $500, among previous-gen GPUs, higher-priced Ampere cards outsell equivalent but much cheaper RDNA 2 cards.

If more people purchased Navi 22 cards (RX 6750 XT, 6700 XT, RX 6700) instead of Ampere, then Nvidia might get the message!
 
Nvidia has enormous pricing power because of RTX & DLSS3 and NVENC and VR (apparently)
DLSS3 itself isn't a real selling point. It exists, and it has a purpose - but not for most gaming scenarios. DLSS2 is still the main DLSS technology and remains more competitive overall, especially in situations with heavy ray tracing loads.
 
DLSS3 itself isn't a real selling point. It exists, and it has a purpose - but not for most gaming scenarios. DLSS2 is still the main DLSS technology and remains more competitive overall, especially in situations with heavy ray tracing loads.
If Nvidia pushes DLSS3 as hard as they pushed DLSS, we'll see support in most upcoming games.

Besides, a 100% performance increase does sound like a big selling point.
 
Well, it seems the card lands between the RTX 3090 and 3090 Ti, but man, the thing gets bandwidth starved at 4K.

I'm really impressed, because looking at the specs I really didn't think it would do that well.
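For anyone wondering where that 4K wall comes from, the back-of-the-envelope bandwidth math is simple. A quick sketch (the bus widths and memory speeds are the publicly listed specs):

[CODE=python]
# Peak memory bandwidth = (bus width in bytes) x (effective data rate)
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gbs(192, 21.0))   # RTX 4070 Ti: 192-bit @ 21 Gbps   ->  504.0 GB/s
print(peak_bandwidth_gbs(384, 19.5))   # RTX 3090:    384-bit @ 19.5 Gbps ->  936.0 GB/s
print(peak_bandwidth_gbs(384, 21.0))   # RTX 3090 Ti: 384-bit @ 21 Gbps   -> 1008.0 GB/s
[/CODE]

Half the bus of the 3090/3090 Ti. Ada's much bigger L2 cache hides a lot of that at lower resolutions, but at 4K the cache hit rate drops and the raw bandwidth gap starts to show.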
 
Great review!

I figured Nvidia's "faster than a 3090 Ti" claims were going to be all fluff, but it does at least get close most of the time, and actually succeeds in others.

Still, not quite sure I'd be happy with the value proposition; $799 is a lot for a mid-range card.

If I could make a request, if it isn't too much of a pain in the ***: minimum/1%/0.1% framerates are very relevant and would add a lot of information to your reviews. I'd even argue that minimum frames are MORE relevant than average frames.
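For what it's worth, those numbers fall straight out of a frame-time log. A minimal sketch of how they're typically computed (definitions vary between reviewers; this one averages the slowest 1% / 0.1% of frames, and the names are my own):

[CODE=python]
def low_metrics(frame_times_ms: list[float]) -> dict[str, float]:
    """Average FPS plus 1% and 0.1% lows from a list of frame times (ms)."""
    worst_first = sorted(frame_times_ms, reverse=True)  # slowest frames first

    def fps_of_slowest(fraction: float) -> float:
        # Mean frame rate over the slowest `fraction` of all frames.
        n = max(1, int(len(worst_first) * fraction))
        return 1000.0 * n / sum(worst_first[:n])

    return {
        "avg_fps": 1000.0 * len(frame_times_ms) / sum(frame_times_ms),
        "1%_low_fps": fps_of_slowest(0.01),
        "0.1%_low_fps": fps_of_slowest(0.001),
    }

# A run that averages 16.7 ms but hitches to 50 ms now and then will show
# a healthy average FPS while the 1%/0.1% lows expose the stutter.
[/CODE]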
 
If Nvidia pushes DLSS3 as hard as they pushed DLSS, we'll see support in most upcoming games.

Besides, a 100% performance increase does sound like a big selling point.
Sure?

DLSS 1.0 was pretty bad - but showed the promise of intelligent upscaling.

DLSS 2.0 has more or less delivered on that promise. My opinion, of course; I find DLSS 2.0 to be more than close enough to justify any perceived degradation in visuals. Granted, I have no problem using FSR / FSR 2.0 where exclusively available, whether I'm gaming on an AMD GPU or that's just what the game in question supports.

DLSS 3.0 is 'frame doubling'. This works great for pre-recorded media (if you like the effect itself), but for real-time applications it inserts additional input latency. So, for DLSS 3.0 to even be useful in a particular situation, performance already has to be high enough for real-time interaction. Whereas DLSS 2.0 taking you from an arbitrary 20FPS to 35FPS, or 60FPS to 90FPS and so on can significantly improve the quality of the experience, DLSS 3.0 (that is, the 'frame generation' part of it) needs you to be running at 60FPS+ in the first place.

And there's a place for that - many games that aren't as user-response dependent can benefit: think anything turn-based, or perhaps RPG and adventure-style games. Some simulators as well.

But for anything where optimizing for input latency is already a concern, DLSS 3.0 is a non-starter.
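To put rough numbers on that, here's a toy latency model. Entirely my own simplification (the 1.6x upscaling speedup is a made-up figure, and real pipelines are deeper than one frame), but it shows the asymmetry: upscaling shortens every real frame, so responsiveness improves along with the frame rate, while frame generation has to hold a rendered frame back so it can interpolate, so displayed FPS doubles but responsiveness gets worse.

[CODE=python]
def toy_latency_ms(base_fps: float, upscaled: bool = False,
                   framegen: bool = False) -> float:
    """Very rough render-to-display latency for a one-frame pipeline."""
    frame_ms = 1000.0 / base_fps
    if upscaled:
        frame_ms /= 1.6          # assumed (illustrative) upscaling speedup
    if framegen:
        frame_ms *= 2            # ~one extra real frame held for interpolation
    return frame_ms

print(toy_latency_ms(60))                   # 16.7 ms,  60 FPS displayed
print(toy_latency_ms(60, upscaled=True))    # 10.4 ms, ~96 FPS displayed
print(toy_latency_ms(60, framegen=True))    # 33.3 ms, ~120 FPS displayed
[/CODE]

Reflex (which ships alongside DLSS 3) claws some of that back, but the asymmetry stands: upscaling buys responsiveness, interpolation only buys smoothness.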

Still, not quite sure I'd be happy with the value proposition; $799 is a lot for a mid-range card.
No one should be! But it's still the cheapest entry point for the level of technology Nvidia is bringing to the table.
 
BTW, it seems to me the cards are unnecessarily big. The PCB looks to be about 2/3 the length of the card.
 

I wonder if they reused parts from its bigger brothers to save R&D and inventory costs.

It's a tradeoff. It costs more per unit, but it saves having to develop new parts.
 
Sure?

DLSS 1.0 was pretty bad - but showed the promise of intelligent upscaling.

DLSS 2.0 has more or less delivered on that promise. My opinion, of course; I find DLSS 2.0 to be more than close enough to justify any perceived degradation in visuals. Granted, I have no problem using FSR / FSR 2.0 where exclusively available, whether I'm gaming on an AMD GPU or that's just what the game in question supports.

DLSS 3.0 is 'frame doubling'. This works great for pre-recorded media (if you like the effect itself), but for real-time applications it inserts additional input latency. So, for DLSS 3.0 to even be useful in a particular situation, performance already has to be high enough for real-time interaction. Whereas DLSS 2.0 taking you from an arbitrary 20FPS to 35FPS, or 60FPS to 90FPS and so on can significantly improve the quality of the experience, DLSS 3.0 (that is, the 'frame generation' part of it) needs you to be running at 60FPS+ in the first place.

And there's a place for that - many games that aren't as user-response dependent can benefit: think anything turn-based, or perhaps RPG and adventure-style games. Some simulators as well.

But for anything where optimizing for input latency is already a concern, DLSS 3.0 is a non-starter.


No one should be! But it's still the cheapest entry point for the level of technology Nvidia is bringing to the table.
Yeah, I'm sure. There are already 50+ games announced that will support DLSS3. If anyone knows how to shove new technology down your throat, it's Nvidia. BTW, thanks to Nvidia Reflex, latency is actually lower than without DLSS3. Not saying it's going to be great, but by now DLSS has already gained high praise and is considered a must-have by many gamers; DLSS3 at least deserves a chance.

Anyway, AMD also has its own frame interpolation/frame generation technology coming soon. We'll see.
 
DLSS 2.0 has more or less delivered on that promise. My opinion, of course; I find DLSS 2.0 to be more than close enough to justify any perceived degradation in visuals. Granted, I have no problem using FSR / FSR 2.0 where exclusively available, whether I'm gaming on an AMD GPU or that's just what the game in question supports.

I haven't owned an Nvidia GPU newer than my 2016 Pascal Titan X, so I have not personally played with DLSS yet.

Can all titles that support DLSS3 also run DLSS2? In other words, is it backwards compatible, or an option?

Because I too tend to think (from what I have read and seen, though not through personal experience) that DLSS2 was the pinnacle of the tech.
 
DLSS 3 is Frame Generation. I wish they hadn't called it DLSS 3, but given it some other new name. Calling it Frame Generation is fine, but attaching the name DLSS 3 to it is confusing.

If you are using Frame Generation, you are using Frame Generation. Frame Generation is independent of the DLSS option.

However, with the way it works, if the game supports Frame Generation, you'll also be able to turn on DLSS if you choose not to use Frame Generation.

So if you see Frame Generation in the game, you'll also see the DLSS slider or drop-down box and can just use DLSS, without Frame Generation turned on.
 
I tested it in Witcher 3. Sort of cool, but major latency when the game would switch to any cutscene and then back, and certain animations could trigger delays as well. I ended up turning it off but leaving DLSS on. It was cool when the delays weren't happening, though. Saw it bump FPS upwards of 118+ in 4K with the RT Ultra settings. As is, my rig was holding 55~80 FPS, so it's not a big loss for a 4090/5800X3D combo.
 
I haven't owned an Nvidia GPU newer than my 2016 Pascal Titan X, so I have not personally played with DLSS yet.

Can all titles that support DLSS3 also run DLSS2? In other words, is it backwards compatible, or an option?

Because I too tend to think (from what I have read and seen, though not through personal experience) that DLSS2 was the pinnacle of the tech.
Actually, it's a little bit complicated.

You see, DLSS3 is composed of the upscaling part (DLSS2) and the AI frame generation part. These are independent of each other. In other words, you can enable DLSS2 as in any other DLSS2 title, OR you can enable just Frame Generation without DLSS. Enabling both effectively gives you DLSS3.0.

In fact, it's possible to use Nvidia's frame generation with either FSR or XeSS if the game supports them.
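As a mental model, the two toggles are orthogonal. Something like this hypothetical settings structure (not any real game's API):

[CODE=python]
from dataclasses import dataclass
from enum import Enum

class Upscaler(Enum):
    OFF = "off"
    DLSS2 = "dlss2"
    FSR2 = "fsr2"
    XESS = "xess"

@dataclass
class GraphicsSettings:
    upscaler: Upscaler = Upscaler.OFF
    frame_generation: bool = False  # needs an RTX 40 series card

# "DLSS 3" as marketed is simply both switched on together:
dlss3 = GraphicsSettings(upscaler=Upscaler.DLSS2, frame_generation=True)

# ...but nothing ties them: e.g. FSR 2 upscaling plus frame generation.
mixed = GraphicsSettings(upscaler=Upscaler.FSR2, frame_generation=True)
[/CODE]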

BTW, AFAIK we are at DLSS version 2.3.x. What happens when DLSS2 reaches version 2.9? :rolleyes: :rolleyes:
 
You see, DLSS3 is composed of the upscaling part (DLSS2) and the AI frame generation part. These are independent of each other. In other words, you can enable DLSS2 as in any other DLSS2 title, OR you can enable just Frame Generation without DLSS. Enabling both effectively gives you DLSS3.0.

Ah, so that's good. Should you need it, you can still enable DLSS upscaling (if supported) without the frame generation.

BTW, AFAIK we are at DLSS version 2.3.x. What happens when DLSS2 reaches version 2.9? :rolleyes: :rolleyes:

Look no further than the Linux kernel for inspiration.

It could be DLSS 2.10, 2.11, 2.12, etc. :p
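Which is also a reminder to compare version numbers numerically, not as strings. A tiny illustration:

[CODE=python]
def parse_version(v: str) -> tuple[int, ...]:
    """Split a dotted version string into integer parts for comparison."""
    return tuple(int(part) for part in v.split("."))

print("2.10" > "2.9")                                 # False (lexicographic)
print(parse_version("2.10") > parse_version("2.9"))   # True  (numeric)
[/CODE]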
 