NVIDIA DLSS 3 Delivers Huge Gains to Marvel’s Midnight Suns, HITMAN World of Assassination, and Other New DLSS 3 Titles on GeForce RTX 40 Series

Tsing

The FPS Review
Staff member
NVIDIA has shared an article to remind gamers that its collection of DLSS 3 titles is steadily growing, with some of the latest additions being Marvel's Midnight Suns and HITMAN World of Assassination, IO Interactive's newly launched re-branding of HITMAN 3, both of which received support for the newest iteration of green team's upscaling technology this week. Benchmarks shared by NVIDIA suggest that GeForce RTX 40 Series owners can expect significant performance gains with DLSS 3 enabled, particularly with the flagship GeForce RTX 4090, which can seemingly hit frame rates of over 150 FPS under max settings in 4K with the feature enabled.

See full article...
 
Except you're NOT playing with 4K content with DLSS on. Sigh....
 
DLSS2 is great, DLSS3 is BS snake-oil nonsense. It makes the experience worse: it increases lag rather than reducing it, and depending on the game it produces weird glitches and artifacts.

The whole point of increasing FPS is to get less lag, not more. I'd never use even DLSS2 outside of making an unplayable game playable; if it hits 50 FPS without DLSS, then it's time to turn it off and go native. And if the game is unplayable without DLSS, DLSS3 is only going to make it worse.
 
I like DLSS 2 and will use it for games that can't consistently hit above 50 FPS in 4K. I find it an acceptable compromise when using the Quality mode, but I won't use anything lower.

I've done some testing with DLSS 3 in The Witcher 3 (I still need to get back to Portal) and, for the most part, don't like it. I think NVIDIA should've spent more time developing it before releasing it. I also think they screwed up the name since, strictly speaking, it's not super sampling but frame generation. From my understanding, and I could be totally wrong, DLSS 2.x does its thing and then DLSS 3 picks that up and generates frames. What I'm also noticing is that it's letting some developers go off-the-wall bonkers with ray counts for RT, forcing a need that shouldn't exist.

It has potential, but it really needs AI to enable itself automatically when needed and turn off when not, and the lag can be pretty bad when it cuts in during animations or cutscenes. If they could resolve these issues it could be a game changer, but, for now, it's not something I'll regularly use outside of testing.
 
It has potential, but it really needs AI to enable itself automatically when needed and turn off when not, and the lag can be pretty bad when it cuts in during animations or cutscenes. If they could resolve these issues it could be a game changer, but, for now, it's not something I'll regularly use outside of testing.

This is an interesting idea: set a target FPS and have DLSS only upscale/duplicate frames when it's below target.
But that defeats the purpose of the frame generation feature, as it's mostly for marketing and not to improve fidelity or performance in a meaningful way. (IMHO)
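That target-FPS idea could be sketched like this. This is purely illustrative: the function name and structure are made up, and no shipping DLSS API exposes a per-frame toggle like this.

```python
# Illustrative sketch of the target-FPS idea above: only enable frame
# generation when the rendered frame rate falls below a target.
# All names here are hypothetical; no real DLSS API works this way.

def frame_gen_enabled(recent_frame_times_ms, target_fps=60, hysteresis=5):
    """Decide whether frame generation should run for the next frames.

    recent_frame_times_ms: recent *rendered* frame times in milliseconds.
    The hysteresis margin keeps the toggle from flapping right at target.
    """
    avg_ms = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    fps = 1000.0 / avg_ms
    # Enable only when clearly below target.
    return fps < (target_fps - hysteresis)

# A game averaging 25 ms/frame (~40 FPS) would turn frame generation on:
print(frame_gen_enabled([25.0, 26.0, 24.0]))   # True: ~40 FPS < 55
# At 12 ms/frame (~83 FPS) it stays off:
print(frame_gen_enabled([12.0, 12.0, 12.0]))   # False
```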
A friend of mine was all excited that his new 4070TI was 3x faster than the 3090.... except it wasn't.

The fact that this was fed to the site by NVIDIA and posted as news feels dirty, even if it was clearly labeled.
 
It has potential, but it really needs AI to enable itself automatically when needed and turn off when not, and the lag can be pretty bad when it cuts in during animations or cutscenes. If they could resolve these issues it could be a game changer, but, for now, it's not something I'll regularly use outside of testing.
And when is it needed? If the game is fast enough without it, it's pointless. If the game isn't really playable without it, it doesn't help. So what's the point?
 
DLSS2 is great, DLSS3 is BS snake-oil nonsense. It makes the experience worse: it increases lag rather than reducing it, and depending on the game it produces weird glitches and artifacts.

The whole point of increasing FPS is to get less lag, not more. I'd never use even DLSS2 outside of making an unplayable game playable; if it hits 50 FPS without DLSS, then it's time to turn it off and go native. And if the game is unplayable without DLSS, DLSS3 is only going to make it worse.
Actually, it reduces lag in most cases vs actual 4K.
 
Gotta say, marketing is a bitch. Getting about twice the framerate at the flip of a switch can certainly make many people fall for it.

A friend of mine skipped the 3080 Ti to go for an RTX 4080, but ended up getting an RTX 4070 Ti as it was "more than twice as fast". He's a big FS junkie, so getting 100+ FPS vs sub-60 was a major plus.
 
Actually, it reduces lag in most cases vs actual 4K.
No, DLSS2 reduces lag. I think you're being fooled by the fact that DLSS3 also enables DLSS2, and that reduces lag since it renders at a lower resolution, increasing FPS legitimately. But DLSS3 only injects made-up interpolated frames in between rendered frames.

They must have bought the negative-lag feature from Stadia to be able to reduce lag with fake frames.
 
No, DLSS2 reduces lag. I think you're being fooled by the fact that DLSS3 also enables DLSS2, and that reduces lag since it renders at a lower resolution, increasing FPS legitimately. But DLSS3 only injects made-up interpolated frames in between rendered frames.

They must have bought the negative-lag feature from Stadia to be able to reduce lag with fake frames.
DLSS3 enables NVIDIA Reflex, which reduces latency; in many cases the result is lower than native 4K without Reflex. It's still higher than just DLSS2/Reflex, though.
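The latency argument in this back-and-forth can be illustrated with a back-of-the-envelope model. This is a toy sketch, not a measurement: it ignores Reflex, driver queueing, and everything else a real pipeline does, and only captures the idea that an interpolated frame can't be shown until the following real frame has been rendered.

```python
# Toy latency model: why generated frames boost displayed FPS but not
# input response. All numbers are illustrative, not measurements.

def pipeline(render_ms, frame_gen=False):
    """Return (displayed_fps, input_latency_ms) for a simplified pipeline.

    Interpolation needs the *next* rendered frame before it can show the
    in-between frame, so frame generation holds frames back by roughly
    one render interval while doubling the displayed rate.
    """
    displayed_fps = 1000.0 / render_ms
    latency = render_ms  # time from input to the frame that reflects it
    if frame_gen:
        displayed_fps *= 2      # one interpolated frame per real frame
        latency += render_ms    # wait for the following real frame
    return displayed_fps, latency

native = pipeline(25.0)                  # native 4K: 40 FPS, 25 ms
dlss2 = pipeline(14.0)                   # upscaled render is cheaper
dlss3 = pipeline(14.0, frame_gen=True)   # same render cost + interpolation
print(native, dlss2, dlss3)
# In this toy model DLSS3 shows the most frames, but DLSS2 alone has the
# lowest latency, which matches the point being argued in the thread.
```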
 
Well, the last I read, Reflex only "requires" a 3000 series GPU, so it doesn't rely on DLSS 3 or a 4000 series chip.

So I think it's fair to say DLSS 3 increases latency: since Reflex isn't an exclusive feature of DLSS 3 and can be used in other titles on other cards, the fact that Reflex is available as a tool doesn't mean DLSS 3 itself isn't still raising latency.

I mean, RT definitely reduces frame rate. DLSS is one tool that helps bring that frame rate back up, but that doesn't negate the original statement, and most people don't say "RT is fine for everyone because DLSS can fix it on a very few select cards".
 
DLSS3 enables NVIDIA Reflex, which reduces latency; in many cases the result is lower than native 4K without Reflex. It's still higher than just DLSS2/Reflex, though.
That's exactly my point: it's not DLSS3 that reduces latency. In every single situation you're better off with DLSS2 if native is too slow. The only reason DLSS3 exists is so NVIDIA can make fake marketing claims like the 4070 Ti being 3x faster than a 3090.
 
And when is it needed? If the game is fast enough without it, it's pointless. If the game isn't really playable without it, it doesn't help. So what's the point?
Kind of answered your own question there. "If the game is fast enough": of course there's no reason to use either if you're hitting your target rates, but in situations at, say, 30-40 FPS, it could be beneficial.
 
I'm just putting this out there.

If I'm dropping 800 on a CPU, 3-500 on a motherboard, 300 more on RAM, another 350 on high-speed storage at 2 TB, a case for 250, a power supply for 350, and an AIO cooler for another 175. Then add in 1700 for a video card and 1200 for a 4K monitor....

Oh, and let's not forget 50 bucks for a Windows license and another 10 bucks for a decent USB key.

KB, 100 bucks; mouse, 50. Headset and/or speakers, another 150-ish.

And the only way I can get 100fps+ in a game is to use upscaling to my native resolution?

WTF? Why would I not just buy a console to do it there? (I mean, I know I'm not... but you see what I'm getting at.)

Total PC spend... 5985.00

Console + TV + soundbar: let's see, 1400 bucks plus or minus 200 depending on the TV. Let's go high end and spend 2k on the TV and 1k on the soundbar setup. 3600 bucks? STILL over 2k cheaper than the PC setup.

Sure, you'll need a new console in 5 years, whatever that costs. And you can carry a lot of parts from the PC build over to a new one, or skip out with a cheaper display, meaning less video card needed... but you get the idea. The fact that even a top-tier gaming PC can't drive the most demanding games at acceptable framerates without upscaling just shows that video cards need advancement, not more upscaling.
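For what it's worth, the itemized spend in this post does add up. A couple of assumptions in the sketch below: the motherboard is taken at the top of the quoted "3-500" range, and the console itself is read as roughly 600 so the high-end total matches the quoted 3600.

```python
# Sanity check on the build math above. The motherboard is assumed at the
# high end of the quoted "3-500" range; other figures are as listed.
pc_parts = {
    "CPU": 800, "motherboard": 500, "RAM": 300, "2TB storage": 350,
    "case": 250, "PSU": 350, "AIO cooler": 175, "video card": 1700,
    "4K monitor": 1200, "Windows license": 50, "USB key": 10,
    "keyboard": 100, "mouse": 50, "headset/speakers": 150,
}
pc_total = sum(pc_parts.values())
print(pc_total)  # 5985, matching the "PC spend" figure in the post

# High-end console setup: ~600 console (assumed) + 2k TV + 1k soundbar.
console_total = 600 + 2000 + 1000
print(pc_total - console_total)  # 2385: "over 2k cheaper" checks out
```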
 
Kind of answered your own question there. "If the game is fast enough": of course there's no reason to use either if you're hitting your target rates, but in situations at, say, 30-40 FPS, it could be beneficial.
If the game is not fast enough you use DLSS2; if it's still not fast enough, DLSS3 is only going to make it worse by increasing lag and adding artifacts to the mix.
 
I'm just putting this out there.
You forget that a console is also using upscaling and all kinds of tricks for 4K, and you still only get 30-60fps, with low-to-medium graphics settings compared to PC.

What's killing performance is RT, which seems to be the fad now; many games have it just to get on the bandwagon but show negligible visual improvements when enabled. Without RT effects, even older cards can handle native resolution.

Either way, I don't see running AI upscaling as a problem if it arguably gets you better visuals than native 4K. To make a stupid car analogy: a 15L engine is not automatically better than a 5L one.
 
If the game is not fast enough you use DLSS2; if it's still not fast enough, DLSS3 is only going to make it worse by increasing lag and adding artifacts to the mix.
My point is that if DLSS3 evolves it could be another viable tool. I jumped on the DLSS/tensor-core train early on and remember how 1.0 barely got the job done; now, even with a 4090 that can do 60+ FPS in a number of games w/o it, I've tested those games with it on/off in 4K and can only just barely notice the difference in quality when it's off. Ironically, when it's off I've still ended up cranking up AA settings to remove jagged lines and artifacts that became more prominent w/o DLSS. If frame generation improves on its current issues in the same way, it could be good. I'm not saying it is now, just that it has potential.
 
You forget that a console is also using upscaling and all kinds of tricks for 4K, and you still only get 30-60fps, with low-to-medium graphics settings compared to PC.

What's killing performance is RT, which seems to be the fad now; many games have it just to get on the bandwagon but show negligible visual improvements when enabled. Without RT effects, even older cards can handle native resolution.

Either way, I don't see running AI upscaling as a problem if it arguably gets you better visuals than native 4K. To make a stupid car analogy: a 15L engine is not automatically better than a 5L one.
Right, but even with all the high-end gear, the console is upscaling and so is the PC. ;)
 