NVIDIA DLSS 4.5 Upscaling Performance Review & Analysis

magoo (Quasi-regular; joined Jul 9, 2019; 451 messages, 336 reaction score)
Introduction

This year at CES 2026, NVIDIA announced an upgrade to its Super Resolution software, dubbed DLSS 4.5. The new version promises improved image quality via a 2nd-gen Transformer model, new dynamic multi-frame generation, and a new 6X Frame Generation mode. All of these improvements are aimed at improving image quality […]

See full article...
 
You know what, that is impressive. Maintaining image quality at that level of upscaling is surprising, to say the least.

I'm looking forward to reading about on demand frame gen as well and how that works out. THAT will be a BEAST to test properly...

I could imagine it in MMOs, with large gatherings of players over large spaces that would normally hinder a system. Perhaps that is the key? Regardless, that's the next thing I want to see... maintain high-performance upscaling with minimal frame gen to smooth out any FPS dips at your target resolution... and that could be amazing.
 
AMD still has a long way to go with FSR regarding IQ. When FSR came out, it went up against DLSS 2, and arguably DLSS 2 Performance looked better than FSR 1 Quality mode.

Fast forward to Redstone versus DLSS 4.1: the consensus is again that DLSS 4.1 Performance mode looks as good as or better than FSR 4 Quality.

I have been using DLSS 2 everywhere I can since it was released. Unfortunately, I have an RTX 3070 Ti, so no DLSS 4.1 for me because of the performance degradation.
 
Honestly, it feels like we're at the point where, unless you're specifically pixel peeping, most of this stuff is going to look roughly equivalent in actual gameplay (at least as far as upscaling is concerned; the jury is still out on FG).

I think a far more entertaining test would be a "Pepsi Challenge" type situation: a double blind where gamers are shown gameplay that is FSR upscaled, DLSS upscaled, and some TAA as a control, and we see who can actually tell the difference in regular moment-to-moment gameplay.

Maybe if AMD's marketing team wasn't a bunch of lobotomized fruit flies they might have pulled something like that off already but alas.

First reviewer to pull that off gets a tip of my Cheeto stained fedora.
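
The "Pepsi Challenge" idea above is easy to score rigorously, too. As a minimal sketch (hypothetical, not tied to any existing test): with three conditions (FSR, DLSS, TAA), a pure guesser is right about 1/3 of the time, so an exact binomial tail probability tells you whether a participant can genuinely tell them apart:

```python
import math

def chance_of_at_least(correct, trials, p=1/3):
    """Probability of getting >= `correct` answers right by pure guessing,
    with three options (FSR, DLSS, TAA) -> chance rate p = 1/3."""
    return sum(
        math.comb(trials, k) * p**k * (1 - p)**(trials - k)
        for k in range(correct, trials + 1)
    )

# Example: a participant labels 30 clips and gets 16 right.
p_value = chance_of_at_least(16, 30)
print(f"P(>=16/30 correct by guessing) = {p_value:.4f}")
# If this is small (say < 0.05), they can likely tell the upscalers
# apart; if not, the differences may not matter in real gameplay.
```

The trial and clip counts are made up for illustration; the point is that a blind test needs enough trials per person for the tail probability to mean anything.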
 
Nice review.

Thank you for including STALKER 2.

Three questions:

1.) Any word on what is required for a game to be compatible with the driver override? Is it any title that is already compatible with the first gen transformer model?

2.) Did you compare the image quality to previous versions of DLSS upscaling? I guess I am trying to understand if there is any reason to stick with the older model, or if the newer model is always better. I mean, in some cases there is a performance hit, but is that hit worth it in terms of image quality? And conversely, in many cases there is a performance increase. Does that increase sacrifice image quality at all?

3.) The sample images are all 1920x1080. It isn't stated explicitly, but I presume these screenshots are all posted unscaled?


In comparing the two screenshots, to my eyes Quality is ever so slightly sharper, but it is really difficult to tell the difference. Meanwhile, I think Performance actually has slightly less aliasing, particularly on the trees in Stalker.

Very interesting stuff. Thanks for the review. This wasn't even on my radar. I have had my head buried in my big build for so long that I haven't been keeping up with the news on stuff like this.
 
Honestly, it feels like we're at the point where, unless you're specifically pixel peeping, most of this stuff is going to look roughly equivalent in actual gameplay (at least as far as upscaling is concerned; the jury is still out on FG).

I think a far more entertaining test would be a "Pepsi Challenge" type situation: a double blind where gamers are shown gameplay that is FSR upscaled, DLSS upscaled, and some TAA as a control, and we see who can actually tell the difference in regular moment-to-moment gameplay.

Maybe if AMD's marketing team wasn't a bunch of lobotomized fruit flies they might have pulled something like that off already but alas.

First reviewer to pull that off gets a tip of my Cheeto stained fedora.

I recall watching a YouTube video of that a while ago comparing FSR vs DLSS 2. The thing is, streaming compression kills subtle differences and sometimes exacerbates the more obvious ones.
 
I recall watching a YouTube video of that a while ago comparing FSR vs DLSS 2. The thing is, streaming compression kills subtle differences and sometimes exacerbates the more obvious ones.

- I mean a live, in-person test. It could be filmed, of course, but the people being subjected to the test would be seeing gameplay rendered in real time, au naturel.
 
Maybe if AMD's marketing team wasn't a bunch of lobotomized fruit flies they might have pulled something like that off already but alas.
Even more fun would be lying that the FSR screen is using DLSS and vice versa to a large number of unsuspecting gamers to reveal the percentage of brand monkeys in the population :P
 
Even more fun would be lying that the FSR screen is using DLSS and vice versa to a large number of unsuspecting gamers to reveal the percentage of brand monkeys in the population :P
LTT did this exact test a while back: a blind gaming test between native and DLSS. I don't think FSR was in the mix that time. Could be an opportunity for another run at it with the next-gen stuff.

Personally, I want to see that challenge with DLSS with On Demand Frame Gen, compared to AMD's solution and, if Intel is still playing ball, Intel's solution.

Because honestly, between them all... I think on-demand frame gen is going to be the big one for gameplay.

If I can play a game right near 100 FPS at 4K, but have some rough dips in specific parts due to... whatever... if frame gen can step in and fill those gaps to keep frames right near 100 FPS... that could be a very good experience.
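
Nobody outside NVIDIA knows how on-demand frame gen will actually be implemented, but purely as a sketch of the behavior described above, the control logic might look like this: render normally, and only inject generated frames when the real framerate dips below target. Everything here (the function, the 4x cap) is a hypothetical illustration, not the real algorithm:

```python
import math

def frames_to_generate(real_fps, target_fps, max_ratio=4):
    """Hypothetical 'on demand' policy: generate only as many extra
    frames per rendered frame as needed to reach the target FPS,
    capped at an assumed maximum generation ratio (4x total here)."""
    if real_fps <= 0:
        return 0
    needed_ratio = math.ceil(target_fps / real_fps)
    return min(max(needed_ratio, 1), max_ratio) - 1

print(frames_to_generate(100, 100))  # steady at target -> 0 generated
print(frames_to_generate(55, 100))   # dip to 55 -> 1 generated per real frame
print(frames_to_generate(20, 100))   # deep dip -> capped at 3 generated (4x total)
```

The appeal of this shape is exactly what the post describes: most of the time you get pure native frames, and generation only kicks in to paper over the dips.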
 
Frame Gen tech has me concerned on two fronts. Maybe my concerns aren't valid, but it's enough to leave me unimpressed and not excited.

First - Fidelity. Is what you are seeing what the developer intended? It may look beautiful, but if it's not what the developer intended, then you may as well be feeding prompts to a GenAI LLM - which can generate some beautiful images, but I wouldn't go so far as to call AI an artist.

Second - Consistency. Does the scaler generate the same image each play through, on all platforms? Does it scale consistently and rationally between various resolutions and scaler multipliers? This touches on the Fidelity point, but takes it a step further and applies it to various conditions.

If the algorithm employed can't accomplish these two things, does it matter what FPS it's getting?
 
Frame Gen tech has me concerned on two fronts. Maybe my concerns aren't valid, but it's enough to leave me unimpressed and not excited.

First - Fidelity. Is what you are seeing what the developer intended? It may look beautiful, but if it's not what the developer intended, then you may as well be feeding prompts to a GenAI LLM - which can generate some beautiful images, but I wouldn't go so far as to call AI an artist.

Second - Consistency. Does the scaler generate the same image each play through, on all platforms? Does it scale consistently and rationally between various resolutions and scaler multipliers? This touches on the Fidelity point, but takes it a step further and applies it to various conditions.

If the algorithm employed can't accomplish these two things, does it matter what FPS it's getting?
I'll agree to this. I've not used FrameGen at all. I'm against adding a multiplier of created frames to a base FPS experience. But just enough to keep the game smooth? That I'm less offended by. ;)
 
I'll agree to this. I've not used FrameGen at all. I'm against adding a multiplier of created frames to a base FPS experience. But just enough to keep the game smooth? That I'm less offended by. ;)

- The problem with all frame gen solutions is that they perform worst when you need them the most. A low base FPS means input lag and way more artifacting than if you're starting from a high base FPS.

I'm going to give a couple of games a shot (Cyberpunk, for example) with AMD's FG turned on, so I can see for myself how it works out boosting from 40 FPS with all the RT bells and whistles turned on.
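
The "worst when you need it most" point has a simple mechanical basis, at least for interpolation-style frame gen: the newest rendered frame must be held back roughly one real frame interval so an in-between frame can be shown first, and that interval balloons as base FPS drops. A toy calculation (a simplified lower-bound model, ignoring render queue and driver overhead):

```python
def added_latency_ms(base_fps):
    """Rough lower bound on the extra latency interpolation adds:
    about one full native frame interval of hold-back."""
    return 1000.0 / base_fps

# The hold-back cost shrinks as the base framerate rises:
for fps in (40, 60, 90):
    print(f"{fps:>2} base FPS -> roughly +{added_latency_ms(fps):.1f} ms")
```

So starting from 40 FPS costs roughly 25 ms of extra delay, versus about 11 ms from 90 FPS, which is why high base framerates make frame gen feel so much better.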
 
LTT did this exact test a while back: a blind gaming test between native and DLSS. I don't think FSR was in the mix that time. Could be an opportunity for another run at it with the next-gen stuff.

Personally, I want to see that challenge with DLSS with On Demand Frame Gen, compared to AMD's solution and, if Intel is still playing ball, Intel's solution.

Because honestly, between them all... I think on-demand frame gen is going to be the big one for gameplay.

If I can play a game right near 100 FPS at 4K, but have some rough dips in specific parts due to... whatever... if frame gen can step in and fill those gaps to keep frames right near 100 FPS... that could be a very good experience.

The thing is that frame gen works best when you already have high FPS (preferably over 90); adding frames below, say, 40 FPS introduces terrible lag, visible artifacts, and jerkiness. I don't see dynamic frame gen helping with that.

I really haven't read much about it, so I may be wrong.
 
The thing is that frame gen works best when you already have high FPS (preferably over 90); adding frames below, say, 40 FPS introduces terrible lag, visible artifacts, and jerkiness. I don't see dynamic frame gen helping with that.

I really haven't read much about it, so I may be wrong.
That is my understanding as well. Sounds like it will be good for maintaining frames, less good for getting to a 'good' overall FPS.

Kind of like the resolution scaling offered in games today, which I don't use. Frame gen could replace that, with the game/drivers using frame gen to make up for dropped frames on a limited basis...
 
That is my understanding as well. Sounds like it will be good for maintaining frames, less good for getting to a 'good' overall FPS.

Kind of like the resolution scaling offered in games today, which I don't use. Frame gen could replace that, with the game/drivers using frame gen to make up for dropped frames on a limited basis...

I think you can force framegen with AMD even in unsupported games, not that it looks good.

That said, I don't think you can fully bypass game developer support and do it via drivers, as you need motion vectors and other engine data for it to look reasonably good.
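
To see why engine-supplied motion vectors matter, here's a deliberately crude toy (nothing like any shipping implementation): shift each pixel half-way along its known motion vector to synthesize an in-between frame. A driver-level generator without those vectors has to estimate motion from pixels alone, which is where artifacts creep in:

```python
def warp_half_step(frame, motion):
    """Toy interpolation: move each pixel half-way along its motion
    vector to fake the in-between frame. `frame` is a 2D list of
    brightness values; `motion[r][c]` is that pixel's (drow, dcol)."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            dr, dc = motion[r][c]
            nr = min(max(r + dr // 2, 0), h - 1)
            nc = min(max(c + dc // 2, 0), w - 1)
            out[nr][nc] = frame[r][c]  # last writer wins; holes stay 0
    return out

# A bright pixel at column 0 during a 2-pixel camera pan to the right:
row = [[9, 0, 0, 0]]
pan = [[(0, 2)] * 4]
print(warp_half_step(row, pan))  # -> [[0, 9, 0, 0]]
```

Even this toy shows the failure modes: the hole left behind at the edge is a crude analogue of the disocclusion artifacts real frame gen has to fill in somehow.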
 
I'll agree to this. I've not used FrameGen at all. I'm against adding a multiplier of created frames to a base FPS experience. But just enough to keep the game smooth? That I'm less offended by. ;)
I used it for the second Horizon game, on a 3080 12GB alongside DLSS (so AMD's FrameGen, NVIDIA's DLSS).

It was ugly, but on that system at 3840x1600 all the alternatives were bad, and I wanted more eye candy lol
 
I've gotten used to using MFG, but in a really specific way. I use MFG x3 combined with either DLSS Quality or DLAA (I always smile a bit more when I can use this) at 4K and then limit FPS to 70-90 (depending on the game). This combination, to me, keeps the GPU from producing ridiculous frames where a lot of idiotic errors occur, while reducing the workload by quite a bit, since most of the titles I play are RT heavy. In addition, I also undervolt the 5090, so these settings are the cherry on top for low temps, power draw, and noise levels. This approach can reduce CPU workload as well.

Even if you're not undervolting, I do recommend trying this combination of settings at a resolution that is a little taxing but would otherwise allow over 100 FPS when using DLSS.

I haven't checked my 1% lows, but things are mostly super smooth (with the exception of games with known stuttering issues, though this approach greatly reduces even those), and FPS stays consistent at the limited rate. It's also nice knowing that every piece of silicon NV has employed is doing what it needs to, and the GPU isn't unnecessarily taking on all the work. Things also look completely natural, as opposed to the noticeable fakeness when reported FPS is 120-300.
 
I've gotten used to using MFG, but in a really specific way. I use MFG x3 combined with either DLSS Quality or DLAA (I always smile a bit more when I can use this) at 4K and then limit FPS to 70-90 (depending on the game).

Doesn't this essentially force you into a native framerate of (dividing by three for MFG x3) roughly 23-30 fps? It might look smooth due to frame gen, but the input lag ought to be atrocious.

We are talking something like 33-43 ms (excluding monitor)? (1/framerate * 1000.) I'm not sure I could play my games with that kind of input lag.

Or maybe that doesn't matter in the titles you play?

Generally, while I don't like the concept of frame generation, I'll use it if I am getting 60 fps minimums without it. That way my worst-case input lag (excluding monitor) is 16.67 ms.

I'm happy to play at native frames at my minimum of 60fps, but I'm OK with adding 1x frame gen in this case, but I usually don't.

The downside is that my LG C3's max refresh rate is 120hz, so if I enable v-sync on top of G-sync (which I usually do) that means that it won't render above 120fps, and when 1x framegen is involved, that means the native underlying framerate winds up pinned at 60fps.

In most cases, I would prefer a native framerate with good input lag that rises above 60 fps and only dips to 60 fps at the minimums, rather than one where the input lag is pretty much fixed at 60 fps levels (16.67 ms). That way I can play my title at 90 fps most of the time, so most of the time I have an input lag of 11.1 ms.

I'd much rather do a mild upscale (like DLSS 3 Quality in the old CNN model, but maybe even Performance in the new 4.5 Transformer model) than I would enable frame gen, because input lag is the top priority to me, not frame smoothness. On a full scene I find the frame smoothness difference between 60fps and 120fps to be minimal, but I find the input lag difference between those two to be rather significant.
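
The back-of-envelope numbers in this exchange all fall out of two tiny formulas, assuming "MFG xN" means N total presented frames per natively rendered one (a naming assumption, since vendors label the modes differently):

```python
def native_fps(presented_fps, mfg_mode):
    """Only 1 of every `mfg_mode` presented frames is natively rendered."""
    return presented_fps / mfg_mode

def frame_time_ms(fps):
    """One native frame interval, the worst-case input delay used above."""
    return 1000.0 / fps

# MFG x3 capped at 70-90 FPS -> native ~23-30 fps:
print(native_fps(70, 3), native_fps(90, 3))
# Single frame gen pinned at a 120 Hz cap -> native 60 fps, ~16.7 ms:
print(round(frame_time_ms(native_fps(120, 2)), 2))
```

This makes the trade-off explicit: a presented-FPS cap divided by the generation ratio sets your native rate, and the native rate alone sets the frame-time component of input lag.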
 