NVIDIA DLSS 4.5 Upscaling Performance Review & Analysis

magoo (joined Jul 9, 2019)
Introduction

This year at CES 2026, NVIDIA announced an upgrade to its Super Resolution software, dubbed DLSS 4.5. The new version promises improved image quality with a 2nd-gen Transformer model, new dynamic multi-frame generation, and a new 6X Frame Generation mode. All of these improvements are targeted at improving image quality […]

See full article...
 
You know what, that is impressive. The ability to maintain image quality at that level of upscaling is surprising, to say the least.

I'm looking forward to reading about on demand frame gen as well and how that works out. THAT will be a BEAST to test properly...

I could imagine it in MMOs with large gatherings of players over large spaces that would normally hinder a system. Perhaps that is the key? Regardless, that's the next thing I want to see... maintain the high-performance upscaling with minimal frame gen to smooth out any FPS dips at your target resolution... and that could be amazing.
 
AMD still has a long way to go with FSR regarding IQ. When FSR came out, it went up against DLSS 2; arguably, DLSS 2 Performance looked better than FSR 1 Quality mode.

Fast forward to Redstone vs. DLSS 4.1: the consensus is again that DLSS 4.1 Performance mode looks as good as or better than FSR 4 Quality.

I have been using DLSS 2 everywhere I can since it was released. Unfortunately, I have an RTX 3070 Ti, so no DLSS 4.1 for me because of the performance degradation.
 
Honestly, it feels like we're at the point where, unless you're specifically pixel peeping, most of this stuff is going to look roughly equivalent to everything else in actual gameplay (at least as far as upscaling is concerned; the jury is out on FG).

I think a far more entertaining test would be a "Pepsi Challenge" type situation: a double blind where gamers are shown gameplay of something FSR upscaled, DLSS upscaled, and then some TAA as a control, to see who can actually tell the difference in regular moment-to-moment gameplay.

Maybe if AMD's marketing team wasn't a bunch of lobotomized fruit flies they might have pulled something like that off already but alas.

First reviewer to pull that off gets a tip of my Cheeto stained fedora.
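For what it's worth, the tallying behind a double-blind test like that is simple enough to sketch. This is a hypothetical illustration, not anything from the review: the condition names and the one-in-three chance baseline are assumptions.

```python
import random

# Hypothetical sketch of the double-blind "Pepsi Challenge" described above:
# each trial shows a participant one of three conditions in random order, and
# we tally how often their guess matches the truth. Condition names are
# illustrative assumptions.
CONDITIONS = ["FSR", "DLSS", "TAA (control)"]

def run_trial(guess_fn):
    """Pick a condition at random, collect a guess, return (truth, guess)."""
    truth = random.choice(CONDITIONS)
    return truth, guess_fn()

def score(trials, guess_fn):
    """Fraction of trials where the participant identified the condition."""
    hits = sum(t == g for t, g in (run_trial(guess_fn) for _ in range(trials)))
    return hits / trials

# A participant who is purely guessing should converge on ~1/3 accuracy;
# anything well above that suggests the conditions are distinguishable.
random.seed(0)
blind_guesser = lambda: random.choice(CONDITIONS)
print(f"chance-level accuracy: {score(10000, blind_guesser):.3f}")
```

The interesting number in a real run would be how far each participant lands above that one-in-three baseline.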
 
Nice review.

Thank you for including STALKER 2.

Three Questions:

1.) Any word on what is required for a game to be compatible with the driver override? Is it any title that is already compatible with the first gen transformer model?

2.) Did you compare the image quality to previous versions of DLSS upscaling? I guess I am trying to understand if there is any reason to stick with the older model, or if the newer model is always better. I mean, in some cases there is a performance hit, but is that performance hit worth it in terms of image quality? And conversely, in many cases there is a performance increase. Does that performance increase sacrifice image quality at all?

3.) The sample images are all 1920x1080. It isn't stated explicitly, but I presume these screenshots are all posted unscaled?


In comparing the two screenshots, to my eyes Quality is ever so slightly sharper, but it is really difficult to tell the difference. Whereas I think Performance actually has slightly better anti-aliasing, particularly on the trees in STALKER 2.

Very interesting stuff. Thanks for the review. This wasn't even on my radar. I have had my head buried in my big build for so long that I haven't been keeping up with the news on stuff like this.
 
Honestly, it feels like we're at the point where, unless you're specifically pixel peeping, most of this stuff is going to look roughly equivalent to everything else in actual gameplay (at least as far as upscaling is concerned; the jury is out on FG).

I think a far more entertaining test would be a "Pepsi Challenge" type situation: a double blind where gamers are shown gameplay of something FSR upscaled, DLSS upscaled, and then some TAA as a control, to see who can actually tell the difference in regular moment-to-moment gameplay.

Maybe if AMD's marketing team wasn't a bunch of lobotomized fruit flies they might have pulled something like that off already but alas.

First reviewer to pull that off gets a tip of my Cheeto stained fedora.

I recall watching a YouTube video of that a while ago comparing FSR vs DLSS 2. Thing is, streaming compression kills subtle differences and sometimes exacerbates more obvious ones.
 
I recall watching a YouTube video of that a while ago comparing FSR vs DLSS 2. Thing is, streaming compression kills subtle differences and sometimes exacerbates more obvious ones.

- I mean a live, in-person test. It could be filmed ofc, but the people being subjected to the test would be seeing gameplay rendered in real time, au naturel.
 
Maybe if AMD's marketing team wasn't a bunch of lobotomized fruit flies they might have pulled something like that off already but alas.
Even more fun would be telling a large number of unsuspecting gamers that the FSR screen is using DLSS (and vice versa), to reveal the percentage of brand monkeys in the population :P
 
Even more fun would be telling a large number of unsuspecting gamers that the FSR screen is using DLSS (and vice versa), to reveal the percentage of brand monkeys in the population :P
LTT did this exact test a while back. Blind gaming test between native and DLSS, I don't think FSR was in the mix before. Could be an opportunity for another run at it with the next gen stuff.

Personally I want to see the challenge with DLSS with On Demand Frame Gen, compared to AMD's solution and if Intel is still playing ball, Intel's solution.

Because honestly, between them all... I think on-demand frame gen is going to be the big one for gameplay.

If I can play a game right near 100 FPS at 4k, but have some rough dips in specific parts due to... whatever... if frame gen can step in and fill those gaps to keep frames right near 100 FPS... that could be a very good experience.
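The "fill the dips" idea above can be sketched as a toy model: leave the frame stream alone while it holds the target, and only insert a generated frame when a base frame runs long. This is an illustrative guess at how "on demand" might behave, not NVIDIA's actual algorithm; the target and the 1.5x dip threshold are assumptions.

```python
# Toy model of "on demand" frame generation: pass fast frames through
# untouched, and split any frame that blows the budget by inserting one
# interpolated frame halfway through. TARGET_FPS and the 1.5x threshold
# are illustrative assumptions, not from the review.
TARGET_FPS = 100.0

def smooth(frame_times_ms, target_fps=TARGET_FPS):
    """Insert one generated frame whenever a base frame runs long."""
    budget_ms = 1000.0 / target_fps          # 10 ms per displayed frame at 100 FPS
    displayed = []
    for ft in frame_times_ms:
        if ft > budget_ms * 1.5:             # a dip: split it with a generated frame
            displayed += [ft / 2, ft / 2]    # generated frame shown at the midpoint
        else:
            displayed.append(ft)             # fast enough, leave it alone
    return displayed

# A steady 10 ms stream with one 30 ms hitch: the hitch becomes two 15 ms
# intervals, while the rest of the stream is untouched.
print(smooth([10, 10, 30, 10]))  # → [10, 10, 15.0, 15.0, 10]
```

The appeal of this scheme is exactly what the post describes: generated frames only ever appear during the dips, so the steady-state experience stays fully rendered.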
 
Frame Gen tech has me concerned on two fronts. Maybe my concerns aren't valid, but it's enough to leave me unimpressed and not excited.

First - Fidelity. Is what you are seeing what the developer intended? It may look beautiful, but if it's not what the developer intended, then you may as well be feeding prompts to a GenAI LLM - which can generate some beautiful images, but I wouldn't go so far as to call AI an artist.

Second - Consistency. Does the scaler generate the same image each play through, on all platforms? Does it scale consistently and rationally between various resolutions and scaler multipliers? This touches on the Fidelity point, but takes it a step further and applies it to various conditions.

If the algorithm employed can't accomplish these two things, does it matter what FPS it's getting?
 
Frame Gen tech has me concerned on two fronts. Maybe my concerns aren't valid, but it's enough to leave me unimpressed and not excited.

First - Fidelity. Is what you are seeing what the developer intended? It may look beautiful, but if it's not what the developer intended, then you may as well be feeding prompts to a GenAI LLM - which can generate some beautiful images, but I wouldn't go so far as to call AI an artist.

Second - Consistency. Does the scaler generate the same image each play through, on all platforms? Does it scale consistently and rationally between various resolutions and scaler multipliers? This touches on the Fidelity point, but takes it a step further and applies it to various conditions.

If the algorithm employed can't accomplish these two things, does it matter what FPS it's getting?
I'll agree to this. I've not consumed FrameGen at all. I'm against adding a multiplier of created frames to a base FPS experience, but I'm less offended by just enough to keep the game smooth. ;)
 
I'll agree to this. I've not consumed FrameGen at all. I'm against adding a multiplier of created frames to a base FPS experience, but I'm less offended by just enough to keep the game smooth. ;)

- The problem with all frame gen solutions is that they perform worst when you need them most. A low base FPS means input lag and way more artifacting than if you're starting with a high base FPS.

I'm going to give a couple of games a shot (Cyberpunk, for example) with AMD's FG turned on, so I can see for myself how it works out boosting from 40 FPS with all the RT bells and whistles turned on.
 
LTT did this exact test a while back. Blind gaming test between native and DLSS, I don't think FSR was in the mix before. Could be an opportunity for another run at it with the next gen stuff.

Personally I want to see the challenge with DLSS with On Demand Frame Gen, compared to AMD's solution and if Intel is still playing ball, Intel's solution.

Because honestly, between them all... I think on-demand frame gen is going to be the big one for gameplay.

If I can play a game right near 100 FPS at 4k, but have some rough dips in specific parts due to... whatever... if frame gen can step in and fill those gaps to keep frames right near 100 FPS... that could be a very good experience.

The thing is that frame gen works best when you already have high FPS (preferably over 90 fps); adding frames under, say, 40 fps introduces terrible lag, visible artifacts, and jerkiness. I don't see dynamic frame gen helping with that.

I really haven't read much about it, so I may be wrong.
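The rough arithmetic behind that point: interpolation-based frame gen has to hold a rendered frame until the next one arrives before it can generate the in-between, so it adds on the order of one base frame time of input latency. The "one frame time" model below is a simplification; real implementations vary.

```python
# Simplified latency model for interpolation-based frame generation:
# the latency penalty is roughly one base frame time, so it shrinks as
# the base FPS rises. "One frame time" is a simplifying assumption.
def base_frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 40, 60, 90, 120):
    print(f"{fps:>3} base FPS -> ~{base_frame_time_ms(fps):.1f} ms added latency")
```

Under this model, the penalty at a 40 FPS base (~25 ms) is three times the penalty at 120 FPS (~8.3 ms), which is exactly why frame gen feels worst in the situations where you'd most want the extra frames.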
 
The thing is that frame gen works best when you already have high FPS (preferably over 90 fps); adding frames under, say, 40 fps introduces terrible lag, visible artifacts, and jerkiness. I don't see dynamic frame gen helping with that.

I really haven't read much about it, so I may be wrong.
That is my understanding as well. Sounds like it will be good for maintaining frames, less good for getting to a 'good' overall FPS.

Kind of like the resolution scaling offered in games today, which I don't use. Frame gen could replace that, where the game/drivers use frame gen to make up for dropped frames on a limited basis...
 
That is my understanding as well. Sounds like it will be good for maintaining frames, less good for getting to a 'good' overall FPS.

Kind of like the resolution scaling offered in games today, which I don't use. Frame gen could replace that, where the game/drivers use frame gen to make up for dropped frames on a limited basis...

I think you can force framegen with AMD even in unsupported games, not that it looks good.

I don't think you can bypass game developer support and do it via drivers, as you need motion vectors and other data for it to look reasonably good.
 
I'll agree to this. I've not consumed FrameGen at all. I'm against adding a multiplier of created frames to a base FPS experience. But enough to keep the game smooth I'm less offended by. ;)
I used it for the second Horizon game - on a 3080 12GB, alongside DLSS (so AMD's FrameGen, NVIDIA's DLSS).

It was ugly, but on that system at 3840x1600, all the alternatives were bad and I wanted more eyecandy lol
 