DF Nioh 2 DLSS analysis

Auer


Not my kind of game but interesting analysis.
 
Can we get a TL;DR for those of us who are video-opposed?
 
So, if I am understanding this correctly, DLSS is better than TAA. OK. How is it better than native, though?

The entire reason you have AA is to eliminate artifacts from lines at angles where pixelation occurs. More pixels is the most obvious cure and brute-force solution for that problem; AA is a visual workaround.

I can totally get that DLSS is a superior form of anti-aliasing. I can't get how it's visually superior to higher resolution.
 
Native 4K without any AA, as in Nioh 2, still has some jaggies, and DLSS can fix them while rendering at 1440p, then scale up to 4K and look better than native.
Depending on the game, it does work, and very well. And it seems we keep getting a few more games here and there these days.

Playing Death Stranding with an RTX 2070 at 4K was pretty cool for me, personally. And a few other games as well.
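
To put rough numbers on the "render at 1440p, output at 4K" part (my own back-of-the-envelope arithmetic, not something from the video; the 1440p internal resolution is just the example from the post above):

# Pixel counts behind the performance gain.
native_4k = 3840 * 2160       # 8,294,400 pixels per frame
internal_1440p = 2560 * 1440  # 3,686,400 pixels per frame
print(f"1440p shades {internal_1440p / native_4k:.0%} of the pixels of native 4K")
# -> roughly 44%, which is where the extra FPS headroom comes from,
#    minus the fixed cost of the DLSS pass itself.

That gap is the whole trick: the reconstruction pass has to recover the missing detail more cheaply than rendering those pixels would have cost.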
 
You may want to read the article; I think it explains it pretty well.

But to sum it up: it's magic!!! Witchcraft!!!

Just a note: apparently DLSS works best at 4K and up, since it has more input to begin with.
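
For a rough sense of why, here are the commonly cited DLSS 2.0 per-axis scale factors plugged into a few output resolutions (the exact factors are my assumption and can vary by game and mode, so treat this as illustrative only):

# Commonly cited DLSS 2.0 scale factors (assumed, approximate):
# Quality ~0.667x, Balanced ~0.58x, Performance ~0.5x per axis.
# The internal render scales with the output, so a 4K target hands the
# network far more input pixels than a 1080p target does.
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}
for out_w, out_h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    for mode, scale in modes.items():
        print(f"{out_w}x{out_h} {mode:<12} -> internal ~{round(out_w * scale)}x{round(out_h * scale)}")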
 
Yeah, the rule of thumb with DLSS is that it's got to be at 4K to really be worth anything.

The convoluted myth with AA and 4K is that it was once believed higher resolutions would be the cure. However, if the source assets are still built for 1080p or some such, you're still going to have aliasing at those higher resolutions. Anyone who's played an older game at 4K will have experienced it. Some things scale better than others, but those weaknesses are definitely apparent.

On the other hand, if all the textures, models, etc. are rendered at higher resolutions, then most cards can't handle it. People will complain about various optimizations being the culprit, and they can be, but it takes powerful equipment to render that kind of quality. Odds are we'll never reach that magic top-end res where imperfections just go away, which means some kind of magic ju-ju will always be needed somewhere.
 
OK, I did read the article.

What I'm coming away with is that DLSS looks better than 4K without AA (which we are calling "native"), and it yields higher performance than native 4K since it's upscaled from a lower resolution.

Hmm. So if I boil it all down, what we are saying is that a game looks better with AA than without? And the other takeaway is that rendering at a lower resolution is faster than rendering at a higher one?

I don't get what DLSS is bringing to the table, really - those two takeaways seem like they were pretty well accepted before "deep learning" became a buzzword.
 
I think on some level it's like those texture mods some people make using machine learning to upscale the original assets, but with DLSS it's done in real time using whatever model NV and the developers have come up with. DLSS 2.0 allows a much greater range now.

I've done a lot of testing with it on and off at various resolutions, all the way down to 1080p, with a handful of games. At that level it's pretty awful. At 1440p it really depends, but I still don't recommend it. At 4K it can also depend: if I'm close enough to my LG C9 or Z9D I can absolutely see a decrease in IQ, but sit back a bit and use it, and I can enjoy the other perks like higher FPS. Turning AA on or off has its ups and downs, but to me the gains with it on, and maxed, are not usually worth the hits.

At some point there's bound to be a non-proprietary solution that does the same. We've been hearing about it for a while, and everyone in the industry is constantly tossing their own brand of magic around. For now, DLSS is almost like NV's way of gaining back some of that lost SLI performance without all of those headaches. Tensor cores/RT cores are, to me, basically a way for them to cram more processors onto the board without going full-on mGPU.
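
If it helps to picture the real-time part, here is a very stripped-down toy sketch of the temporal-upscaling idea (my own simplification, not NVIDIA's actual algorithm; DLSS replaces the constant blend below with a trained network on the tensor cores and uses real per-pixel motion vectors):

import numpy as np

def upscale_nearest(frame, scale):
    """Naive spatial upsample of an (H, W) image by an integer factor."""
    return np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1)

def temporal_upscale(low_res_frame, history, motion_px, scale=2, blend=0.1):
    """Blend the current low-res frame into reprojected high-res history.

    low_res_frame: (H, W) frame rendered at reduced resolution
    history:       (H*scale, W*scale) accumulated output, or None on frame 0
    motion_px:     whole-pixel (dy, dx) camera motion since the last frame
                   (a stand-in for real per-pixel motion vectors)
    """
    current = upscale_nearest(low_res_frame, scale)
    if history is None:
        return current
    # Reproject last frame's result to this frame's camera position.
    warped = np.roll(history, shift=motion_px, axis=(0, 1))
    # Keep most of the history, fold in a little new detail each frame.
    # In DLSS the network decides this weighting per pixel instead.
    return (1.0 - blend) * warped + blend * current

# Fake 960x540 frames accumulated into a 1920x1080 output.
rng = np.random.default_rng(0)
history = None
for _ in range(4):
    low_res = rng.random((540, 960)).astype(np.float32)
    history = temporal_upscale(low_res, history, motion_px=(0, 1))
print("output resolution:", history.shape)  # (1080, 1920)

The texture-mod comparison holds up: both lean on a learned model to produce plausible detail. The difference is that DLSS has to do it in a couple of milliseconds per frame, with motion vectors and previous frames to help.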
 
My testing has been limited to a 4K 32" LG monitor and whatever games I can play on it.
It's great to be able to turn on DLSS and get good IQ and decent FPS. I wish a lot more games had the option.
Simply put, to me it added a lot of value to my now-aging RTX 2070.
 
Often noted as the best value for that gen too!
 