Starfield Runs Better with NVIDIA DLSS, Benchmarks for AMD-Sponsored Game Show

Tsing
The FPS Review staff member
Should Starfield have launched with NVIDIA DLSS? Yeah, probably, as benchmarks for the game's beta update have begun showing up online, revealing what is apparently the better choice for those who need a performance boost in Bethesda's newish sci-fi RPG (hint: almost everyone).

See full article...
 
There was never gonna be any surprise that DLSS runs better, especially since community-provided DLSS mods/DLLs were already proving it. A very clear "no duh" or "no sh1t Sherlock" situation for Starfield. But that's how it always goes with DLSS: it provides the best image quality and the best performance compared to FSR and XeSS.
 
NEWS: Competitor's software technology runs better on competitor's hardware!
In other news: Competitor's software doesn't run at all on other hardware!
Elsewhere: Water is wet!

Seems like if I were a developer, I would first implement features that can help all users and add hardware-specific items at a later time.

I write this as an NVidia user. I had no problem with them doing FSR first because it makes sense for the studio to do, even if it weren't AMD-sponsored.
The article comes across like it was written by NVidia or with an agenda.
Did FSR do better than native? Yes. Was it poor looking? No. Did it work for everyone? Yes.
Hard to argue with that.
 
Honestly, I think the better comparison to come is FSR3 versus DLSS with framegen. I'll be sure to tinker with FSR3.

Though to be honest... I've seen nary a hitch playing at 1440p with my setup, everything maxed, no FSR.

Remember when the 10 series came out... it was so much faster that they introduced a technology to run games at a HIGHER resolution and scale it down to improve fidelity at lower resolutions?

Seems like a distant memory now.
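For anyone curious about the actual arithmetic: DSR factors multiply the total pixel count, so each axis scales by the square root of the factor. A quick sketch (the factor list matches what the driver exposes, but treat the rounding as illustrative):

```python
import math

# NVIDIA DSR factors multiply the *total* pixel count, so each
# axis scales by the square root of the factor.
DSR_FACTORS = [1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00]

def dsr_render_resolution(native_w: int, native_h: int, factor: float):
    """Internal render resolution for a given DSR factor."""
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

for f in DSR_FACTORS:
    w, h = dsr_render_resolution(1920, 1080, f)
    print(f"{f:.2f}x -> render {w}x{h}, downscale to 1920x1080")
```

At 4.00x on a 1080p display that's rendering a full 4K frame and scaling it down, which is exactly the trick described above.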
 
I'm not a fan of DLSS. Makes everything look like dog poo.
And FSR looks even worse, sometimes way worse. But I have seen implementations of both that look pretty good too, and the same goes for XeSS.

I would much rather use DLAA. Good gawd, the image quality provided by DLAA is absolutely incredible. AMD's answer to this was announced as part of FSR3, and I can't wait to check it out. I'm not expecting results as good as DLAA, but it should still be an improvement over all the shader-based AA methods like TAA. I was a fan of FXAA back in the day despite its shortcomings, but SMAA was always my favorite shader-based AA. I never cared for TAA. Now f*ck all of them, cuz DLAA is here.

But yeah, I use DLAA cuz as a 3090 owner (which I got for free) who games at 1440p, I often don't need the upscaling or the performance boost that comes with it. I either run games at native 1440p, or use nVidia DSR to run games at 4K downscaled to 1440p (or, if a game provides in-game functions to do this, I use that instead of DSR).

Seems like if I were a developer, I would first implement features that can help all users and add hardware-specific items at a later time.

I write this as an NVidia user. I had no problem with them doing FSR first because it makes sense for the studio to do, even if it weren't AMD-sponsored.
While I normally would agree with this, and tend to prefer sh1t that isn't locked to specific hardware (and I dislike proprietary tech), it's also true that the vast majority of PC gamers use nVidia hardware. So I think it makes sense to implement features that most people gaming on PC will be able to use. On the other hand, literally everyone can use FSR and XeSS, so I guess that makes those users the wider audience. Honestly though, there's no excuse not to ship with all 3 out the gate. Other devs do it, but apparently Bethesda couldn't be bothered, cuz they are lazy (big surprise).
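To put the "ship all three" point in concrete terms, the selection logic really isn't hard. Here's a hypothetical sketch in Python (real engines do this in C++, and all the names here are made up for illustration):

```python
from enum import Enum

class Upscaler(Enum):
    DLSS = "DLSS"  # NVIDIA RTX only (needs Tensor cores)
    XESS = "XeSS"  # best on Intel Arc, runs elsewhere via a DP4a fallback
    FSR = "FSR"    # pure shader-based, runs on any modern GPU

def pick_default_upscaler(vendor: str, has_tensor_cores: bool) -> Upscaler:
    """Hypothetical default pick; the player can still override it in settings."""
    if vendor == "NVIDIA" and has_tensor_cores:
        return Upscaler.DLSS
    if vendor == "Intel":
        return Upscaler.XESS
    return Upscaler.FSR  # universal fallback

print(pick_default_upscaler("NVIDIA", True))  # Upscaler.DLSS
print(pick_default_upscaler("AMD", False))    # Upscaler.FSR
```

The actual work is integrating and QA-ing each SDK; the "which one do we default to" part is trivial.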
 
I like my 7900xtx, and I liked my XFX rx6900xt black edition before that; it was a very nice upgrade from my 1080ti.

I have used both NV and AMD GPUs, both discrete and mobile versions. I think both are awesome and do a great job.

That said, AMD does seem to have some catching up to do on dedicated AI and RT hardware. NV has some catching up to do on pure raster performance, though with AI/DLSS bypassing the need for native 4K rasterization, not as much.

Or in other words I agree with the above sentiment that making a stink about DLSS doing better is much like saying water is wet.

What is it with Bethesda and buggy games? Pretty games, revolutionary games, awesome and incredible games, but always (almost like a universal constant) buggy games.

Maybe in math we should add a new constant β to go with e and π?
 
Dedicated hardware/Tensor cores can process more complex upscaling/reconstruction faster than shaders.

Who knew :p
Well, actually, from what I've seen, depending on the game, FSR wins some, DLSS wins some, and they're mostly tied. DLSS does have better IQ, though.
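If you want to see that hardware gap firsthand rather than take it on faith, here's a rough PyTorch microbenchmark sketch. It assumes a CUDA-capable RTX card, and it's a plain matrix multiply rather than an actual upscaler, so treat it purely as an illustration: FP16 matmuls get routed to the Tensor cores, while FP32 (with TF32 disabled) mostly runs on the standard ALUs, the same kind of units shader code uses.

```python
import time
import torch

torch.backends.cuda.matmul.allow_tf32 = False  # keep FP32 off the Tensor cores

def matmul_tflops(dtype, n: int = 4096, iters: int = 50) -> float:
    """Measure matrix-multiply throughput in TFLOP/s for a given dtype."""
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()
    return iters * 2 * n**3 / (time.perf_counter() - start) / 1e12

print(f"FP32 (standard ALUs): {matmul_tflops(torch.float32):.1f} TFLOP/s")
print(f"FP16 (Tensor cores):  {matmul_tflops(torch.float16):.1f} TFLOP/s")
```

On anything RTX the FP16 number should come out several times higher, which is the kind of headroom DLSS's reconstruction network runs in.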
 
What is it with Bethesda and buggy games? Pretty games, revolutionary games, awesome and incredible games, but always (almost like a universal constant) buggy games.
I used to work for a freelance QA software/game testing company, and we were located near ZeniMax and Bethesda (my parents also live in that area). They were a constant client of ours. In college I also had friends who worked directly for Bethesda Softworks' QA department.

You'd be surprised by how many bugs the testers DO find (and in time for launch, too), but upper management often ignores a good amount of the bug reports and decides not to waste time or effort on fixing them (some get pushed to later, a good amount gets pushed to "never"). Not to mention the old-@ss engine with roots in Gamebryo, bogged down in decades of patchwork code meant to keep it current enough to keep making modern games on.

On top of all this, the big console manufacturers Sony and Microsoft would often waive issues during the certification process (including automatic cert-failing issues) to help Bethesda games keep their launch dates, cuz Bethesda games pulled in a f*ckton of money on consoles as well as PC. Imagine how things will be now that Microsoft owns Bethesda. They'll probably let all kinds of sh1t slip through on purpose just to make sure the gravy train arrives on time.

At the end of the day, Bethesda Softworks just plain didn't care when it came to QA. And of course with that attitude came a lot of cutbacks and staff layoffs in the QA department over the years. Good discounts at the company store, though.
 
While I normally would agree with this, and tend to prefer sh1t that isn't locked to specific hardware (and I dislike proprietary tech), it's also true that the vast majority of PC gamers use nVidia hardware. So I think it makes sense to implement features that most people gaming on PC will be able to use. On the other hand, literally everyone can use FSR and XeSS, so I guess that makes those users the wider audience. Honestly though, there's no excuse not to ship with all 3 out the gate. Other devs do it, but apparently Bethesda couldn't be bothered, cuz they are lazy (big surprise).
I would guess the consoles use FSR as well. But maybe not.
 
I'd rather not have to use any kind of upscaling, but at 4K you often have no choice, and to me it's a visually better alternative than reducing the game's visual quality by disabling features.
Thanks to DLSS my 4080 runs anything my previous 4090 ran.
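For a sense of how much work the upscaler is actually saving at 4K, here's a quick sketch using the commonly cited per-axis render scales for the DLSS/FSR 2 quality modes (the exact percentages vary a bit by implementation, so treat them as approximate):

```python
# Commonly cited per-axis render scales for DLSS / FSR 2 quality modes.
QUALITY_MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float):
    """Internal render resolution before upscaling to the output size."""
    return round(out_w * scale), round(out_h * scale)

for mode, scale in QUALITY_MODES.items():
    w, h = internal_resolution(3840, 2160, scale)
    print(f"{mode:>17}: {w}x{h} (~{scale * scale:.0%} of the 4K pixel load)")
```

Quality mode at 4K is rendering roughly 1440p internally, which is why even upper-tier cards pick up so much headroom.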
 
Consoles providing satisfactory or even good experiences for gamers on a 4K display is surprising... IF you don't recall the days when a minimum of 30 fps was considered good for gaming. ;)

PC gamers... especially us, are FAR more demanding of the upper tier of gaming experiences than any console gamer would be. That's part of the reason we pitch a fit at FPS-locked titles ported from consoles.
 