AMD Announces It Is Bethesda’s Exclusive PC Partner for Starfield, Killing All Hopes of NVIDIA DLSS 3 and Frame Generation Implementation: “We ...

Tsing

The FPS Review
Staff member
AMD has shared a video confirming that it is the exclusive PC partner for Starfield. This seems to imply that the latest game from Todd Howard and Bethesda Game Studios isn't getting an official implementation of NVIDIA's DLSS technologies (e.g., DLSS 3 Super Resolution, Reflex, and Frame Generation) any time soon, although Jack Huynh, SVP and GM of AMD's Computing and Graphics Group, has promised that the collaboration will "unlock the full potential of Starfield." Starfield is out on September 6, 2023, with AMD FidelityFX Super Resolution 2.

See full article...
 
Cool I guess. Interesting how Nvidia doesn't like the shoe on the other foot.
 
Cool I guess. Interesting how Nvidia doesn't like the shoe on the other foot.
It's not NVIDIA that both AMD and Bethesda will piss off if it has FSR exclusivity and lacks next gen features

It's the near 90% of the dGPU market that isn't AMD
 
It's not NVIDIA that both AMD and Bethesda will piss off if it has FSR exclusivity and lacks next gen features

It's the near 90% of the dGPU market that isn't AMD
Yep..... Kind of like exclusives..
 
It's not NVIDIA that both AMD and Bethesda will piss off if it has FSR exclusivity and lacks next gen features

It's the near 90% of the dGPU market that isn't AMD
Except that almost every single one of those GPUs can run FSR but only a fraction, and likely a small fraction at that, can run DLSS. And beyond that, the console market dwarfs the GPU market and guess what happens to be powering those consoles?

It's amazing how much butthurt there is over nVidia being left out in the upscaling race with its proprietary upscaling. It's also quite funny that nVidia fans seem to think nVidia is the whole of the market when it's not even remotely true.

I guess DLSS is becoming a worse and worse selling point...
 
It's also quite funny that nVidia fans seem to think nVidia is the whole of the market when it's not even remotely true.
Surprisingly, almost 40% is RTX 2060 and newer NVIDIA GPUs. So not a small fraction.

Just because AMD is the underdog doesn't mean exclusivity deals are cool.
 
Surprisingly, almost 40% is RTX 2060 and newer NVIDIA GPUs. So not a small fraction.

Just because AMD is the underdog doesn't mean exclusivity deals are cool.
Just curious, where do you get that figure?

If I look at the Steam Hardware Survey, I see around 25% (just eyeballin' it) of the total market share might be DLSS-compatible in one version or another, and only about 1% supports the latest DLSS 3.

I mean, 25% isn't insignificant, but it's still a long way from every nVidia card out there. This chart also goes to show how far behind AMD is - the ancient 580 is still their top identified card out there (although it was a decent one available for a really good price for a long time).

[Attached chart: Steam Hardware Survey GPU share]

 
NVIDIA's history of shenanigans doesn't make it easy to remain objective on such matters, but if AMD is prohibiting developers from including competitor technologies such as DLSS, I don't think that's a good thing.
 
Just curious, where do you get that figure?

If I look at the Steam Hardware Survey, I see around 25% (just eyeballin' it) of the total market share might be DLSS-compatible in one version or another, and only about 1% supports the latest DLSS 3.
I wasn't eyeballing it; I copied the data into Excel and summed up the percentages for all NVIDIA GPUs from the 2060 up. And no, I didn't goof it up, I only counted the May column.
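If anyone wants to sanity-check that kind of tally, here's a rough sketch of the same calculation in Python. The card names and percentages below are made-up placeholders, not the actual May survey rows:

```python
# Sum the survey share of every NVIDIA RTX card (the 2060 and newer all carry
# the "RTX" prefix; the GTX 16-series this filter excludes can't run DLSS anyway).
# The entries and percentages here are placeholders for illustration only.
survey_share = {
    "NVIDIA GeForce RTX 3060": 10.7,
    "NVIDIA GeForce RTX 2060": 4.6,
    "NVIDIA GeForce GTX 1060": 5.0,   # pre-RTX, excluded by the filter
    "AMD Radeon RX 580": 0.9,
}

def dlss_capable(name: str) -> bool:
    """Rough filter for DLSS-capable cards: NVIDIA RTX 20-series and newer."""
    return name.startswith("NVIDIA GeForce RTX")

total = sum(pct for name, pct in survey_share.items() if dlss_capable(name))
print(f"RTX 2060 and newer: {total:.1f}% of surveyed GPUs")
```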
 
Love them or hate them, Nvidia pushes the technology forward. For easily the last 15 years, every leap in performance/features/quality has been a new Nvidia technology, the main ones being adaptive sync (G-Sync), raytracing, and DLSS. Another good one is the Reflex technology to reduce latency.

AMD has copied Nvidia's moves 2 to 4 years later on average, and in all three examples has done so with a less-than-stellar implementation.

When it comes to FSR working on everything, it does, but it doesn't work 'well'. Go try it in Jedi Survivor. You have to have FSR if you run an AMD card and enable raytracing to get playable performance. Fast enough Nvidia cards should be able to play with FSR off. I play Cyberpunk 2077 with DLSS/FSR off, and it plays fine.

What truly sucks is that AMD's meddling (removal of DLSS support) results in a game engine that cannot properly run UNLESS FSR is enabled at some level. If you have to enable it, put it at Quality at least. But I tried to play it with FSR off and it was a buggy crashy mess. Seriously, it was pretty terrible. And a far poorer experience than Fallen Order was. So ultimately you have to use FSR even if you don't need it and it hampers both visual fidelity and performance (save for curing all of the CTD's).

This is how it went with Jedi Survivor. The game comes out, and launch-day performance was terrible: around 40 fps, peaks at 44 fps, and lows dropping as low as 18. Enable FSR, same exact frames. Disable raytracing, same exact frames. Apply some manual setting edits to the GameConfiguration.ini file buried under your user profile and BAM, all of a sudden you are getting 60+ fps with raytracing enabled and FSR disabled! It was glorious for about a week. Then the patches start coming out. Image quality takes a step back, and stability gets much worse. That is, if you still have FSR disabled. Enable it and, wow, the game doesn't crash. It's worth noting that these issues were much less pronounced on AMD hardware... which makes it look like AMD sabotaged Nvidia performance, as well as paying to have Nvidia's tech 'not supported'. It's a bad look.

I blame AMD's meddling for the problems... the devs made a well-performing and stable Fallen Order in UE4; Survivor was still UE4 but was AMD-sponsored, had no DLSS, very limited raytracing features, and was just a buggy mess.

"But consoles! Games are made for Consoles now and those all use AMD gpu's! So, simple they make it for a console then that's what PC gets! It's cheaper!"

There may be some truth to this, but really, for a triple-A title that is going to sell 100 million copies, it doesn't matter if 80% of those sales are consoles; you still have 20 MILLION customers buying your game to play on PC, and of those, between 5 and 10 million have raytracing- and DLSS-capable cards. Millions of users. Saying "Oh, get used to it, that is normal because that is only 20% of the market" isn't a valid excuse. A game dev making a game to release on different hardware can afford to do what is required so that all copies sold are stable, well-performing games. And there's no reason it shouldn't have varying levels of texture quality/shadow quality/draw distance, etc. These are features that are standard in games, and the game engines all support all of these sliders.

A new game selling 100 million copies pulls in 6 billion dollars! You're trying to tell me they can't afford to put decent raytracing and DLSS feature support in? If only 5 million of those users had raytracing- and DLSS-capable cards, that's still 300 million dollars in sales. Lack of support is inexcusable.
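For what it's worth, the arithmetic behind those figures checks out if you assume a $60 launch price (an assumption on my part, but it's what the $6 billion and $300 million numbers imply):

```python
# Back-of-the-envelope math from the post above; the $60 price is an assumption.
price_per_copy = 60            # USD, assumed launch price
total_copies = 100_000_000     # the hypothetical 100-million seller
console_share = 0.80           # "80% of those sales are consoles"
rt_dlss_buyers = 5_000_000     # low end of the 5-10 million estimate

pc_copies = round(total_copies * (1 - console_share))
print(f"PC copies: {pc_copies:,}")                                       # 20,000,000
print(f"Total revenue: ${total_copies * price_per_copy:,}")              # $6,000,000,000
print(f"Revenue from RT/DLSS-capable buyers: ${rt_dlss_buyers * price_per_copy:,}")  # $300,000,000
```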

The way I see it, is that AMD is holding PC gaming back. Console gaming too really. This isn't something any gamer should be ok with.
 
Except that almost every single one of those GPUs can run FSR but only a fraction, and likely a small fraction at that, can run DLSS. And beyond that, the console market dwarfs the GPU market and guess what happens to be powering those consoles?

It's amazing how much butthurt there is over nVidia being left out in the upscaling race with its proprietary upscaling. It's also quite funny that nVidia fans seem to think nVidia is the whole of the market when it's not even remotely true.

I guess DLSS is becoming a worse and worse selling point...
The minimum system requirement is a 1070 Ti, so the VAST majority of people playing on PC will be using an RTX card.

As per the survey, the 3090 has a higher share than the 6600.
The 84% number is impressive, but the reality of new card sales is that they're almost all RTX-capable.
FSR is a visual quality downgrade in every iteration I've experienced it.

So without DLSS I'm stuck using either TAAU (if it's in there) or no upscaling, which sucks unless this version of FSR removes the obvious artifacts it normally has.

I probably will just hold off playing this game for a while at this stage.
 
Love them or hate them, Nvidia pushes the technology forward. For easily the last 15 years, every leap in performance/features/quality has been a new Nvidia technology, the main ones being adaptive sync (G-Sync), raytracing, and DLSS. Another good one is the Reflex technology to reduce latency.

AMD has copied Nvidia's moves 2 to 4 years later on average, and in all three examples has done so with a less-than-stellar implementation.

When it comes to FSR working on everything, it does, but it doesn't work 'well'. Go try it in Jedi Survivor. You have to have FSR if you run an AMD card and enable raytracing to get playable performance. Fast enough Nvidia cards should be able to play with FSR off. I play Cyberpunk 2077 with DLSS/FSR off, and it plays fine.

What truly sucks is that AMD's meddling (removal of DLSS support) results in a game engine that cannot properly run UNLESS FSR is enabled at some level. If you have to enable it, put it at Quality at least. But I tried to play it with FSR off and it was a buggy crashy mess. Seriously, it was pretty terrible. And a far poorer experience than Fallen Order was. So ultimately you have to use FSR even if you don't need it and it hampers both visual fidelity and performance (save for curing all of the CTD's).

This is how it went with Jedi Survivor. The game comes out, and launch-day performance was terrible: around 40 fps, peaks at 44 fps, and lows dropping as low as 18. Enable FSR, same exact frames. Disable raytracing, same exact frames. Apply some manual setting edits to the GameConfiguration.ini file buried under your user profile and BAM, all of a sudden you are getting 60+ fps with raytracing enabled and FSR disabled! It was glorious for about a week. Then the patches start coming out. Image quality takes a step back, and stability gets much worse. That is, if you still have FSR disabled. Enable it and, wow, the game doesn't crash. It's worth noting that these issues were much less pronounced on AMD hardware... which makes it look like AMD sabotaged Nvidia performance, as well as paying to have Nvidia's tech 'not supported'. It's a bad look.

I blame AMD's meddling for the problems... the devs made a well-performing and stable Fallen Order in UE4; Survivor was still UE4 but was AMD-sponsored, had no DLSS, very limited raytracing features, and was just a buggy mess.

"But consoles! Games are made for Consoles now and those all use AMD gpu's! So, simple they make it for a console then that's what PC gets! It's cheaper!"

There may be some truth to this, but really, for a triple-A title that is going to sell 100 million copies, it doesn't matter if 80% of those sales are consoles; you still have 20 MILLION customers buying your game to play on PC, and of those, between 5 and 10 million have raytracing- and DLSS-capable cards. Millions of users. Saying "Oh, get used to it, that is normal because that is only 20% of the market" isn't a valid excuse. A game dev making a game to release on different hardware can afford to do what is required so that all copies sold are stable, well-performing games. And there's no reason it shouldn't have varying levels of texture quality/shadow quality/draw distance, etc. These are features that are standard in games, and the game engines all support all of these sliders.

A new game selling 100 million copies pulls in 6 billion dollars! You're trying to tell me they can't afford to put decent raytracing and DLSS feature support in? If only 5 million of those users had raytracing- and DLSS-capable cards, that's still 300 million dollars in sales. Lack of support is inexcusable.

The way I see it, is that AMD is holding PC gaming back. Console gaming too really. This isn't something any gamer should be ok with.
I think you may be attributing juuust a tiny bit too much blame to AMD. If the past is any guide, all AMD does is provide some tools, if even that, and let everyone else do their thing. It may be corporate policy, it may be a broke-ness mentality, but since forever, the articles that come around routinely (in waves) say that AMD does next to nothing for partners, incredibly, including on the hardware side. So if you have some kind of issue, it's likely developer incompetence above all else, and most likely a LACK of AMD's involvement, if history teaches us anything.
 
I still question the value of debating over upscaling methods when the best solution is to not need it in the first place
 
I still question the value of debating over upscaling methods when the best solution is to not need it in the first place
We'll get there, eventually, but the current trend of linear increase in price with transistor node shrinkage means it will still be needed for affordable performance. The 4090 can already run things like Cyberpunk 2077 full overdrive RT without DLSS at sub-4K resolution smoothly. At 4K we're talking 40 FPS without DLSS, at worst. Hopefully with more competitors getting into the microchip manufacturing market, wafer cost can come down back to at least Pascal levels eventually. So long as TSMC remains the majority manufacturer, capacity will always come at a premium.
 
We'll get there, eventually, but the current trend of linear increase in price with transistor node shrinkage means it will still be needed for affordable performance. The 4090 can already run things like Cyberpunk 2077 full overdrive RT without DLSS at sub-4K resolution smoothly. At 4K we're talking 40 FPS without DLSS, at worst. Hopefully with more competitors getting into the microchip manufacturing market, wafer cost can come down back to at least Pascal levels eventually. So long as TSMC remains the majority manufacturer, capacity will always come at a premium.
It's pretty clear companies like Intel are making a move to deliver. Also, corporate entities like Nvidia like the availability of TSMC and gladly pay the price as long as their enterprise chips are pulling in investors and capital.

I think it's interesting that Intel is slated to become a consumer-grade chip manufacturer for Nvidia, as Apple, Nvidia, and others keep their top-of-the-line production largely on TSMC.

In truth, TSMC needs to keep buying newer fabs as they come online for large discounts on the development cost. At least that's how I see it.
 
I think you may be attributing juuust a tiny bit too much blame to AMD. If the past is any guide, all AMD does is provide some tools, if even that, and let everyone else do their thing. It may be corporate policy, it may be a broke-ness mentality, but since forever, the articles that come around routinely (in waves) say that AMD does next to nothing for partners, incredibly, including on the hardware side. So if you have some kind of issue, it's likely developer incompetence above all else, and most likely a LACK of AMD's involvement, if history teaches us anything.

You are probably right.

It looks like AMD is sabotaging games. But the reality is that it is more likely that ALL of the dev studios suffer from a lack of effort to get games working well on multiple hardware platforms (save for id). Since Nvidia is very helpful with support when the title is sponsored by them, hardware support gets the proper attention it needs. But when it is an AMD-sponsored game, there's no support from AMD for the graphics hardware/pipeline (other than saying "here ya go, go download this tool"), so we get these buggy messes.

AMD doesn't really have any excuse these days (for not providing proper support). Their processors are popular and selling in higher quantities than ever before. They have money, just not the right mindset. They barely serve as competition in the GPU market. If Intel gets good at making/supporting GPUs, AMD will become the bottom-tier crap. And that will be AMD's own fault. They bought ATI, and it's just been on life support ever since. No new innovative technologies, just copying Nvidia to keep up.

I still question the value of debating over upscaling methods when the best solution is to not need it in the first place
Yep. I always turn it off. Except in the case of the poor FSR implementation in Jedi Survivor, where I had to turn it on or the game would crash in the same places every time. I couldn't finish the game without turning it on.

I did combine two different mods for Jedi Survivor, a raytracing crash fix + a resolution scaling fix. So now it works with raytracing on and FSR off; I played for several hours last night with no crashes. The file is attached; rename it back to .pak and place it in \EA Games\Jedi Survivor\SwGame\Content\Paks. To uninstall it, just delete it.

Edit: The raytracing fix did cure the crashes I had during cutscenes; I can play through the Bode fight and on through the credits now, but I just experienced a crash on Jedha. At least it is looking better and performing better. Devs need to get their **** together.
 

Attachments

  • pakchunk99-Mods_RT+ResolutionScalingFixes.pak.txt
    184.4 KB
You are probably right.

It looks like AMD is sabotaging games. But the reality is that it is more likely that ALL of the dev studios suffer from a lack of effort to get games working well on multiple hardware platforms (save for id). Since Nvidia is very helpful with support when the title is sponsored by them, hardware support gets the proper attention it needs. But when it is an AMD-sponsored game, there's no support from AMD for the graphics hardware/pipeline (other than saying "here ya go, go download this tool"), so we get these buggy messes.

AMD doesn't really have any excuse these days (for not providing proper support). Their processors are popular and selling in higher quantities than ever before. They have money, just not the right mindset. They barely serve as competition in the GPU market. If Intel gets good at making/supporting GPUs, AMD will become the bottom-tier crap. And that will be AMD's own fault. They bought ATI, and it's just been on life support ever since. No new innovative technologies, just copying Nvidia to keep up.


Yep. I always turn it off. Except in the case of the poor FSR implementation in Jedi Survivor, where I had to turn it on or the game would crash in the same places every time. I couldn't finish the game without turning it on.

I did combine two different mods for Jedi Survivor, a raytracing crash fix + a resolution scaling fix. So now it works with raytracing on and FSR off; I played for several hours last night with no crashes. The file is attached; rename it back to .pak and place it in \EA Games\Jedi Survivor\SwGame\Content\Paks. To uninstall it, just delete it.

Edit: The raytracing fix did cure the crashes I had during cutscenes; I can play through the Bode fight and on through the credits now, but I just experienced a crash on Jedha. At least it is looking better and performing better. Devs need to get their **** together.
I'd agree AMD isn't pushing hard enough. I also agree that if Intel invests in all the other aspects of its GPUs, AMD will find themselves in second place quickly, even with only mid- and low-end offerings from Intel. There is a lot to do and a lot to offer when it comes to GPUs, starting with optimizing games and drivers (talking beyond just stability), in partnerships or not. Supporting and coming up with all manner of hardware acceleration APIs and such for video encoding, photo editing, and so on; anything will help greatly.
AMD should/should have recognized this, given up its 'professional' GPUs, which can't seem to compete with Nvidia anyway, rolled all those features into 'regular' GPUs, and built a big support and development department for their cards. Making a good GPU chip is cool, but doing that and then sitting on their hands ain't helping long term. I agree they have more money.
 