Love them or hate them, Nvidia pushes the technology forward. For easily the last 15 years, every leap in performance, features, and quality has come from a new Nvidia technology. The main ones are adaptive sync (G-Sync), ray tracing, and DLSS. Another good one is Reflex, which reduces latency.
AMD has copied Nvidia's moves two to four years later on average, and in all three of those examples the implementation has been less than stellar.
When it comes to FSR working on everything, it does, but it doesn't work 'well'. Go try it in Jedi: Survivor. If you run an AMD card and enable ray tracing, you have to use FSR to get playable performance. Fast enough Nvidia cards should be able to play with FSR off. I play Cyberpunk 2077 with DLSS/FSR off, and it runs fine.
What truly sucks is that AMD's meddling (the removal of DLSS support) results in a game engine that cannot run properly UNLESS FSR is enabled at some level. If you have to enable it, at least set it to Quality. I tried to play with FSR off and it was a buggy, crashy mess. Seriously, it was pretty terrible, and a far poorer experience than Fallen Order was. So ultimately you have to use FSR even if you don't need it, and it hampers both visual fidelity and performance (aside from curing all of the CTDs).
This is how it went with Jedi: Survivor. The game comes out, and launch-day performance is terrible: 40 fps, peaks of 44, and lows dropping as far as 18. Enable FSR: same exact frames. Disable ray tracing: same exact frames. Apply some manual edits to the GameConfiguration.ini file buried under your user profile, and BAM, all of a sudden you're getting 60+ fps with ray tracing enabled and FSR disabled! It was glorious for about a week. Then the patches started coming out. Image quality took a step back and stability got much worse... that is, if you still had FSR disabled. Enable it, and wow, the game doesn't crash. It's worth noting that these issues were much less pronounced on AMD hardware, which makes it look like AMD sabotaged Nvidia performance on top of paying to have Nvidia's tech 'not supported'. It's a bad look.
I blame AMD's meddling for the problems. The devs made a well-performing and stable Fallen Order in UE4; Survivor was still UE4, but it was AMD-sponsored, shipped with no DLSS and very limited ray tracing features, and was just a buggy mess.
"But consoles! Games are made for consoles now, and those all use AMD GPUs! So, simple: they make it for a console and that's what PC gets! It's cheaper!"
There may be some truth to this, but really, for a triple-A title that's going to sell 100 million copies, it doesn't matter if 80% of those sales are on consoles: you still have 20 MILLION customers buying your game to play on PC, and between 5 and 10 million of those have ray tracing and DLSS capable cards. Millions of users. Saying "oh, get used to it, that's normal because it's only 20% of the market" isn't a valid excuse. A game dev releasing on different hardware can afford to do what's required so that every copy sold is a stable, well-performing game. And there's no reason it shouldn't offer varying levels of texture quality, shadow quality, draw distance, etc. These are standard features in games, and the game engines all support these sliders.
A new game selling 100 million copies pulls in 6 billion dollars! You're trying to tell me they can't afford to put decent ray tracing and DLSS support in? Even if only 5 million of those users had ray tracing and DLSS capable cards, that's 300 million dollars in sales. Lack of support is inexcusable.
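The back-of-envelope math here is easy to check yourself. A quick sketch (assuming a $60 sale price, which is what the $6 billion figure for 100 million copies implies; the share and capable-card numbers are the rough estimates from above, not hard data):

```python
# Back-of-envelope revenue math for the argument above.
# Assumption: $60 per copy (implied by $6B / 100M copies).
PRICE_USD = 60

total_copies = 100_000_000   # hypothetical AAA sales figure
pc_share = 0.20              # ~20% of sales on PC
rt_capable = 5_000_000       # low-end estimate of RT/DLSS-capable PC buyers

pc_customers = int(total_copies * pc_share)
total_revenue = total_copies * PRICE_USD
rt_revenue = rt_capable * PRICE_USD

print(f"PC customers: {pc_customers:,}")            # 20,000,000
print(f"Total revenue: ${total_revenue:,}")         # $6,000,000,000
print(f"RT/DLSS-capable revenue: ${rt_revenue:,}")  # $300,000,000
```

Even the low-end estimate of ray tracing capable buyers represents nine figures of revenue, which is the whole point.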
The way I see it, AMD is holding PC gaming back. Console gaming too, really. That isn't something any gamer should be OK with.