Freesync works. Nvidia's G-Sync monitors are locked to Nvidia cards for that feature, and are more expensive because of it. There are plenty of Freesync monitors out there with equivalent or better features. Just because more budget manufacturers take advantage of Freesync as a selling point doesn't make G-Sync objectively better. Just a reason for it to be objectively more expensive.
Freesync works... not as well. Objectively. And subjectively when compared 1:1.
I won't debate whether I'd rather have Freesync or nothing, but where possible I'm willing to pay for G-Sync. Of course, that's usually in a price range where the G-Sync premium is a small fractional increase over the base price of the monitor; no, I wouldn't pay double for G-Sync over Freesync if, say, the base price were US$100.
Caveats aside, Nvidia has earned plenty of complaints for how G-Sync has been handled: the cost, the input options, the lack of AMD Freesync compatibility for most of its releases, and an active cooling solution that monitor manufacturers inevitably implemented poorly. But G-Sync also has the following it has because Nvidia went and solved all the problems created by how stupidly modern display protocols have been set up. And they solved them right from the beginning.
And with respect to LCD technology at least, G-Sync still has the technical edge. Let's not even get into AMD's complete failure of a certification regime for Freesync; it might as well not exist, and it has been functionally replaced by Nvidia's 'G-Sync Compatible' badge, because that badge is the only way to know the monitor manufacturer didn't half-arse a particular revision of a particular model, even if said monitor will only ever be used with an AMD GPU.
Last, the latest G-Sync module works with Freesync and AMD GPUs. I assume it still costs more, but it's an option, and it has value.
Like Freesync, it will be cheaper, easier to adopt, and "good enough".
Nvidia basically had to co-opt an existing feature to make DLSS work, and they had to garner developer support to do it. And they had to put together significant infrastructure on their end to support it and refine it.
These are not things AMD is known for doing well. Granted, Nvidia is killing them in this regard, so perhaps AMD sees that as justification for a far higher investment effort than is typical, but at that point the coin flip isn't looking so good. The only upside is that they'll be treading ground Nvidia has already paved. Call it a balanced coin flip.
And to be honest, I'd rather have a card that's fast enough not to need tricks like this... although such tricks are nice for extending the lifespan of a card.
This is... a catch-22 that could easily dive deep into philosophy.
So look at it this way: if power / heat / noise / battery life aren't a concern you probably want the faster hardware. But if they are a concern, you want more efficient hardware, which is what DLSS does, big picture.
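The efficiency trade is easy to quantify: DLSS renders at a lower internal resolution and upscales the result. A rough sketch of the arithmetic, using the per-axis scale factors commonly cited for DLSS 2.x quality modes (these are an assumption here; actual factors can vary by title and DLSS version):

```python
# Rough sketch: how much rendering work DLSS saves at 4K output.
# Per-axis scale factors below are the commonly cited DLSS 2.x
# defaults (an assumption; games may use different values).
modes = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

out_w, out_h = 3840, 2160  # 4K output resolution

for mode, scale in modes.items():
    w, h = int(out_w * scale), int(out_h * scale)
    # Shading cost scales roughly with pixel count, so the squared
    # axis factor approximates the fraction of native rendering work.
    frac = (w * h) / (out_w * out_h)
    print(f"{mode}: render {w}x{h}, ~{frac:.0%} of native pixels")
```

Performance mode, for example, renders 1920x1080 internally, a quarter of the native 4K pixel count, which is where the power and heat headroom comes from.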
It's not that I don't want a mobile 3090, to pick a purely hypothetical example; it's that I can get the same performance using DLSS on a mobile 3070, and I can get it in a quiet 15" notebook instead of a hulking 17" DTR cooled by a leaf blower.
Obviously there are tradeoffs, but unlike hardware, you can just not use DLSS!