AMD Launching FidelityFX Super Resolution on June 22 for Radeon and GeForce Graphics Cards

Tsing

The FPS Review
Staff member
Image: AMD



AMD has finally lifted the veil on its highly anticipated NVIDIA DLSS competitor, FidelityFX Super Resolution (FSR). As explained by Scott Herkelman during tonight's Computex 2021 event, the red team's spatial upscaling algorithm promises a substantial performance uplift and lets gamers choose between four quality modes, ranging from Ultra Quality for greater visual fidelity to Performance for higher frame rates. Launching on June 22 for select titles, AMD FidelityFX Super Resolution will be available not only for Radeon products (i.e., Radeon RX 6000, RX 5000, RX 500, and RX Vega Series, and all Ryzen processors with Radeon graphics) but for NVIDIA's GeForce 10, 20, and 30 Series graphics cards as well...
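For a concrete sense of what those quality modes mean in practice, here is a minimal Python sketch of how a per-mode scale factor could map a target output resolution to the internal render resolution. The four mode names come from AMD's announcement; the scale factors and the `render_resolution` helper are illustrative assumptions, not numbers AMD has published.

```python
# Hypothetical sketch: per-mode scale factors mapping an output resolution
# to an internal render resolution. Mode names are from AMD's announcement;
# the factors below are assumptions for illustration only.

FSR_MODES = {
    "ultra_quality": 1.3,  # assumed per-axis scale factor
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution the GPU renders at before upscaling to out_w x out_h."""
    factor = FSR_MODES[mode]
    return round(out_w / factor), round(out_h / factor)

for mode in FSR_MODES:
    print(f"{mode:>13}: {render_resolution(3840, 2160, mode)}")
```

The higher the factor, the fewer pixels the GPU actually shades per frame, which is where the performance headroom comes from.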



 
Kudos to AMD for making it compatible with Nvidia's products. Can't help but feel this is going to repeat FreeSync vs. G-Sync: G-Sync is the better tech but hardware-limited, while FreeSync is good enough and more widely available.
 
I don't know about that. On the software side, AMD often embraces open source or brand agnostic solutions which fail to achieve the market saturation of NVIDIA's solutions.
 
I recall NVIDIA did a shader-based DLSS for Control, known as "DLSS 1.9," and it was pretty good, but DLSS 2.0 is way better.
As long as it performs and looks comparably to DLSS 2.0, I think AMD will be fine.
 
With the market the way it is, anyone will be glad to have something to improve performance as long as it doesn't kill fidelity. I'm all for this, and I already use an AMD card, just because I got what I could get when I could get it.
 
"Spatial Upscaling" is apparently the buzzword here.

AMD is supporting a pretty good backlog of hardware: recent Polaris (500 series and up), all Vega, and all RDNA. So this is a software solution, not requiring any dedicated hardware. I didn't catch how far back they are supporting nVidia hardware, but at least as far back as Pascal (a 1060 was demoed running it).

I don't think it needs to beat DLSS (I mean, really, who can beat something that is "better than native" (*chuckle*)). I don't even really think it needs to exist, honestly; it sounds like yet another buzzword-driven marketing point that only exists to counter the competition. But then again, I think the same thing about DLSS, which was only brought about because RT performance was so utterly in the hole that nVidia needed something buzzworthy to keep it pertinent.

It still requires integration at the developer level. Adoption here will more or less come down to whether AMD provides compensation for the integration into a title: they will probably pay a few AAA games to drop it in, help a few prominent indies drop it in pro bono, and hope it gains some traction to proceed on its own after that. More or less the exact same thing nVidia is doing with DLSS.

I can't really see anything in compressed YouTube videos about how it works, but I can't see anything on DLSS either, so I have to reserve judgment from a technical perspective. I ~expect~ it will be about the same as Epic's most recent Unreal Engine 5 announcement (Temporal Super Resolution - another good buzzword).
 
I guess we need to create some nomenclature to allow us to really talk about these techniques as a whole.

All of these techniques involve some form of 'upscaling' beyond the static techniques of the past. Maybe call them 'smartscaling'?

I will say that I am very glad that AMD is working this too. Even without the resources that Nvidia has poured into DLSS, they're still making notable progress toward workable solutions that are more universal; they'll probably run on Intel's discrete GPUs as well, and perhaps even Intel's IGPs, where larger compromises in image quality are more acceptable!
 
Remember when the big trick was to render higher and downscale? That was basically the first form of anti-aliasing, and you had things like nVidia DSR and AMD VSR. Now we are working on upscaling. How the pendulum swings....
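For reference, the downsampling half of that pendulum is simple enough to sketch in a few lines of numpy: render at a multiple of the display resolution, then filter down so every output pixel averages several rendered samples. This is a box-filter simplification of the idea (DSR, for instance, defaults to a Gaussian filter, so this is an illustration of the principle, not either vendor's exact filter):

```python
import numpy as np

def downscale_2x(frame: np.ndarray) -> np.ndarray:
    """Box-filter 2x downsample: each output pixel is the average of a
    2x2 block of rendered pixels. frame is (H, W, 3) with even H and W."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Render at 4K internally, display at 1080p: 4 samples per output pixel.
hi_res = np.random.rand(2160, 3840, 3)  # stand-in for a rendered frame
lo_res = downscale_2x(hi_res)           # shape (1080, 1920, 3)
```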
 
CRTs did this more or less natively back in the day. They still look 'better' for certain things, including console games in the pre-digital era.

Where these 'smartscaling' techniques are going to become necessary is with the proliferation of monitors and other displays whose pixel density exceeds what average human vision can resolve (roughly 100 dpi at typical desktop viewing distances). Not a 'today' problem, but one that's going to need solving soon.
 

I bet when midrange GPUs handle native 4K without breaking a sweat, we will see a return of render high, then downscale.
 
"Like other upscaling techniques, FSR will live or die by how clean of an image it produces."

"In our pre-briefing with AMD, the company did confirm that FSR is going to be a purely spatial upscaling technology; it will operate on a frame-by-frame basis, without taking into account motion data (motion vectors) from the game itself.

For GPU junkies, many of you will recognize this as a similar strategy to how NVIDIA designed DLSS 1.0, which was all about spatial upscaling by using pre-trained, game-specific neural network models. DLSS 1.0 was ultimately a failure – it couldn’t consistently produce acceptable results and temporal artifacting was all too common. It wasn’t until NVIDIA introduced DLSS 2.0, a significantly expanded version of the technology that integrated motion vector data (essentially creating Temporal AA on steroids), that they finally got DLSS as we know it in working order."

"Spatial is a lot easier to do on the backend – and requires a lot less work from developers – but the lack of motion vector data presents some challenges. In particular, motion vectors are the traditional solution to countering temporal artifacting in TAA/DLSS, which is what ensures that there are no frame-by-frame oddities or other rendering errors from moving objects."
From: https://www.anandtech.com/show/1672...x-super-resolution-open-source-game-upscaling
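To make the spatial-versus-temporal distinction in that quote concrete, here's a rough numpy sketch (my own illustration, not AMD's or NVIDIA's actual algorithms): a spatial upscaler looks only at the current frame, while a temporal one reprojects an accumulated history buffer along per-pixel motion vectors before blending in the new frame. When those motion vectors are wrong or missing, you get exactly the ghosting and temporal artifacting the article describes.

```python
import numpy as np

def spatial_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Purely spatial: each output pixel derives from the current frame
    only (nearest-neighbor here for brevity; FSR uses a far smarter
    edge-adaptive filter, but the single-frame principle is the same)."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

def temporal_accumulate(frame, history, motion, alpha=0.1):
    """Temporal accumulation: reproject last frame's accumulated image
    along per-pixel motion vectors, then blend in the new frame.
    frame, history: (H, W, 3); motion: (H, W, 2) screen-space offsets in pixels."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(ys - motion[..., 1].round().astype(int), 0, h - 1)
    src_x = np.clip(xs - motion[..., 0].round().astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]          # where each pixel "was"
    return alpha * frame + (1 - alpha) * reprojected
```

The blend weight `alpha` (an assumed parameter here) controls how much each new frame contributes versus the accumulated history; it's the knob that trades added detail against ghosting.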
 
"Like other upscaling techniques, FSR will live or die by how clean of an image it produces."

"In our pre-briefing with AMD, the company did confirm that FSR is going to be a purely spatial upscaling technology; it will operate on a frame-by-frame basis, without taking into account motion data (motion vectors) from the game itself.

For GPU junkies, many of you will recognize this as a similar strategy to how NVIDIA designed DLSS 1.0, which was all about spatial upscaling by using pre-trained, game-specific neural network models. DLSS 1.0 was ultimately a failure – it couldn’t consistently produce acceptable results and temporal artifacting was all too common. It wasn’t until NVIDIA introduced DLSS 2.0, a significantly expanded version of the technology that integrated motion vector data (essentially creating Temporal AA on steroids), that they finally got DLSS as we know it in working order."

"Spatial is a lot easier to do on the backend – and requires a lot less work from developers – but the lack of motion vector data presents some challenges. In particular, motion vectors are the traditional solution to countering temporal artifacting in TAA/DLSS, which is what ensures that there are no frame-by-frame oddities or other rendering errors from moving objects."
From: https://www.anandtech.com/show/1672...x-super-resolution-open-source-game-upscaling


Well, there were a lot of things different between DLSS 1.0 and 2.0, not the least of which was that 1.0 required an offline neural net to train for specific resolutions… so I don't know that I would infer anything based on that.

I would point out, in your defense, that Epic is using a temporal algorithm in their implementation as well.

But, as always, the proof will be in the pudding.
 
I look at Epic's initiative as being in service of their primary customers: console developers.

I expect that it will fall short of what Nvidia can achieve, since competing with DLSS isn't their goal. Rather, they're trying to make games look better on consoles that fall back into potato-grade graphics levels within a year of their release.

Same thing for AMD's technology, really. AMD really, really needs to be targeting this stuff at their APUs, and vice versa.
 
That's my birthday.... that's all I saw. Not like I have anything that will benefit from this news. Maybe the 2400G, but, not holding my breath.
I stick to playing Minesweeper these days. That and crossword puzzles. Between macrame classes of course.

Used to buy cars cheaper than currently priced GPUs. Running ones, even.
 