Radeon RX 6000 Series Offers Better Performance per Dollar than NVIDIA GeForce Graphics Cards: AMD

Tsing

The FPS Review
Staff member
Frank Azor, Chief Architect of Gaming Solutions & Marketing at AMD, has tweeted an image that suggests the red team's family of Radeon RX 6000 Series graphics cards offers better performance per dollar than NVIDIA's competing GeForce products for gamers.
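For context, "performance per dollar" here is just average frame rate divided by price. A minimal Python sketch of the comparison, using purely hypothetical FPS and price figures rather than anything from AMD's slide:

# Performance-per-dollar comparison sketch. All numbers are hypothetical
# placeholders, not benchmark results from AMD's slide or any review.
cards = {
    "Card A": {"avg_fps": 100, "price_usd": 500},
    "Card B": {"avg_fps": 90, "price_usd": 400},
}

for name, c in cards.items():
    fps_per_dollar = c["avg_fps"] / c["price_usd"]
    print(f"{name}: {fps_per_dollar:.3f} FPS per dollar")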

 
I think that a refreshed set of numbers with current BIOS and driver updates to show the current state of AMD vs. NVIDIA card performance would be interesting.

We're at a point now where data from six months ago doesn't reflect the performance improvements of AMD cards. I can't speak to NVIDIA cards; I'm curious to see how they have improved as well. (I suppose we'll see once the new set promising double-digit performance improvements hits.)
 
I think that a refreshed set of numbers with current BIOS and driver updates to show the current state of AMD vs. NVIDIA card performance would be interesting.

We're at a point now where data from six months ago doesn't reflect the performance improvements of AMD cards. I can't speak to NVIDIA cards; I'm curious to see how they have improved as well. (I suppose we'll see once the new set promising double-digit performance improvements hits.)
It should be pretty easy to predict, though - with lagging RT performance and no real challenger for DLSS, AMD is at a two-front disadvantage, and those are hardware limitations that they're not going to be remedying for RX6000.

A bigger concern for consumers is just whether those make any difference, as well as AMD's software support for various content-creation and, say, AI/ML workloads.

Being faster in rasterization workloads and generally being available with more VRAM are nice, but they're not the whole picture.
 
with lagging RT performance and no real challenger for DLSS
I think one of those is accurate - the other not so much any longer. I'd also question just how pertinent RT performance really is in most gaming scenarios. Sure, there are a few games that use it, but it isn't enough to make me want to throw a lot of extra money at it. At least not yet, in these early generation titles and hardware.

I certainly prioritize rasterizing performance over all else when I look at gaming --- RT performance is just a bonus, as is DLSS/FSR -- not all games use those, but almost every game uses rasterizing and will continue to do so into the foreseeable future. That said, up until very recently the only thing that outweighed rasterizing performance was actual availability: that appears to at least be getting improved on all fronts.
 
OH Wait a Minute! Frank Azor

He was the dbag who was talking smack about availability and told everyone they just needed to learn to F5.
 
I'd just like to see the games the guy chose to represent FPS per dollar spent.

For almost a year, AMD has been like vaporware...

The latest Steam Survey lays it all out there pretty well.
The WHOLE of AMD leads Intel integrated graphics by about 0.4%.
80% of all users have an NVIDIA GPU, but in the entire April survey there is NO meaningful percentage of 3xxx GPUs, nor is there any mention of AMD 6xxx usage at all above 0.05%.

Where are all the expensive cards going? Mining?
 
I think one of those is accurate - the other not so much any longer. I'd also question just how pertinent RT performance really is in most gaming scenarios. Sure, there are a few games that use it, but it isn't enough to make me want to throw a lot of extra money at it. At least not yet, in these early generation titles and hardware.

I certainly prioritize rasterizing performance over all else when I look at gaming --- RT performance is just a bonus, as is DLSS/FSR -- not all games use those, but almost every game uses rasterizing and will continue to do so into the foreseeable future. That said, up until very recently the only thing that outweighed rasterizing performance was actual availability: that appears to at least be getting improved on all fronts.
DLSS isn't a bonus. Basically it boils down to this:

Here is ray tracing - it's cool and can make a game look better (arbitrary rating: it looks 10 "points" better). But our stuff isn't fast enough to actually use it, so here is a workaround to make your better-looking options look worse (arbitrary deduction for DLSS: -8 "points"). Net result of DLSS? Why bother.
 
Remember when video card performance was so good that they were touting being able to run at a higher-than-native resolution and then scale the image down to your native resolution because of 'sharpness' or something? This is the inverse of that.
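To make the "inverse" point concrete, here's a rough Python sketch of internal render resolutions at 1440p. The DLSS per-axis scale factors are the commonly cited approximate ones (Quality ~0.667, Performance ~0.5), and the 2x supersampling factor is just an illustration:

# Internal render resolution vs. native for downscaling (supersampling) and
# upscaling (DLSS-style) approaches. Scale factors are approximate/illustrative.
native_w, native_h = 2560, 1440

modes = {
    "2x supersampling (render above native, scale down)": 1.414,
    "Native": 1.0,
    "DLSS Quality (render below native, scale up)": 0.667,
    "DLSS Performance": 0.5,
}

for mode, scale in modes.items():
    w, h = round(native_w * scale), round(native_h * scale)
    pct = (w * h) / (native_w * native_h)
    print(f"{mode}: {w}x{h} ({pct:.0%} of native pixel count)")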
 
Definitely an interesting perspective shared by Frank Azor of AMD.


The 6650 XT and 3060 Ti officially have the same "MSRP," but one is classified as a 1440p card while the other is classified as a 1080p card!

To put the question the other way round: if the 3060 Ti starts selling for $400, then what should the street price of the 6650 XT be?

[Attached screenshot of the slide]
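For what it's worth, the arithmetic behind that question is simple: if two cards are priced for equal performance per dollar, the price scales with relative performance. A back-of-the-envelope Python sketch, where the 10% performance gap is a made-up placeholder rather than anything from the slide:

# Back-of-the-envelope street-price implication of equal performance per dollar.
# relative_performance is a hypothetical placeholder, not AMD's claim.
reference_price = 400.0      # the $400 3060 Ti street price posed above
relative_performance = 1.10  # hypothetical: the other card is 10% faster on average

implied_price = reference_price * relative_performance
print(f"Street price for equal FPS per dollar: ${implied_price:.0f}")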
 
DLSS isn't a bonus. Basically it boils down to this:

Here is ray tracing - it's cool and can make a game look better (arbitrary rating: it looks 10 "points" better). But our stuff isn't fast enough to actually use it, so here is a workaround to make your better-looking options look worse (arbitrary deduction for DLSS: -8 "points"). Net result of DLSS? Why bother.
This would be true if DLSS were only useful with RT - however, it's useful everywhere. RTX2060? RTX3050? 8K monitor? Laptop with dGPU?

It's not always worth the tradeoff, but the benefit is there and is real.

Remember when video card performance was so good that they were touting being able to run at a higher-than-native resolution and then scale the image down to your native resolution because of 'sharpness' or something? This is the inverse of that.
Well, supersampling is still being touted in various forms - consider the vast majority of games that remain popular, or are becoming popular, that don't really stress graphics performance.


Now, consider being a gamer that plays games that fall into both of the above categories.

Definitely an interesting perspective shared by Frank Azor of AMD.


The 6650 XT and 3060 Ti officially have the same "MSRP," but one is classified as a 1440p card while the other is classified as a 1080p card!

To put the question the other way round: if the 3060 Ti starts selling for $400, then what should the street price of the 6650 XT be?

Remember that marketing slides are produced by marketing departments.
 
Agreed - and that's a lot of games, and a lot of gamers. Fair or not, there's real inertia behind DLSS.
Not to be contrary... OK, to be contrary... sure, NVIDIA has some movement on that front. Now that FSR is more mature, AND a solution that can be used on current-generation Xbox and PlayStation hardware, you don't think it will see speedy adoption?
 
Agreed - and that's a lot of games, and a lot of gamers. Fair or not, there's real inertia behind DLSS.
Yeah, there's a good number of games. 129, by last count I could find (here). A lot of them are heavily played AAA games, like Fortnite, Call of Duty, CP2077, and Minecraft.

For a tech that's hardware-locked to generations of GPUs that have been out of reach for most of their release, that isn't shabby. I will give you that NVIDIA has a very dominant place in gamer hardware right now, and a lot of gamers are being exposed to DLSS.

But.

DLSS just passed its third birthday (released February 2019). There's been a lot of games released since 2019, and apart from some large and notable AAA releases (which likely got promotional support from NVIDIA), it hasn't seen broad adoption. 129 is just a drop in the bucket compared to the more than 50,000 games listed on Steam. I don't know that I'd call it inertia just yet. Promising, yes, but for technology that has drop-in support for both Unreal and Unity, I would expect a lot more support, and lately it seems the buzz has worn off and it just isn't getting picked up. Maybe I'm just not in the right circles to be noticing the buzz, though.
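In rough numbers, that "drop in the bucket" works out to (using only the figures quoted above):

# Rough share of Steam's catalog with DLSS support, per the counts above.
dlss_games = 129       # count cited above
steam_games = 50_000   # "more than 50,000" listed on Steam; treat as a lower bound
print(f"{dlss_games / steam_games:.2%} of listed titles support DLSS")  # ~0.26%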

Maybe as we see GPU availability come back to the realm of sanity, and the mid-to-lower tiers actually become affordable and attainable (you know, the ones most people were buying in the first place), adoption will pick up -- if those cards support DLSS.

~~~~~

Some interesting numbers, courtesy of the ever controversial Steam Hardware Survey, April '22 edition. I will only mention one of my takeaways - I'll let everyone come to their own conclusions regarding this, or just disregard it based on the source as you see fit.

The top GPU is the GTX 1060, at 7.15%.

#2 and #3 are the GTX 1660 and GTX 1050 Ti, at 6.48% and 5.63%, respectively.

Of the top 10 GPUs, the most capable / expensive of them all is the RTX 3060, placed at #10, with 2.18%.

Of the top 10 GPUs installed, only 3 are DLSS capable.

The top 10 GPUs represent a combined ~40% of the total. Shares per card plunge to the 1% range after this.

The top 20 GPUs represent a combined ~55% of the total. Shares per card plunge to <1% range shortly after this.

Of all GPUs, approximately 21.5% are DLSS capable. My only takeaway is this - this number is much higher than I expected it to be (although nowhere near what it should be, given that a near-bottom-tier card three generations old is the #1 card), and does give some evidence that DLSS ~could~ pick up. I just don't see huge adoption among developers to support that yet, though. Promising, but not critical mass yet.
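As a sanity check, those takeaways can be tallied straight from the per-GPU shares. A small Python sketch using only the handful of April '22 figures quoted in this post, not the full survey table:

# Tally combined share and DLSS-capable share from per-GPU survey percentages.
# Only the few cards quoted above are listed; the full survey has many more.
survey = [
    # (model, April '22 share %, DLSS capable)
    ("GTX 1060",    7.15, False),
    ("GTX 1660",    6.48, False),
    ("GTX 1050 Ti", 5.63, False),
    ("RTX 3060",    2.18, True),
]

combined = sum(share for _, share, _ in survey)
dlss = sum(share for _, share, capable in survey if capable)
print(f"Combined share of listed cards: {combined:.2f}%")
print(f"DLSS-capable share among listed cards: {dlss:.2f}%")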
 