GIGABYTE Radeon RX 6500 XT EAGLE 4G Review

Brent_Justice





Introduction



Today, AMD is launching its new Radeon RX 6500 XT GPU, a completely new ASIC built from the ground up on TSMC's new N6 manufacturing node.  This is the world's first consumer 6nm GPU, and while that is certainly impressive, what about the rest of the specs?  Are they enough to justify the new MSRP, and the pricing realm we find ourselves in today, for what the 6500 XT offers?



First and foremost, the AMD Radeon RX 6500 XT is AMD's answer to the entry-level-to-mainstream video card market.  Think of the Radeon RX 6500 XT as an entry-level video card if you are just getting into gaming, or want to upgrade from an older video card like the Radeon RX 570 or GeForce GTX 1650.  In fact, you might even think of it as a means to move from integrated graphics to discrete graphics performance, or at least the most cost-effective way to do so.



AMD...

Continue reading...
 
Nice that you got a review sample - that bodes well for availability.

It may have drawbacks, but at this point I'd say anything that can stay on the shelves for near MSRP is going to be a huge improvement.
 
More like 150% of MSRP for the out of stock listings I've seen - but that's still an improvement! Somehow!

I also appreciate the testing of the transcoding... err, 'limited decoding' block. I generally disapprove of Intel's IGP-less 'F' SKUs because they cut off Quick Sync along with the IGP, as well as AMD's release of mostly GPU-less desktop parts.

I'm not about to spit on working, available GPUs, but I think AMD cut just a few too many corners here - and they still didn't get it down to the sub-75W level of power draw needed for powering cards directly through the PCIe slot.
 
Wow, that was a little worse than I expected, especially considering it is hardly selling for $199 anywhere.
 
I'm not about to spit on working, available GPUs, but I think AMD cut just a few too many corners here - and they still didn't get it down to the sub-75W level of power draw needed for powering cards directly through the PCIe slot.

Wow, that was a little worse than I expected, especially considering it is hardly selling for $199 anywhere.

Yeah, I think in "Days of Yore" this would have been the bottom tier GPU and gone in the $100-$120 budget range. But today - yeah, anything is better than nothing and the only real competition is whatever you can find scalped on Ebay.
 
It is nice to see that you tested using FSR where the option was available. This is very useful, keeping in mind it is similar to the soon-to-be-released RSR.

Do you plan on separate testing for PCIe 3 vs. PCIe 4 (the idea being to see how much playable settings get reduced on PCIe 3 vs. PCIe 4)?
 
Do you plan on separate testing for PCIe 3 vs. PCIe 4 (the idea being to see how much playable settings get reduced on PCIe 3 vs. PCIe 4)?
There were the 3DMark comparisons on page 4, which go through a couple of different PCIe scenarios, but they only look at changes in 3DMark. So there is some data.

I'll give you that it isn't the same thing as gameplay. It might be neat to see how a couple of titles fare, to see whether there are major changes outside of a canned benchmark designed to exacerbate differences; I don't know if it warrants going through the entire suite of tests again, though.
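For anyone wondering where that PCIe gap comes from, quick back-of-the-envelope arithmetic shows it. The sketch below uses the spec per-lane rates and 128b/130b encoding overhead; the x4 lane count is the 6500 XT's actual link width, and these are rough one-way peak figures, not measured numbers:

```python
# Rough one-way PCIe link bandwidth for gen 3 and gen 4 links.
# The RX 6500 XT only wires up four lanes, which is why the PCIe
# generation matters so much more here than on x16 cards.
def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Approximate one-way bandwidth in GB/s (128b/130b encoding)."""
    rate_gt = {3: 8.0, 4: 16.0}[gen]          # raw GT/s per lane
    return rate_gt * (128 / 130) / 8 * lanes  # bytes/s after encoding overhead

print(pcie_bandwidth_gbs(3, 4))  # ~3.94 GB/s on a PCIe 3.0 board
print(pcie_bandwidth_gbs(4, 4))  # ~7.88 GB/s on a PCIe 4.0 board
```

So dropping to a PCIe 3.0 board literally halves the card's link bandwidth, which matters most when the 4 GB VRAM pool overflows and the card starts spilling over the bus.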
 
There were the 3DMark comparisons on page 4, which go through a couple of different PCIe scenarios, but they only look at changes in 3DMark. So there is some data.

I'll give you that it isn't the same thing as gameplay. It might be neat to see how a couple of titles fare, to see whether there are major changes outside of a canned benchmark designed to exacerbate differences; I don't know if it warrants going through the entire suite of tests again, though.
Agreed... I think we can draw a relative conclusion from Brent's data as to how gaming will be. Don't expect to have the settings tuned up too high.
 
Gamers Nexus was also similarly displeased. In a normal world, this would be a $100-140 card. 1080p gaming for the masses. This thing selling at retail for $250-300 should be a war crime.
 
Very disappointing overall on the whole combination. At $200 MSRP, if it is even readily available at that price, needing 1080p medium-to-low settings to be usable in current-generation games does not look like a card that will stand the test of time in the end. For me, this is the worst GPU of this generation, and it makes even Nvidia look good, which I thought would be impossible for AMD to do (I was wrong). I do like the review overall; well done exposing the usability of this card with lower settings, the FSR options, and using ReBAR, which is readily available on AMD and Intel systems. On eBay one can buy a used AMD Fury for less than $150 and get better performance, albeit at a higher power level and heat, unless you're lucky and nab a Nano for less than $150, which are roughly 175 W cards.
 
It's pretty awful as far as I am concerned. It's a card that's basically only about as fast as a $200 card from five years ago and gimped in a way that will make it age far worse. It's already being scalped for $500+ on eBay. Meanwhile, on eBay there are cheaper alternatives on the used market that will provide a vastly superior gaming experience. Even at $200 I think it's a non-starter given it's essentially a waste of engineering time and effort that does nothing to advance the already stagnant segment.

Frankly, I see absolutely no reason for this GPU to exist in its current form.

That's my $0.02 anyway.
 
This I find amusing.

I think this is a no-win situation for AMD really.

Had they put more than 4G of VRAM on here, it would just be mining fodder - having only 4G is the only chance it had of staying remotely available. It's not like the "hash limiter" nVidia has out there does anything to deter miners or make the cards any more available.

I rail against nVidia for their hash limiter being just a paper shield, and for putting out old video cards again. That doesn't really fix anything. And I'll say the same thing here - as Dan_D says, this isn't any better than just re-releasing an RX 580, cutting the VRAM capacity, but leaving the price the same. But availability would be a step in the right direction.

I don't know what the right answer is to be truthful. But I've seen an awful lot of what I'm pretty sure are the wrong answers.
 
Last edited:
Ethereum is memory-bandwidth sensitive; the 6500 XT's measly 64-bit memory bus probably makes it worthless for mining even with 8 GB of onboard memory. The 6500 XT, I would say, has no reason to exist - it does everything poorly.
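To put a rough number on that bandwidth point: the sketch below assumes the 6500 XT's advertised 18 Gbps GDDR6 rate on its 64-bit bus, with a GDDR5 RX 580 shown for contrast. These are back-of-the-envelope peak figures, not official measurements:

```python
# Peak memory bandwidth = (bus width in bytes) x (per-pin data rate).
# Data rates below are the cards' advertised effective rates.
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

rx6500xt = memory_bandwidth_gbs(64, 18.0)   # 144.0 GB/s (64-bit GDDR6)
rx580    = memory_bandwidth_gbs(256, 8.0)   # 256.0 GB/s (256-bit GDDR5)
print(rx6500xt, rx580)
```

So even a five-year-old RX 580 has nearly twice the raw memory bandwidth, which is exactly what a bandwidth-bound workload like Ethereum mining cares about.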
 
So... if you consider the current street prices (and forget history) for all currently produced GPUs, the 6500 XT family is priced for its performance right where it should be, assuming that street price ends up in the $200-300 range.

Going by a browse of sold eBay items, RTX 3060s are selling for $700-800, RX 6600 XTs are selling for about $600, and the 6600 non-XT are selling for $600. Of course, the RTX 3050's street pricing is TBD. Therefore, in theory, if the street price settles even at $300-350, you're still paying HALF what you would for the next step up, which gives you a choice: am I OK with low-to-medium 1080p settings in current AAA titles and high settings for older/less graphically intensive games? Or should I spend twice as much for a better 1080p experience?

If you go by historic prices and market placement, this is a low $100 MSRP card (and I could go on and on about where all of these cards _should_ be priced in the current reality). Unfortunately, this is the reality we've been dealing with for a couple of years and I don't see that changing anytime soon unless maybe the Eth price keeps cratering and stays cratered like it has been this week.
 
And they are nowhere to be found, fail of all fails. It is the cheapest, most stripped-down card ever produced for this generation, and AMD and its AIBs cannot make them in sufficient numbers for the most desperate of PC gamers, strapped for some kind of solution. AMD cannot manufacture, or manage to get manufactured, their products -> pretty big red flag. I hope Intel with their actual manufacturing capability can come through. A few more years of this and I see the killing of PC gaming, PCVR, etc. A shift to services, as in computing/gaming just shifting to cloud-based services, with specialty hardware like game consoles and more capable TVs making PCs much less relevant. Where your processing needs are fully fulfilled and controlled by big tech companies, as long as you go along with the party line and don't get kicked off.
 
I hope Intel with their actual manufacturing capability can come through.
They're stealing it from AMD at TSMC :ROFLMAO:

Still, Intel is probably the only party that can do anything about prices - assuming they bring real availability, they're probably going to want to buy market share by sacrificing margins for their first few 'enthusiast' generation releases, and that's going to put downward pressure on Nvidia and AMD.

Supposing Intel does that, I don't expect them to continue forever as they do very much like their margins, but they may very well help reset prices somewhat.
 