AMD Radeon RX 6500 XT Could Be a “Disaster in the Making” for PCIe 3.0 Systems Due to Memory and Bandwidth Limitations

Tsing

Image: AMD



TechSpot has shared new benchmarks that suggest the AMD Radeon RX 6500 XT could be a disappointment for gamers who will be running the graphics card in PCIe 3.0 mode.



While third-party benchmarks for the Radeon RX 6500 XT will not be shared until later this week, TechSpot has teased the potential performance of AMD's new $200 budget option by benchmarking its predecessor, the Radeon RX 5500 XT, under various PCIe bandwidth configurations. Assuming the cards are truly as similar as TechSpot believes, Radeon RX 6500 XT users may see performance decreases of as much as 43% (in 1% minimum FPS) in select games such as Shadow of the Tomb Raider when the card is run in PCIe 3.0 mode (x4: 4 GB/s, versus 8 GB/s in PCIe 4.0 mode).



The benchmarks suggest that AMD could have avoided this problem partially by matching the memory capacity of the Radeon RX 6500 XT with that of the...

Continue reading...
 
This sounds like a driver issue more so than a real bandwidth issue. Seems like there was something similar earlier with an AMD card, and it ended up being fixed.
 
So what's the theory here, that it needs to load textures from RAM mid-game, and this is what is resulting in the 1% FPS drops?

Sounds possible I guess, but it looks like he is only running it in x8 mode.

Why would you ever do that? If - as he concludes - x8 Gen 4 is fine, x16 Gen 3 should be just as fine.

No one should be running a GPU below x16.

That, and I suspect this might be a corner case. For the overwhelming majority of titles out there, 4 GB should be more than enough for 1080p for now, and by the time it isn't, this GPU probably won't be fast enough to run those titles anyway.

It almost looks like he is intentionally sabotaging the card by running it in x8 mode in order to try to make a point.
 
The rumor is that the 6500 XT is PCIe 4.0 x4, so at maximum it has the same bandwidth as PCIe 3.0 x8. Worst-case scenario, someone puts it into a PCIe 3.0 slot.
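For rough numbers, here's a quick back-of-the-envelope sketch (nominal per-lane rates with 128b/130b encoding, ignoring protocol overhead, so real-world throughput is a bit lower):

```python
# Approximate usable PCIe link bandwidth: nominal per-lane transfer rate
# (GT/s) scaled by 128b/130b encoding efficiency, converted to bytes.
# Packet/protocol overhead is ignored, so treat these as upper bounds.
GT_PER_SEC = {3: 8.0, 4: 16.0}   # transfer rate per lane, GT/s
ENCODING = 128 / 130             # 128b/130b line encoding (Gen 3 and Gen 4)

def link_bandwidth_gbs(gen: int, lanes: int) -> float:
    return GT_PER_SEC[gen] * ENCODING * lanes / 8  # GB/s

for gen, lanes in [(3, 4), (4, 4), (3, 8), (3, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{link_bandwidth_gbs(gen, lanes):.1f} GB/s")
```

That's where the usual ~4 GB/s vs. ~8 GB/s figures come from: Gen 4 x4 lands in the same ballpark as Gen 3 x8, and Gen 3 x4 is half of that.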
 
The rumor is that the 6500 XT is PCIe 4.0 x4, so at maximum it has the same bandwidth as PCIe 3.0 x8. Worst-case scenario, someone puts it into a PCIe 3.0 slot.
TechPowerUp is saying PCIe 4.0 x8 - that may not be accurate, though.

 
My video card might get more FPS on a wider bus running at a faster clock rate. And if I run it on a narrower bus at a lower clock rate, the performance will suffer.

I mean, the guy isn't wrong, as long as the larger, faster bus is supported by the card and the system. By and large these cards may be refreshing systems with only PCIe 3.x slots, so this **** won't even be theoretical.

Man, my card would be faster if it were on a better bus. Yeah, and my engine would be faster if my intake, fuel delivery, and exhaust were better.

I just don't get it. Maybe some real-world comparisons of motherboards with like CPUs in PCIe 4.0 and PCIe 3.0 modes would show whether this makes a difference on lower-tier cards.
 
This is an 'old' card that was benchmarked.
So unless AMD fixed it and then broke it again after the 5500 XT was released, I can't see this being a driver issue.
The results for the 5500 XT are similar to those at launch.
 

According to this, it only uses 4 lanes. When you stick this card in a PCIe 3.0 slot, it gets crippled. In many cases the RX 580 is coming out ahead. I think the RX 580 also has more VRAM. There should never be situations where a low-end card from today is beaten by a low-end card from 5 years ago (and let's not forget that the RX 580 is a refresh of the RX 480). Good gawd. Well, at least it beats the GTX 970 (barely), a mid-range card from almost 8 years ago. Not in GTA V, though.

The 1440p results just make those older cards like the RX 580 and GTX 970 look even better next to the 6500 XT, sheesh. I don't know why you would have cards like these and try to run games at 1440p, but it's interesting to see for testing purposes.

Well, at least the 6500 XT used the least electricity.

Who decided to let this card come to market like this? What a waste of a product. I don't understand how this card was released for sale. This doesn't seem like a real product. More like an April Fool's joke.
 
The title is written backwards.
It's not a disaster for PCIe 3.0 systems - they continue to work fine as they have for years.
It's that the 6500 XT is a disaster of a card.
 
The 1440p results just make those older cards like the RX 580 and GTX 970 look even better next to the 6500 XT, sheesh. I don't know why you would have cards like these and try to run games at 1440p, but it's interesting to see for testing purposes.
I upgraded to a GTX 970 while playing at 1600p... but that was back when they were new! Even got a second for SLI, back when that was a thing :)

I did play through The Outer Worlds at 1440p on one of those GTX 970s, and the experience was barely passable, regularly dropping to just below 30 FPS. I think I made it work due to my own stubbornness more than anything else.

Who decided to let this card come to market like this? What a waste of a product. I don't understand how this card was released for sale. This doesn't seem like a real product. More like an April Fool's joke.
So, it's basically a mobile silicon spin that's been dropped into a desktop form-factor. It's entirely stripped down for that purpose, where PCIe 4.0 x4 is apparently the norm.
 
So, it's basically a mobile silicon spin that's been dropped into a desktop form-factor. It's entirely stripped down for that purpose, where PCIe 4.0 x4 is apparently the norm.

Interesting.

In my old Latitude E6540 with an i7-4810MQ and a Radeon HD 8790M, the Radeon got eight lanes, but I guess on newer machines where Gen 4 PCIe is available, they have justified dropping that down to x4.

In a laptop this isn't going to be a problem. The GPU will be soldered to the motherboard, so you won't have people sticking it in a gen 3 board, but in the desktop form factor, this is a risk.
 
What does a 580 cost these days, and where could you buy it? This would be a terrible launch if you could just buy any model of card you wanted. As far as I can tell, the real competition is:

GTX 1650: $319
GTX 1050 Ti: $299
RX 6500 XT: $259
RX 550: $229


Given real-world availability, the 6500 XT isn't... terrible?

In all honesty, the name is the real issue. If they called it the 6400 XT, for example, people probably wouldn't be complaining as much.
 
The title is written backwards.
It's not a disaster for PCIe 3.0 systems - they continue to work fine as they have for years.
It's that the 6500 XT is a disaster of a card.

I think it is a fine card, provided you use it in a gen 4 system.

They named it incorrectly though. It should have been called the 6400 or something lower.

Price/performance in this market is actually fairly decent for a budget card for 1080p use.
 
I upgraded to a GTX 970 while playing at 1600p... but that was back when they were new! Even got a second for SLI, back when that was a thing :)

I did play through The Outer Worlds at 1440p on one of those GTX 970s, and the experience was barely passable, regularly dropping to just below 30 FPS. I think I made it work due to my own stubbornness more than anything else.
I got a GTX 970 at launch, and I used it for 1200p gaming. I didn't upgrade to 1440p until after I got a 1080 Ti 5 years later (and that was only cuz my 1200p monitor died after 9 years).

So, it's basically a mobile silicon spin that's been dropped into a desktop form-factor. It's entirely stripped down for that purpose, where PCIe 4.0 x4 is apparently the norm.
Oh well dang.

In a laptop this isn't going to be a problem. The GPU will be soldered to the motherboard, so you won't have people sticking it in a gen 3 board, but in the desktop form factor, this is a risk.
Indeed.

If they called it the 6400 XT, for example, people probably wouldn't be complaining as much.
They named it incorrectly though. It should have been called the 6400 or something lower.
Yeah, I agree.
 
Would an 8 GB 6500 XT have fewer issues on PCIe 3.0 motherboards, since trips to main memory would decrease due to the increased VRAM capacity?

Sapphire intros Radeon RX 6500 XT with 8GB memory
 

I think it would help, but it really depends on the title and engine.

The cynic in me thinks that if you are traveling across the PCIe bus to grab textures, etc. mid-level, you have already failed, regardless of how many PCIe lanes you have available, as this will always carry a performance penalty.

Ideally everything should be loaded at launch. Having less PCIe bandwidth would result in longer load times, but that is less of a problem.
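Just to put some hypothetical numbers on that last point (purely illustrative; the 2 GB asset figure is made up, and the link rates are the idealized ones from the bandwidth sketch earlier in the thread):

```python
# Idealized time to move a fixed amount of asset data over the PCIe link.
# Assumes a sustained transfer at the approximate link bandwidth with no
# other traffic; real streaming is burstier and slower than this.
ASSET_GB = 2.0  # hypothetical amount of texture/asset data to move

for label, gb_per_s in [("PCIe 3.0 x4", 3.9),
                        ("PCIe 4.0 x4 / 3.0 x8", 7.9),
                        ("PCIe 3.0 x16", 15.8)]:
    print(f"{label}: ~{ASSET_GB / gb_per_s * 1000:.0f} ms to move {ASSET_GB:.0f} GB")
```

Either way you're talking fractions of a second at load time, which is why slower loads are less of a problem than spilling out of VRAM mid-game.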
 
For gaming, it would help a bit - or it should, though we'd want to test it.

Unfortunately, it's still a castrated GPU, which limits uses other than gaming considerably.
 