AMD Radeon RX 6700 XT Video Card Review

My local Microcenter (Dallas) has the Sapphire for $479

I don't think this would be an upgrade from my 1080ti (more like a side-grade) so I am going to pass.
Yeah, those 1080 Ti's have aged pretty well. They were the first x80 Ti generation I ever dug deep to get after years of SLI, and I'll always aim for that same tier for upgrades; just about anything else ends up being a near-sideways step. I still remember reading how people loved their 780 and 980 Ti's, and now I know why. I've got mine mining these days, but occasionally I'll stop it, bench things at 1440p, and smile. Such a great card.
 
Well, I did see a few two-fan versions running for the $479. The triple-fan versions all seem to be above that price, and oddly enough so is the Gaming X, but then the Gaming X is a bit of MSI's premium side of things.
I can't see how a third fan adds almost $300 to the cost of a card! Baked-in goodness adding approximately $100 over a standard card I can swallow, but $300 over?? Nope... I find it ridiculous...
 
I can't see how a third fan adds almost $300 to the cost of a card! Baked-in goodness adding approximately $100 over a standard card I can swallow, but $300 over?? Nope... I find it ridiculous...
I agree. I was just noting that it seems to be the main dividing line for the prices. Sometimes, though, that third fan comes with a custom PCB and extra power delivery, but I can't imagine this card ever needing such a thing. A 6800 or 6900, sure, to push the envelope, but at this tier, no way.
 
Yeah, those 1080 Ti's have aged pretty well. They were the first x80 Ti generation I ever dug deep to get after years of SLI, and I'll always aim for that same tier for upgrades; just about anything else ends up being a near-sideways step. I still remember reading how people loved their 780 and 980 Ti's, and now I know why. I've got mine mining these days, but occasionally I'll stop it, bench things at 1440p, and smile. Such a great card.
I'd felt a little sorry that I grabbed a 970 before the 980Ti hit. I grabbed a second 970 instead, which at the time worked pretty well, but man... I was ready to just run one card again.

When the 1080Ti hit, while I wasn't fond of the price, I came upon a watercooled version selling for the base card's MSRP. Literally US$699 for a version with an AIO installed.

With all of the coin mess going on at that time, I grabbed one and didn't look back!


Today I'd like to go above 2560x1440 and have tossed around various alternatives, from the 21:9 megawides to the 48" LG OLED, but nothing seems to fit 'quite right'. On top of that, no GPU worth replacing the 1080Ti with is actually available at a price that doesn't also include an at-home vivisection :D
 
I'd felt a little sorry that I grabbed a 970 before the 980Ti hit. I grabbed a second 970 instead, which at the time worked pretty well, but man... I was ready to just run one card again.
I jumped on the 980 before the Ti hit. I don't feel bad; I was coming up from a launch HD 6970, so it was a needed upgrade, and compared to everything that came before it, Maxwell looked great. But yeah, Pascal made it look lackluster. I'm an every-other-generation upgrader (or sometimes longer), so I don't exactly feel regret for skipping Pascal. Ampere was lackluster for the price to me, and AMD dropped off the radar entirely last generation.
So yeah, now I'm stuck. I don't exactly feel regret for not having upgraded before, but yeah... hindsight and all, I definitely would have jumped on a 1080 or a Turing card when they were available.
 
I jumped on the 980 before the Ti hit. I don't feel bad; I was coming up from a launch HD 6970, so it was a needed upgrade, and compared to everything that came before it, Maxwell looked great. But yeah, Pascal made it look lackluster. I'm an every-other-generation upgrader (or sometimes longer), so I don't exactly feel regret for skipping Pascal. Ampere was lackluster for the price to me, and AMD dropped off the radar entirely last generation.
So yeah, now I'm stuck. I don't exactly feel regret for not having upgraded before, but yeah... hindsight and all, I definitely would have jumped on a 1080 or a Turing card when they were available.
If it makes you feel better, I managed to pick up a used 1660Ti through a deal and had intended to send it to my younger brother, but since I've been testing it in my desktop and just haven't had the time to yank it and ship it, I'm actually thinking about sending him my 1080Ti instead.

Main reason being that when I do get around to replacing the 1080Ti, it being a power-hungry unit with an AIO and dated technology (older video outputs and an old NVENC transcoding block), I honestly can't imagine what I'd put it in. The 1660Ti, however, I can, and for the one game I have time to care about right now (BF4), I don't even have to dial the settings halfway back to potato to get >100 FPS at 1440p.

I'm literally flabbergasted that right now I feel I can even live with that. It should feel wrong, but given that there's no 'up' to go to, I figure I can wait out this period of market silliness.
 
Enjoyed the review. As for the best card for the price, MSRP means nothing today, so if one has options it could be any of the cards tested. If a 3070 is available and cheaper than a 6700 XT that is also available, then of course the 3070. If the 3060 Ti is costing more than the 6700 XT...

It is correct that AMD's RT architecture will probably need code tuned specifically for it. For example, DXR 1.1 allows multiple shader/compute stages to be combined into a mega shader; for AMD's Infinity Cache that would be very effective, while for Nvidia it might not make much of a difference. Once RDNA2-optimized RT code is common, comparisons should better reflect that feature's performance.
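
Just to illustrate the API side (my own sketch, not from the review): on PC, an engine can probe whether a device supports DXR 1.1, and thus inline ray queries, before taking that mega-shader path. The helper name and structure here are hypothetical; the feature-check call itself is standard D3D12:

```cpp
#include <windows.h>
#include <d3d12.h>

// Hypothetical helper: returns true if the device supports DXR 1.1.
// Tier 1.1 adds inline ray queries (RayQuery in HLSL), which let one
// big compute or pixel shader trace rays without separate hit/miss
// shader tables; that is the "mega shader" approach mentioned above.
// Device creation is omitted; 'device' is assumed valid.
bool SupportsInlineRayTracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1;
}
```

Whether a renderer then actually batches its RT work into one fat shader is the per-architecture tuning question; the check only tells you the path exists.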

As for AMD FSR vs. DLSS: AMD has not delivered yet, so the quality/performance trade-offs between the two aren't known. One cannot judge well whether AMD's method will be more or less useful, or whether it will even become available and be used in games. With Nvidia, DLSS is here and can work well, but I've had issues with it in every title except COD; some or most just don't work well for me. The number of titles using DLSS is increasing more rapidly now.

I think AMD did a good job on the card; as for pricing, it will be the real street price that makes this card a good buy or not. Which also means availability and an honest, decent supply of GPUs to support the market.
 
Gonna pick on you a bit, but not because you're wrong :)

It is correct that AMD's RT architecture will probably need code tuned specifically for it. For example, DXR 1.1 allows multiple shader/compute stages to be combined into a mega shader; for AMD's Infinity Cache that would be very effective, while for Nvidia it might not make much of a difference. Once RDNA2-optimized RT code is common, comparisons should better reflect that feature's performance.

This is true, but also painfully so: AMD puts out good GPU hardware, many times even beastly and sometimes clearly superior, but almost always before software can take advantage of it. That isn't to say they don't put out GPUs that are the best option in certain price brackets or for certain gaming and compute workloads; rather, there's almost always a lag between the hardware release and the ability to fully utilize it.

I should qualify the above a little bit, too. First and most important, I don't want to see AMD stop innovating; quite the contrary. I believe it's important to highlight where they innovate, full stop. Not every innovation bears fruit, and that's only loosely related to the efficacy of the innovation itself; in technology there are so many related variables, ranging from the basic engineering needed to exploit an innovation all the way up to marketing and beyond to politics! We can only fairly judge innovators like AMD, and their innovations, for the part that they're actually responsible for.

With that out of the way, the 6700 XT must be judged as it performs at release, not just so that it can be compared to its peers but also as a baseline for future performance gains! If we keep in mind that current markets are volatile, whether a buyer finds the 6700 XT to check the most boxes for their use case is going to fluctuate day to day. We can't expect reviewers to nail that down, but rather to simply provide the most transparent rundown that they can.

Now, with respect to RT specifically: while AMD absolutely deserves praise for producing effective RT hardware, they're simply not the best choice where RT performance is concerned. That can change, as I noted above, but right now AMD presents a better solution for non-RT use cases than for RT use cases.

As for AMD FSR vs. DLSS: AMD has not delivered yet, so the quality/performance trade-offs between the two aren't known. One cannot judge well whether AMD's method will be more or less useful, or whether it will even become available and be used in games. With Nvidia, DLSS is here and can work well, but I've had issues with it in every title except COD; some or most just don't work well for me. The number of titles using DLSS is increasing more rapidly now.

I'm critical of DLSS, not specifically of the technology Nvidia has developed, nor of the functional equivalent AMD claims to be developing in response, but of the basic premise of the technology itself, in that it purports to use 'learned' techniques to produce more detail from less detail. Given that you've personally had issues, I'm surprised there aren't more examples of undesirable artifacts.
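
To put rough numbers on that premise (my own sketch; the per-axis scale factors are the commonly cited ones for DLSS 2.x quality modes, not an official spec), here's how little of the output image actually gets rendered at 1440p before the network fills in the rest:

```cpp
#include <cstdio>

int main()
{
    const int outW = 2560, outH = 1440;  // target (output) resolution

    // Commonly cited per-axis render-scale factors for DLSS 2.x modes.
    const struct { const char* mode; double scale; } modes[] = {
        { "Quality",     2.0 / 3.0 },
        { "Balanced",    0.58      },
        { "Performance", 0.5       },
    };

    for (const auto& m : modes) {
        const int inW = static_cast<int>(outW * m.scale);
        const int inH = static_cast<int>(outH * m.scale);
        const double pct = 100.0 * (inW * inH) / (double)(outW * outH);
        // Everything beyond 'pct' is inferred by the upscaler, which is
        // exactly the "more detail from less detail" premise above.
        printf("%-12s renders %dx%d (%.0f%% of output pixels)\n",
               m.mode, inW, inH, pct);
    }
    return 0;
}
```

At Performance mode that's one rendered pixel for every four displayed; the other three are synthesized, which is why I keep expecting more artifact reports than we actually seem to hear.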

On the one hand, I can understand a bit of 'honeymoon syndrome' where users are elated simply to have games running better than they otherwise would, and that is a feeling I share. On the other hand, the photographer in me has seen this approach to producing detail before, and I'm waiting for complaints from professional gaming communities, for example, where DLSS and similar technologies either produce distracting detail that isn't really there or, worse, drop detail that should be; in both cases that represents a competitive disadvantage one way or the other!

Outside of those concerns, though, it's also fair to say that DLSS is established and broadly works to its marketed purpose, while AMD's competing technology is not only unestablished but, being so, can't yet be rated as working or not. And while AMD does have Nvidia's prior work on getting DLSS to an effective state to reference, having witnessed the evolution of DLSS ourselves, we know AMD is certainly going to have to put in a lot of effort to catch up!
 