AMD Announces Radeon RX 6900 XT ($999), Radeon RX 6800 XT ($649), and Radeon RX 6800 ($579)

IMO, the RX 6900 XT should have been labeled the RX 6900, with a price tag of $729 - $799 max, considering that AMD does not have a DLSS challenger, etc.
 
This is just my humble opinion, though.

NVIDIA screwed up the plans for AMD's GPU lineup with its introduction of the RTX 3090.

I remember reading early rumors during the development of the RTX 30 series that the RTX 3060 was supposedly going to be as fast as or faster than the RTX 2080 Ti. Not the RTX 3070.

I think AMD went on that speculation as well, until Jensen threw his curveball.

This is what I think AMD had planned at first:

3080 (3090) vs 6900 (6900XT)
3070 (3080) vs 6800 (6800XT)
3060 (3070) vs 6700 (6800)

And AMD probably thought that NVIDIA would introduce a 3080 Ti/3090 later on in 2021, but not at the initial launch.
 
According to Gamer's Nexus, most if not all current DXR implementations aren't compatible.
They are idiots if they said that; otherwise maybe it was just worded poorly. DXR is DirectX Raytracing... it's a standard interface; anything that is compatible will work without code changes. Will it be optimal? Possibly not, but it should work. That said, the *implementation* from AMD is obviously going to be different from NVIDIA's, because the implementation is in hardware and in the drivers.
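To make that point concrete: an engine doesn't check the GPU vendor, it asks the D3D12 runtime whether ray tracing is exposed at all. A minimal sketch, assuming a `device` already created with D3D12CreateDevice on whatever adapter happens to be installed:

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch: DXR sits behind the standard D3D12 feature-query interface, so the
// same call works on any vendor whose driver implements it.
bool SupportsDxr(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // Any tier >= 1.0 means the DXR entry points are available; how fast they
    // run is down to the vendor's hardware and driver, not the API.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```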
 
It's not even necessarily that there's a game out right now, which would definitely be a motivator, but also that it doesn't make much sense to spend this much on a GPU without good RT. Especially for those of us that keep our GPUs for three or four years.
Yup, the same thing you guys were saying for the 2000 series... buy it for the future. That never happened, so now buy the 3000 series for a future that may or may not happen. Anyway, I don't disagree that RT is the future at some point, but it's still not there yet. I am curious to see how good/bad the RT performance is, but I'm not basing my entire buying decision on it. Honestly, the part of RT that I'm more interested in is speeding up my Blender rendering jobs, which NVIDIA does well at with their RT hardware. I'm hoping AMD support makes its way in at some point to compare, although even without hardware RT, AMD's raw compute power does really well, just not as well ;).

IMO, the RX 6900 XT should have been labeled the RX 6900, with a price tag of $729 - $799 max, considering that AMD does not have a DLSS challenger, etc.
So, it keeps pace with/outruns the 3090, which is "appropriately" priced at $1500, and you think AMD should charge $730? Lol, dude, go back to NV headquarters and ask for a raise, man, that's great. Yeah, not even going to offer this a real response; your bias and/or marketing attempt isn't fooling anyone with half a brain.
 
This is just my humble opinion, though.

NVIDIA screwed up the plans for AMD's GPU lineup with its introduction of the RTX 3090.

I remember reading early rumors during the development of the RTX 30 series that the RTX 3060 was supposedly going to be as fast as or faster than the RTX 2080 Ti. Not the RTX 3070.

I think AMD went on that speculation as well, until Jensen threw his curveball.

This is what I think AMD had planned at first:

3080 (3090) vs 6900 (6900XT)
3070 (3080) vs 6800 (6800XT)
3060 (3070) vs 6700 (6800)

And AMD probably thought that NVIDIA would introduce a 3080 Ti/3090 later on in 2021, but not at the initial launch.
More like AMD screwed up NVIDIA's plans, caused them to rush their release, and caused this cluster of a launch. I don't think NVIDIA was planning on AMD catching up. Your humble opinion is based on nothing. It looks like AMD's cards all lined up pretty well: 6900 XT/3090, 6800 XT/3080... the only one off was the 6800, which beats the 3070 by more than 3-4%, so they ended up pricing that one higher. NVIDIA was caught out that AMD might have a faster card, rushed their release with no stock, and pissed off a bunch of people (most of whom are just whiners who are going to buy the cards anyway). Nothing about this situation feels like AMD's plans got messed up... it feels quite the opposite.
 
What has me concerned is the questionable math; many commentators/reviewers are not addressing the elephant in the room.

60 CUs - $579
72 CUs - $649
80 CUs - $999! For the same VRAM, etc. as the RX 6800 XT!

So, for an extra 12 compute units, only an added $70!!

But, for an extra 8!! compute units, an extra $350?!

Why isn't this price-gouging HIKE being discussed?
I don't know, why are you completely ignoring the fact that NVIDIA is charging $700 for 10% more performance but find it unacceptable that AMD charges $350 for about the same? Just an extra 8 CUs, which means the die has to be 100% perfectly binned to perform at this level. That costs money and makes those chips rarer... a full die is ALWAYS going to carry a premium, whether it's from AMD, NVIDIA, or Intel. Heck, NVIDIA's top-tier 3090 isn't even a full die, because they couldn't get binning good enough. Imagine what they'd have to charge for 100% fully working dies? Would I like to pay less for more... of course, I'm a consumer. Do I think $999 for 3090 performance is unreasonable? Well, given the 3090 is a $1500 card... comparatively, it's a good deal.

It's not an elephant in the room. Why was the 2080 Ti so much more than a 2080? Why is Intel's 10900K more than a 10700K? Why is the 3950X so much more than a 3900X? Full dies and the highest-end/niche market. I'm not sure why this is an elephant in the room when you completely ignore the fact that every company does the same thing, except AMD's prices are lower in this instance.

You can pay AMD $649 or NVIDIA $699 for about the same performance.
You can pay AMD $999 or NVIDIA $1499 for about the same performance.

The only elephant in the room is what NVIDIA is going to do now that they can't claim the performance, power-efficiency, or price crown. Besides the obvious sending out of shills to random forums to post misc. crap?
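For what it's worth, here is the back-of-the-envelope arithmetic both posts are arguing over, using the launch MSRPs and CU counts quoted above (the ~10% 3080-to-3090 performance gap is the figure claimed in the reply, not a benchmark):

```cpp
#include <cstdio>

int main()
{
    // Marginal cost per extra compute unit inside AMD's own stack.
    std::printf("6800    -> 6800 XT: +12 CUs for $%d (~$%.0f per CU)\n",
                649 - 579, (649.0 - 579.0) / 12);             // +$70, ~$6/CU
    std::printf("6800 XT -> 6900 XT: +8 CUs for $%d (~$%.0f per CU)\n",
                999 - 649, (999.0 - 649.0) / 8);              // +$350, ~$44/CU

    // The counterpoint: NVIDIA's own top-bin premium for a claimed ~10% gain.
    std::printf("3080    -> 3090:    +$%d (~%.0f%% more money)\n",
                1499 - 699, (1499.0 - 699.0) / 699.0 * 100);  // +$800, ~114%
    return 0;
}
```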
 
The real problem Nvidia and Intel have is not AMD's products... it's the fact that Lisa Su has turned AMD into a relentless competitor.
Hopefully she is training one or more successors in her management skills.
 
The real problem Nvidia and Intel have is not AMD's products... it's the fact that Lisa Su has turned AMD into a relentless competitor.
Hopefully she is training one or more successors in her management skills.
Without disagreeing, and with no disrespect toward Dr. Su, AMD's success is entirely hitched to TSMC's production capabilities. While TSMC seems to be charging ahead at the moment, note that they've stumbled more than Intel over the years, and if their 7nm node hadn't been 'all that', neither would a single AMD product have been, regardless of AMD management.
 
AMD is introducing the proprietary "Smart Access Memory".
You need an all-AMD GPU, CPU, etc. to get the advantages of this.

Everybody cool with this?
If this were Intel, or Nvidia...

 
AMD is introducing the proprietary "Smart Access Memory".
You need an all-AMD GPU, CPU, etc. to get the advantages of this.

Everybody cool with this?
If this were Intel, or Nvidia...
AMD always likes to have 'that one thing' that supposedly makes their product line unique... that never gets used in the debut products. They've been doing it since they were ATi. I'd just as soon let them have their BS marketing point and move on, because if the technology is actually going to take off, it's going to be called something else, usually be incompatible with the first iteration, and be implemented by other vendors by the time it's actually put to use.
 
AMD is introducing the proprietary "Smart Access Memory".
You need an all-AMD GPU, CPU, etc. to get the advantages of this.

Everybody cool with this?
If this were Intel, or Nvidia...

It's only an issue if they force the technology to stay within AMD's hardware. This is the first iteration of it. You can't really base it off that, as neither Nvidia nor Intel knows anything about it. If the technology proves to be significant and AMD keeps the competitors from using it, then grab the pitchforks.
 
AMD is introducing the proprietary "Smart Access Memory".
You need an all-AMD GPU, CPU, etc. to get the advantages of this.

Everybody cool with this?
If this were Intel, or Nvidia...

I think it's a meaningless marketing bullet point. Apart from that, it's not like nVidia has a shortage of proprietary technology... so yeah.
 
It's only an issue if they force the technology to stay within AMD's hardware. This is the first iteration of it. You can't really base it off that, as neither Nvidia nor Intel knows anything about it. If the technology proves to be significant and AMD keeps the competitors from using it, then grab the pitchforks.

Why? Even if it is proprietary to AMD's architecture, it's not like Nvidia and Intel haven't done the same in the past. G-Sync was, and still is, an Nvidia exclusive. AMD had to make their own iteration with FreeSync.

IMO, AMD would be smart to keep this to itself, especially if it grows into something that offers significant performance gains as the tech advances.
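For anyone wondering what the feature actually does under the hood: SAM lets the CPU address the card's entire VRAM rather than the usual 256 MB window. From an application's point of view (in Vulkan, for example) that shows up as a large device-local memory heap that is also host-visible. A rough detection sketch; the helper name is mine, purely illustrative:

```cpp
#include <vulkan/vulkan.h>
#include <cstdint>
#include <cstdio>

// Sketch: a device-local heap that is also host-visible and far larger than
// the classic 256 MiB BAR window is the usual sign that the full VRAM
// aperture (Smart Access Memory / resizable BAR) is exposed to the CPU.
bool HasCpuVisibleVram(VkPhysicalDevice gpu)
{
    VkPhysicalDeviceMemoryProperties mem{};
    vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

    for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
        const VkMemoryType& type = mem.memoryTypes[i];
        const bool deviceLocal = type.propertyFlags & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT;
        const bool hostVisible = type.propertyFlags & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;
        if (deviceLocal && hostVisible) {
            const VkDeviceSize heapSize = mem.memoryHeaps[type.heapIndex].size;
            if (heapSize > 1024ull * 1024 * 1024) {   // well past the 256 MiB window
                std::printf("CPU-visible VRAM heap: %llu MiB\n",
                            static_cast<unsigned long long>(heapSize / (1024 * 1024)));
                return true;
            }
        }
    }
    return false;
}
```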
 
Here's my thought on the releases (both nVidia and AMD)

I was impressed by Ampere, I won't lie. Coming from Turing, where the entire emphasis was on RT and rasterization got sidelined, I was glad to see them come back and acknowledge that rasterization performance is still important. I do think they probably pushed the TDP envelope a bit too far; 300W+ on a PCIe card is... a lot, but we'll see how it plays out with longevity.

Great to see AMD come back swinging. Even if they don't line up on RT performance, for right now, all of the value is still wrapped up in rasterization performance. RT is still in the not-required eye-candy category for me; like Leroy said, I see it in almost the same light as GPU-accelerated PhysX: it was nice, and it looked great, but it never really evolved into anything past eye candy. Maybe RT eventually will, but it won't happen inside the lifespan of this generation of cards, so I don't see it as a priority for me.

The downside: these are great for people looking to spend $500+. We still haven't gotten replacements for the 5700, or the 2060, or the 1660. I also find the pricing on the 6800 - 6800 XT curious: they left the 3070 a niche to breathe in, and it's still at a fairly attractive price/performance point for nVidia, but they were very aggressive on the upper tiers. In my mind, anything above $500 is still high-end halo territory; even though I could afford it, I'm unwilling to spend that simply because the technology turns over so fast I don't feel like it's worth the investment.

Still need to see real-world reviews. I wouldn't say AMD knocked it out of the park, but I don't think they embarrassed themselves like they have with previous releases (Fury), and they appear to be standing more or less toe to toe with nVidia for the first time in a long time.

Personally, $500 is on the upper end of what I would spend, so the 3070 still looks attractive, if only by process of elimination. I'm more interested to see what trickles down into the $300-$500 range, whenever we may see that. I'm looking to buy a new card before Thanksgiving, so it may just be whatever I can get my hands on... maybe I can snag a good deal on a 2060 or 5700; that would actually be OK, although I haven't really seen any great deals on anything.

I did notice the 3070 inventory didn't exactly last long, but seeing that some people here actually were able to snag a card, maybe there is hope on that front. I think AMD is going to find themselves in a similar boat come Nov 18 if reviews confirm what we are seeing from AMD. And, of course, if AMD/people can get over the old DRIVERS issues/excuses.
 
IMO, the RX 6900 XT should have been labeled the RX 6900, with a price tag of $729 - $799 max, considering that AMD does not have a DLSS challenger, etc.

I was expecting this after nVidia priced the 3080 pretty low at $700

AMD's plan is to reserve the $700 to $900 gap for the custom 6800 XTs from board partners with unlimited power & increased boost/game clocks
 
So nothing under $500 from AMD. Am I the only one who finds that interesting?

I was sort of expecting this.

Remember that the 6800, 6800 XT, & 6900 XT are all derived from the same 80 CU Navi 21 chip.

AMD has a 40 CU chip (to be released next year) with extreme boost speeds of up to 2.5 GHz.

Let's call this the 6700 XTX.
Remember that AMD originally attempted to price the 5700 XTX at $450 in spite of the fact that the chip size is approximately the same as the 470's, which was being discounted for as low as $130.

So the plan is to sell the (12 GB, 40 CU, 2.5 GHz) 6700 XTX at $500 and match/beat the performance of the RTX 3070.
 