AMD RDNA2 Announcement!

I stumbled onto that as it first kicked off and I have to admit, I'm really impressed. A 3080 equivalent at $650 and a 3090 equivalent at $1,000? The $580 for the 6800 seemed a little high, and while according to their benchmarks it does look to be a little faster than the 3070, I still think $580 is a little steep.

Either way, these products look really impressive. No wonder NVIDIA had to rush out their 3000 series launch; it looks like AMD might be on top this generation.

Gotta hand it to Lisa Su. She took over a company that was headed for rock bottom, one most people had figured for bankruptcy or outright collapse, and in six years as CEO she's not only turned it around but looks to have put it on top of the CPU AND the GPU world.
 
The real question for me: is @Dan_D going to continue his hunt for an RTX 3090, or switch to an RX 6900 XT?
 
****, now we won't get an RTX 3070 price drop :mad:

But for an extra 80 bucks, the 6800 seems like a much better choice for 4K gaming.
 
I'm still in the NVIDIA camp because I paid too much for this G-Sync monitor, and it really is a superb monitor, so I don't want to get rid of it. Hopefully NVIDIA adjusts its prices a little to get more in line with AMD's cards so it'll be less painful for me to buy one, but I ain't holding my breath; NVIDIA seems to never lower its prices.
 
While we still have to wait for actual benchmarks, it's quite impressive.

AMD took its best shot ever and it seems like it delivered.

I'm intrigued by the super resolution technology; could it be a DLSS response?

I'm still targeting an RTX 3070 for Christmas; maybe by then we will have NVIDIA's response.
Gotta say, it's been quite a while since NVIDIA had to play catch-up.
 
I'm gonna hold off on all the excitement over this release announcement until I see Brent get his hands on some product, test it, and see EXACTLY what these things can or can't do.

If AMD nailed this release, and product is actually available, NVIDIA is in trouble.
 
BTW, unlike NVIDIA, AMD has touted its cards' overclocking capabilities, so expect the 6800 to reach past 6800 XT clocks.

How things change. I would say AMD pulled a Pascal while NVIDIA pulled a Vega VII. :LOL: :LOL: :D:D
 
Can't watch the videos from work and planned on that after work, but from everyone's discussion it sounds pretty promising.
 
The real question for me: is @Dan_D going to continue his hunt for an RTX 3090, or switch to an RX 6900 XT?

What a good-looking question.

It's actually a difficult question for me to answer. First off, AMD's claims must be verified. While AMD has been very honest on the CPU side, it hasn't built up quite as much credibility on the GPU side. That said, the honesty didn't start until Lisa Su took over, so it may well extend to the GPU side too. Even if we take AMD's numbers at face value, they still cover only a small sampling of games, likely the ones that show AMD in the best light. They don't cover titles I play, such as Ghost Recon Breakpoint or Destiny 2, or titles I'm likely to play in the near future, such as Cyberpunk 2077.

That's the first thing. Secondly, I'm in a predicament: I currently run a hardware-based G-Sync monitor, specifically an Alienware AW3418DW. I don't believe it can do FreeSync at all, at least not officially. That leaves me having to buy a new monitor if I go AMD, and an equal or better monitor that can do FreeSync is going to be fairly expensive. I won't go with a smaller display, though I'm not hung up on ultra-wide displays. However, if AMD's numbers are correct, or even if the RTX 3090 wins by a small percentage over the 6900 XT, I'd still be inclined to go with AMD due to the price difference.

Now, I had planned to move to a 4K 120Hz monitor anyway. So far I haven't liked some of my options, but even if I do find something I want, actually getting it will depend on several factors. First and foremost, 2020 has been a hellish piece of **** of a year, and my house has been the house from The Money Pit. (A film from the 1980s that some of you should remember.) I have some expenses coming up, like possible foundation work. I will grab a new video card for sure, but whether I can do a monitor, CPU, and video card is a long shot, and it's looking less likely every day.

I have to look at my total upgrade costs, and that math makes the RTX 3090, even at full price, the cheaper overall option. It's $500 more than the Radeon 6900 XT, but the monitor upgrade the AMD route would likely require would easily put me a couple hundred dollars above that. NVIDIA could also lower its prices on the 3090, though in my opinion that's doubtful and not what I think will actually happen.

Speculation:

I do not often engage in speculation, and I've gone most of my career without commenting much on rumors beyond debating how interesting or plausible I think they might be. I think NVIDIA was well aware of what AMD had up its sleeve and always had a response planned. In my experience, these companies are usually aware of what the others are doing and are rarely flat-out blindsided. I think NVIDIA is definitely going to respond to AMD by releasing a 3080 Ti that will be as fast as or faster than the 3090. I suspect that's probably what the RTX 3080 20GB was always intended to be in the first place. The price will likely be slightly higher than the 6900 XT's, or the same, to force the 6900 XT to come down.

If that were to happen, I'd probably go that route, but timing would be everything. Unless such a card absolutely smashes the 3090, I'd probably still opt for the 3090, as I really want to make my purchase before the end of the year.
 
I'm intrigued by the super resolution technology; could it be a DLSS response?
Super resolution has been in their drivers for a while... I doubt they changed how it functions and kept the same name.
 
I think NVIDIA is definitely going to respond to AMD by releasing a 3080 Ti which will be as fast or faster than the 3090.
And how would an RTX 3080 Ti be faster than an RTX 3090? If a response to the 6900 XT is to be released, it would have to be based on a fuller GA102, but the 3090 already uses a nearly fully enabled GA102, so there's almost nowhere to go from there. The only option would be going 7nm and hoping for higher clocks, but that won't happen overnight.

BTW, where are those rumored magic drivers that would bring gaming performance in line with compute performance?
 
BTW, unlike NVIDIA, AMD has touted its cards' overclocking capabilities, so expect the 6800 to reach past 6800 XT clocks.

How things change. I would say AMD pulled a Pascal while NVIDIA pulled a Vega VII. :LOL: :LOL: :D:D
It does kind of seem like that, if only because the AMD GPUs are touted to clock well above the usual 1.5GHz to 2.0GHz range that GPUs have run at lately. Ampere runs in that range and boosts to the top end of it with ease.

Of course, when it comes to overclocking, dynamic power scales roughly with frequency times voltage squared, so power draw increases dramatically, which may put a damper on the enterprise for many users, especially if the GPUs are already pushing into their 'inefficient' zones at stock / stock boost. Remember that it's not just power draw but also localized heat and noise generation, and that cooling on AMD cards (especially the stock ones!) is generally pretty terrible, so any overclocking 'advantage' could be pretty difficult to leverage.
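To put rough numbers on why overclocking power climbs so fast, here's a minimal sketch using the textbook CMOS dynamic-power approximation P ≈ C·V²·f. The figures are hypothetical, not measured RDNA2 or Ampere behavior:

```python
def relative_dynamic_power(f_base, v_base, f_oc, v_oc):
    """Estimate the power ratio between an overclocked and a stock
    operating point using the CMOS dynamic-power approximation
    P ~ C * V^2 * f (the capacitance term C cancels in the ratio)."""
    return (f_oc / f_base) * (v_oc / v_base) ** 2

# Hypothetical example: a 10% clock bump (2.0 -> 2.2 GHz) that needs
# an 8% voltage bump (1.00 -> 1.08 V) costs roughly 28% more power.
print(relative_dynamic_power(2.0, 1.00, 2.2, 1.08))
```

The squared voltage term is why a modest clock gain can push a card well past its stock power and thermal budget.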

I'm actually about to move my desktop build to a larger case while thinking out loud to myself, 'breathe, and shut the hell up' :)
 