Welp.
I guess I won't be buying any AMD GPUs for the next few years.
Long historical background follows, but it makes a point in the end, I promise.
__________________________________________________
I've been a serial early adopter of new higher resolutions for almost 20 years now.
In 2004 or 2005 (can't remember exactly) I bought a 1920x1200 screen (Dell 2405FPW, I think?) back when that was still exotic. I quickly found that my poor GeForce 6800 GT could not handle anything at that resolution. The result? I just kind of stopped playing games for 5 years.
Then in 2009 or 2010 (can't remember exactly) I slowly started getting into games again, first only Civ4, but then I started catching up on the titles I had missed. And in 2010 I made the same mistake again. I bought a 30" 2560x1600 Dell U3011 and then proceeded to chase acceptable framerates for the next 3 years.
First I played some older titles using a fanless Radeon HD 5750, then I won the silicon lottery when I bought a GTX 470 that overclocked beyond GTX 480 speeds. I briefly used a GTX 580, but I was using a custom-form-factor SFF case at the time and couldn't get a large enough PSU to make it reliable, so I had to back off and quickly sold it.
Having realized the drawbacks of SFF (at least the custom varieties as they existed at the time) I decided to move back to full-sized desktops. I got an AMD 990FX board and stuck a "wait for the amazing new Bulldozer and drop in the upgrade later" Phenom II 1090T in it.
In this case I started with a single Radeon HD 6970 (the monstrous-for-the-time triple-slot Asus DirectCU II version) and when that wasn't enough, added a second one in Crossfire.
(Note the ultra-rare CoolIT EPIC 180mm AIO I convinced MainGear to sell me. They later decided they could only sell them in complete systems, maybe due to licensing limitations? I don't know. I think I'm the only person who got a standalone one. It was the most AIO capacity you could get at the time, more than the dual-120mm Corsair H100. Between the 180mm fans and AIO and the huge triple-slot GPUs, the sense of scale is really thrown off here.)
Whole system for good measure:
But dual Radeon HD 6970s in Crossfire were also insufficient for 2560x1600. I quickly found that Crossfire was not all that great. Scaling was poor, and minimum framerates were even worse. I theorized that a single faster GPU would be better, so as soon as the Radeon HD 7970 launched in December 2011, I bought one.
And I was right, it was indeed a better experience than the dual 6970s. But it still wasn't fast enough.
I decided to try some custom cooling by mounting a Corsair AIO to the 7970, but I slipped with my screwdriver and killed it by cutting a trace.
I then went without a real GPU for a while (I actually picked up a basic GTX 460 768MB version to hold me over short term) and then jumped on the GTX 680 at launch in 2012.
It was even better than the 7970 had been, but it still just wasn't enough to keep up with the 2560x1600 screen.
I finally was able to get acceptable frame rates at 2560x1600 in 2013 when the 6GB Kepler Titan launched and I bought one at launch.
Of course, that only lasted for a few months, before newer titles challenged even the highly overclocked Titan at 2560x1600.
...and then in 2015 I proved that I never learn my lesson, and once again became a high resolution early adopter when I moved to 4k by getting a 48" Samsung JS9000 TV, and started the process all over again.
I needed the relatively new HDMI 2.0 standard to be able to display 4k at 60Hz without dropping chroma to 4:2:0, so it was time for more GPU upgrades. First one 980 Ti (I knew that wasn't going to be enough, but I could only find one in stock at the time). Then a second 980 Ti in SLI.
Having done multi-GPU (Crossfire) in the past and hated it, I guess I had thought that SLI must be better. It wasn't. I still got disappointing scaling and terrible minimum framerates.
So, when the Pascal Titan X launched a year later, I jumped on it. I also decided to maximize my ability to overclock it by building a custom water loop. I was going to get acceptable framerates at 4k come hell or high water (literally).
And it was a success.
The JS9000 did not have any kind of FreeSync/G-Sync/VRR (as TVs didn't at the time) but I was finally able to v-sync it at 60Hz and never have the framerate drop below 60fps.
At least for a little while.
As new titles came out, it took increasingly extreme measures.
By the time I got around to playing Deus Ex: Mankind Divided in late 2016, I was already using some serious tricks to get smooth framerates.
Ultra settings had me hovering between 38 and 45fps.
I created a custom 21:9 ultrawide resolution (3840x1646) and played letterboxed. This boosted framerates significantly over full 4k, but still wasn't enough to play it at 60Hz v-synced with even a little MSAA.
Then I had a lightbulb moment. This is a TV. Here in the U.S. that means 60Hz, but in Europe that means 50Hz. Maybe it has a 50Hz mode? Indeed it did. So I was able to play Deus Ex v-synced at 3840x1646@50Hz.
It wasn't ideal, but it got me through the game.
That title was one of the heaviest I played for a while. I'd use the letterbox + 50Hz trick a few more times over the next couple of years, but I didn't have to do it frequently.
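For the curious, here's the rough arithmetic behind why that combination helped so much. This is just a back-of-envelope Python sketch of my own reasoning (assuming GPU load scales roughly with pixels rendered per second), not anything I actually benchmarked:

# Rough math behind the letterbox + 50Hz trick (back-of-envelope estimate,
# assuming GPU load scales roughly with pixels rendered per second).

full_res = (3840, 2160)       # native 4k
letterbox_res = (3840, 1646)  # the custom ~21:9 resolution I used

full_pixels = full_res[0] * full_res[1]                 # ~8.29 million per frame
letterbox_pixels = letterbox_res[0] * letterbox_res[1]  # ~6.32 million per frame

# Letterboxing alone: roughly 24% fewer pixels per frame.
print(f"letterbox saving: {1 - letterbox_pixels / full_pixels:.0%}")

# Letterboxing plus dropping the v-sync target from 60Hz to 50Hz:
# roughly 36% fewer pixels per second than full 4k at 60Hz.
print(f"letterbox + 50Hz saving: {1 - (letterbox_pixels * 50) / (full_pixels * 60):.0%}")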
Then in 2019 I bought an Asus XG438Q 4k screen that had FreeSync 2, and Nvidia had recently started supporting FreeSync via "G-Sync Compatible", so it worked and I didn't have to bother with v-sync anymore, which made things easier. I still occasionally letterboxed, though.
I had already wanted to upgrade for more performance when the 2080 Ti came out in 2018, but first I was frustrated by my inability to find one in stock, and then by the time I was able to get my hands on one, the whole "RTX Space Invaders" thing was blowing up, so I decided against it. By the time that was blowing over, it felt too late in the product cycle to dump big money on a top-end GPU, so I decided to wait for the next generation.
Then the pandemic hit, the RTX 3000 series hit, and it was scalped to high heaven. I refused to overpay for a GPU, so I passed on the 3090 and 3090 Ti as well.
Eventually I had played through my backlog of old titles, and the old Pascal Titan (by then 5 years old) was really not cutting it for anything newer. I swallowed my pride and went for an overpriced XFX Speedster ZERO RX 6900 XT EKWB edition. The thing was a champ, and with a custom power design and a highly binned XTXH chip I **** near hit some overclocking records on it.
I was finally playing new titles at full resolution again, with one exception: turn on RT and it turned into a slideshow.
So a few months later I cut my losses, and picked up a 4090. Not exactly at launch, but pretty close thereafter, once I was able to get my hands on one.
I'm still occasionally struggling to get adequate framerates at 4k Ultra settings (without enabling bullshit like upscaling and frame generation) and this bugs me.
__________________________________________________
Aaaanyway...
This was just a really long-winded way of saying that I have been struggling to get what I consider acceptable performance out of even the highest-end GPUs since ~2005.
Since AMD hasn't been competitive at the high end for most of that time, with a few notable exceptions when I tried the Crossfire 6970s and the 7970 (~2010-2012) and again with the 6900 XT (2021), this has mostly precluded AMD as an option for me.
If they are opting not to compete at the high end again, I guess that removes AMD as an option for me entirely on the GPU side for the next few years, and it will probably result in even less competition at the high end, and thus even worse pricing for the GPUs I "need".
And that sucks.
I don't suppose Intel will dream up an Arc that challenges a future 5090?