- Jul 2, 2019
For us who care:
Sharp text and pics mean more to me than framerates, and IPS 4K delivers on that. I use the computer more for research (web browsing) and spreadsheets than for gaming, and what gaming I do tends to be older (MMO/ARPG) games that aren't exactly taxing on any system.
I have an RTX 2080 Super now in my home system, and I had to dial my settings down from max to 5 on the slider. I had FPS in the 80s when soloing, but in raids it drops to 30-40 FPS... I'm wondering if it's just the game being poorly optimized in general. These values are also with RTX off (no ray tracing).
I'll be honest, Brain_B, I don't understand you at all. At 4K, you want the fastest card money can buy. For the life of me, I can't see why anyone would step up to 4K if they weren't ready to spend a grand (or more) on video cards from time to time. High-resolution gaming has always demanded two cards in SLI, until recently, when that stopped being an option. The RTX 2080 Ti was never an amazing 4K gaming card, but it was the best we had until the 30 series launched. The RTX 3080 is a great budget option for 4K, even if it isn't ideal. (It really should have more vRAM.)
Really, the waiting game makes no sense unless you are within a few weeks to months (less than six) of a major architecture's release. The RTX 3080, 3090, 6800 XT, and 6900 XT are all "real 4K" capable cards. As graphics cards advance, so do games. Even if there aren't titles you want right now that support ray tracing or DLSS 2.0, that's not to say there won't be during the service life of the card. And if you keep a card 2-3 years, going as big as you can will get you the best longevity out of it.
Really, a GTX 980 is pretty awful for 4K and always was. The Maxwell architecture actually did SLI pretty well, and even a pair of Maxwell Titan X's or GTX 980 Ti's wasn't exactly good at 4K back then. A 3070 is at least roughly an RTX 2080 Ti equivalent, so there is that. Again, do what you want, but 4K isn't and never was the realm of budget-conscious gaming.

I've been at 4K for over four years. Before that, I was at 7680x1600 for several years, which is even more demanding than 4K. Pretty much since the first 2560x1600 monitors came out, I've been doing high-resolution gaming. I had to buy GPUs in pairs to do that until the RTX 2080 Ti. It wasn't that the RTX 2080 Ti was good enough to stand on its own, but rather that my GTX 1080 Ti's rarely worked in SLI and the RTX 2080 Ti's didn't either. They were supported in so few titles that going with the most powerful (and newest) single GPU was the only way forward.
On a budget, the RTX 3070 isn't too bad; it's roughly equivalent to an RTX 2080 Ti. It will struggle to hold 60 FPS in some titles, but it can easily do it in others. Turn one or two settings down in modern games and you'll be good to go. Even Cyberpunk 2077 is playable with ray tracing on an RTX 2080 Ti at 4K using DLSS 2.0, albeit with DLSS set to Performance or Balanced. The RTX 3080 and AMD's 6800 XT/6900 XT are better choices by a mile, but they are considerably more expensive. The RTX 3090 is a relatively poor value compared to the other cards mentioned, but it's certainly the best 4K card on the market, bar none. I considered the RTX 3090 a bargain, not because of what else is out there but because of what I'm used to paying for GPUs. I've been paying $1,500+ for years buying two cards, and I've paid as much as $3,000 for a pair of Titan X's in SLI. By that metric, an RTX 3090 is a no-brainer for the best 4K gaming available.