From the intro:
"...but this sort of commenter seems unaware that one person's 4K is not another person's 4K. For example, if you require a minimum of 60 FPS, as in some single-player gaming, for the most part you need not worry about CPU performance, but if you game at 4K and require 144 FPS you almost certainly will want the best CPU you can afford. In either case you can work out a CPU's true capability at a resolution such as 1080p, where you simply can't at 4K. Also, in terms of real-world performance, most gamers who game at 4K don't do so at native resolution; rather, they enable FSR, DLSS, or XeSS using the Performance, Balanced, or Quality preset."
I absolutely know I'm in the single-digit percentage of PC gamers pushing a 4090 to its max on a regular basis for native 4K, but when it can't hold 90-120 FPS (LG C2) or 100-135 FPS (TCL M85), I will use DLSS Quality to get there. I don't believe anyone disagrees that 1080p will expose the ultimate CPU bottleneck, and it's basic math to reason out the GPU bottleneck that happens at 4K. However, since I've been gaming at 4K since my GTX 970 SLI days (with a 780 for PhysX), then GTX 1080 SLI > 2080 Ti > 3090 Ti > 4090 & 4080 SUPER, I have seen how CPUs become bottlenecks as they get older. I don't usually check frametimes, and for decades I judged my system's performance by FPS alone, only to be totally shocked at how much smoother things got when I upgraded my CPU, even at 1080p, while the FPS didn't change much. Going from the 2600K to the 4930K was a night-and-day improvement, then again from the 4930K to the 3700X, and then to the 5800X3D.
I guess one thing that doesn't regularly get covered in many of the 9800X3D reviews is comparison to older generations. I remember seeing the same thing with some of the 7800X3D reviews: plenty of comparisons to the current and previous gen but nothing beyond, which raises the question of how often PC owners are actually going to upgrade. The 5800X3D is just over two and a half years old and still widely discussed, yet it's not included in 9800X3D comparisons, while the 7800X3D is only a year younger. But should folks be upgrading their CPU every year or so because reviewers drop the older gens and focus only on the most recent, which often shows negligible 4K gains over the previous? I don't believe so, but the lack of inclusion in review data almost seems to suggest that if your processor isn't on the list, it's time to upgrade.
I think most who have a 7800X3D, 12900K, or newer have no intention of getting a 9800X3D, while those with older processors (5000 series or older, 10900K or older) are very much at the point of considering it. They're the ones thinking about the performance differences at 4K as they either upgrade to, or simply jump into, 4K gaming.
I also see two sides of this for reviewers. It's pretty much a given that you need at least a 4080 SUPER/7900 XT to get your foot in the door for native 4K gaming at high FPS, and even then you need a very tempered approach if it involves ray tracing. According to survey after survey, the share of users with such systems isn't in double-digit territory, or only rarely just above. You more or less have to have a 4090 or 7900 XTX to truly push that envelope. That being said, how many reviews are using those GPUs? Brent does, thanks to the powers that be, so we know he's giving us solid #s. I've also learned for years from him and
@David_Schroth,
@Dan_D , and Kyle how to overclock my GPUs, so I've been pushing mine to the max without custom loops for a while now (I only got into AIO models in recent years). But again, with all the folks asking about 4K gaming, how many really have the hardware to make the difference?