This isn't entirely true. I've tested it multiple times, and at higher resolutions CPU performance absolutely impacts your frame rates in games. While averages are largely unaffected, minimum frame rates and even frame times are affected to a degree. As I've stated before, my AMD Threadripper 2920X with a 4.2GHz all-core overclock would drop to 26FPS in Destiny 2 at 3840x2160. In contrast, the Core i9 9900K @ 5.0GHz never dropped below 56FPS. This isn't to say that the 2920X system wasn't playable, but it routinely dropped into the 45FPS realm, which was noticeable in firefights. Conversely, the 9900K @ 5.0GHz's drop to 56FPS was so rare that it only showed up in benchmarks.
This was the same system with the same hardware, with only the motherboard and CPU changed. The 9900K provided a smooth and enjoyable gaming experience while the Threadripper 2920X simply didn't. With PBO on, the 2920X achieved a minimum of 36FPS, which was better, but it didn't improve the overall experience as much as I'd hoped. I was perfectly happy with my Threadripper 2920X outside of gaming, but as a gaming CPU it was lacking for sure. At 1920x1080, I might not have noticed as much. At 4K, and later 3440x1440, I absolutely found the CPU inadequate, as it couldn't deliver the experience I desired.
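For anyone unfamiliar with how those minimum figures differ from averages, here's a rough sketch of how average FPS and "1% low" numbers are typically derived from per-frame render times. The frame-time sample below is invented purely for illustration; it's not from my captures:

```python
# Illustrative only: deriving average FPS and 1% lows from a capture of
# per-frame render times (in milliseconds). Sample data is made up.

frame_times_ms = [7.1, 7.3, 6.9, 7.0, 8.2, 7.4, 25.0, 7.2, 7.1, 30.5]  # hypothetical capture

# Average FPS: total frames divided by total elapsed time.
average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# 1% low: average FPS over the worst 1% of frames (here just the single
# slowest frame, since the sample is tiny).
worst_count = max(1, len(frame_times_ms) // 100)
worst_frames = sorted(frame_times_ms, reverse=True)[:worst_count]
one_percent_low_fps = 1000.0 / (sum(worst_frames) / len(worst_frames))

print(f"Average: {average_fps:.1f} FPS, 1% low: {one_percent_low_fps:.1f} FPS")
```

A couple of long frames barely move the average but crater the lows, which is exactly the stutter you feel in a firefight.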
Now, having said that, I'm not sure what kind of impact we would see from the 10900K over something like a 9900K.
That is really strange. Almost makes me wonder if something else was going on with the test system, related to chipset drivers or GPU driver compatibility. (Did the Threadripper 2xxx have that same Infinity Fabric issue with latency differences between dies that the original Threadripper had? I know mine doesn't suffer from it, but I can't remember whether they fixed that in the 2xxx series or in the 3xxx series.)
In all of my own testing across the countless systems I have built over the years, with different configurations for both myself and others, I have never seen a single instance where added resolution (or any GPU setting at all, for that matter) had any impact on CPU load.
The two have always been 100% independent of each other in every test I have ever run. You know, like this:
Test a GPU with the fastest CPU on the market to minimize CPU impacts. Note performance.
Test a CPU with minimal GPU settings in order to minimize GPU effects on the numbers.
Now, when you combine the two in the same system, the performance you get is the minimum of the two. On a functioning system, at a given frame rate, I've never seen the CPU be impacted by any graphics settings, including resolution. The only time I've seen any CPU impact from changing graphics settings is when you turn the settings up far enough that you become GPU limited, which lowers the overall system frame rate and thus puts less load on the CPU.
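To put that "minimum of the two" idea in concrete terms, here's a toy model. Every number in it is invented for illustration; none of it comes from an actual benchmark:

```python
# Toy model: the observed frame rate is capped by whichever of the CPU or GPU
# is slower at preparing frames. All values are hypothetical.

# CPU ceiling: frames per second the CPU can feed, independent of resolution
# (game logic and draw calls don't scale with pixel count).
CPU_FPS_CEILING = 140

# GPU throughput at each resolution (more pixels -> fewer FPS).
GPU_FPS_BY_RESOLUTION = {
    "1920x1080": 220,
    "2560x1440": 150,
    "3840x2160": 75,
}

def system_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The system runs at the minimum of the two independent limits."""
    return min(cpu_fps, gpu_fps)

for resolution, gpu_fps in GPU_FPS_BY_RESOLUTION.items():
    fps = system_fps(CPU_FPS_CEILING, gpu_fps)
    limiter = "CPU" if fps == CPU_FPS_CEILING else "GPU"
    print(f"{resolution}: {fps} FPS ({limiter}-limited)")
```

In this model, raising the resolution only drags the GPU number down; it never touches the CPU ceiling, it just changes which of the two ends up being the limit.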
Now, I've never played (or even heard of) the Destiny series, but since getting my Threadripper 3960X I have run several titles, all of them at 4K, including Far Cry 5, Far Cry New Dawn, and The Outer Worlds. My GPU has been brought to its knees at 4K in all of these titles, but I have never seen any indication that the CPU was inadequate.
I really can't help but suspect you uncovered either a driver bug of some sort or the Infinity Fabric latency issue that was well known with early Threadrippers.