BTW, thanks a lot for using settings one would actually use for gaming, not the low/mid 720p/1080p settings so many reviews out there use.
I think both have value.
Using minimum settings and a low resolution, you see the potential performance a CPU can deliver if you pair it with a really fast GPU.
This is valuable for theoretical comparisons and for getting an idea of how "future proof" the CPU is.
Benchmarking a CPU at real-world settings is easier to understand for those who don't realize they will in all likelihood be GPU limited, and it helps minimize the arguing over irrelevant CPU performance differences that no one will ever actually experience because of that GPU limitation.
Honestly, if I had to pick just one, I'd pick the CPU benchmark at minimum settings and low resolution. Using that plus GPU benchmarks, I can project what my performance will be in almost any combination of hardware. If I only have the real-world benchmark, I can only really predict how that one specific CPU/GPU combination will perform.
So yeah, I like to see both, but forced to choose I'd take the "everything graphical at minimum settings and resolution with an overkill top-end GPU" run, just to isolate the CPU performance, because that is honestly more useful.
If I have that CPU-isolated data, I can look at a CPU benchmark and a GPU benchmark separately, take the lower framerate of the two, and know that's likely a reasonable prediction of how they will perform together. That is of much greater value to me than real-world benchmarks on a system with a GPU I may or may not ever own.
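To make that rule of thumb concrete, here's a minimal sketch of the projection in Python. The CPU/GPU names and framerate numbers are made up purely for illustration; the only idea it demonstrates is "combined performance is roughly the lower of the two isolated results."

```python
# Rough sketch of the "take the lower of the two" projection.
# All names and framerate numbers below are hypothetical.

# CPU-limited results: minimum settings, low resolution, overkill GPU
cpu_fps = {"CPU_A": 210, "CPU_B": 165}

# GPU-limited results: real-world settings on a very fast CPU
gpu_fps = {"GPU_X": 140, "GPU_Y": 95}

def projected_fps(cpu: str, gpu: str) -> int:
    """Estimate combined performance as the lower of the two isolated results."""
    return min(cpu_fps[cpu], gpu_fps[gpu])

for cpu in cpu_fps:
    for gpu in gpu_fps:
        print(f"{cpu} + {gpu}: ~{projected_fps(cpu, gpu)} fps")
```

It's only an approximation (frame pacing, engine quirks, etc. can shift things), but it's usually close enough to decide whether a CPU or GPU upgrade is the one that will actually show up on screen.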