Intel Core i9-14900K vs AMD Ryzen 7 7800X3D Gaming Performance in 20 Games

Brent_Justice

Administrator
Staff member
Joined
Apr 23, 2019
Messages
931
Points
93
Introduction: In this gaming performance review, we are going to compare two CPUs head-to-head across 20 games to see who the champion is now that Intel's 14th Gen CPUs have been released. The question we want to answer is how the Intel Core i9-14900K and AMD Ryzen […]

See full article...
 
Hi... great article btw!

It stood out to me that every test was 1080p (max settings at least!)

Are we yet at a stage where we can include 1440p or 4K in tests like this? Because I just shudder to think of the LOE that went into making this, I want to be absolutely clear: I AM NOT COMPLAINING. Just asking. Thanks!! :)
 
Hi... great article btw!

It stood out to me that every test was 1080p (max settings at least!)

Are we yet at a stage where we can include 1440p or 4K in tests like this? Because I just shudder to think of the LOE that went into making this, I want to be absolutely clear: I AM NOT COMPLAINING. Just asking. Thanks!! :)

Every CPU review we have done includes 4K, 1440p, and 1080p game testing. I believe in that from a CPU review perspective, so we do it, even though it's a lot of work and time to include.

https://www.thefpsreview.com/2023/10/17/intel-core-i9-14900k-cpu-review/7/#ftoc-heading-18

However, for a 20-game round-up it was simply too much to get done in a reasonable amount of time; plus, what we are testing is purely CPU performance, and I wanted to show the biggest differences. I did note in the review, at the top of all the gaming pages, that 4K showed no performance differences, 1440p a little, and 1080p the most, and thus I chose 1080p for all the benchmarks.

In the conclusion, I also touched on the idea that if you are playing games at 4K or 1440p, this CPU performance comparison is moot, because there are small to no differences. In every CPU review I've done, the data has proven over and over that testing at 4K is pointless; you have a better argument for 1440p, but not 4K at this time.
 
Fair enough. I figured that was the case. The LOE for this review alone easily borders on 40+ hours. Adding another resolution to the run could easily increase that by 33%.
 
That was a ton of work to essentially end up with a "tie". Yikes.
 
I suspect (but don't know for sure) that a lot of what is holding the 7800X3D back is the Infinity Fabric and the effective limits it places on RAM bandwidth. Intel's chips are absolutely running away with RAM benchmarks right now.

If revisiting and boosting the Infinity Fabric to improve memory performance isn't on AMD's plate for Zen 5, it probably should be.
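
For what it's worth, a crude way to sanity-check raw memory throughput on a given box is just timing a large copy. A rough NumPy sketch (the array size and loop count are arbitrary, and this is nowhere near a proper STREAM/AIDA-style run):

```python
# Crude memory-copy bandwidth check; a rough stand-in for a real
# STREAM/AIDA-style benchmark, with arbitrary sizing.
import time
import numpy as np

N = 256 * 1024 * 1024 // 8        # ~256 MB of float64, well past the caches
src = np.ones(N)
dst = np.empty_like(src)

best = float("inf")
for _ in range(5):                # keep the fastest of a few runs
    t0 = time.perf_counter()
    np.copyto(dst, src)           # one read + one write per element
    best = min(best, time.perf_counter() - t0)

gb_moved = 2 * src.nbytes / 1e9   # bytes read plus bytes written
print(f"~{gb_moved / best:.1f} GB/s effective copy bandwidth")
```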
 
Hi... great article btw!

It stood out to me that every test was 1080p (max settings at least!)

Are we yet at a stage where we can include 1440p or 4K in tests like this? Because I just shudder to think of the LOE that went into making this, I want to be absolutely clear: I AM NOT COMPLAINING. Just asking. Thanks!! :)

Higher resolution is completely wasted on a CPU benchmark. You'll just add in GPU limitations which taint the result.

I used to be a proponent of low resolution and minimal settings for CPU benchmarks, but I have reassessed that lately. Some games add more NPCs and other things that can have a CPU impact when you raise the settings, so the best you can do for a CPU comparison is to turn up the settings but really turn down the resolution.

Then you can look at GPU and CPU reviews independently of each other. Whichever one gets the lower framerate in title X, that is the performance the combination will get (or at least close to it).

If CPU X gets 78 FPS and GPU Y gets 99 FPS (at a specific resolution) in the same title, then the combination of the two will likely result in 78 FPS, and you will likely be CPU limited. And vice versa.
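
In rough code form, that back-of-the-envelope estimate looks like this (the 78/99 numbers are just the hypothetical example above, not measured data):

```python
# Toy sketch of the mix-and-match estimate: the combined system runs at
# roughly the rate of whichever component is the bottleneck.
def estimate_fps(cpu_bound_fps: float, gpu_bound_fps: float) -> float:
    return min(cpu_bound_fps, gpu_bound_fps)

cpu_x = 78.0   # CPU X in a CPU-limited (low-res) test of title X
gpu_y = 99.0   # GPU Y in a GPU-limited test of the same title

combined = estimate_fps(cpu_x, gpu_y)
limiter = "CPU" if cpu_x < gpu_y else "GPU"
print(f"Expected ~{combined:.0f} FPS, likely {limiter} limited")
# Expected ~78 FPS, likely CPU limited
```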

Once you start cranking up the resolution, you start seeing confounding effects in the data, and then you can no longer do these kinds of mix-and-match comparisons. I very much don't want that to happen, as there are infinite combinations of hardware. Whatever we can do to keep the test results independent of each other at least results in a scenario where we can arrive at pretty good guesses about how a specific CPU and GPU will perform together.

Otherwise we'd need test data from every single combination, and that's just never going to happen.
 
In the conclusion, I also touched on the idea that if you are playing games at 4K or 1440p, this CPU performance comparison is moot, because there are small to no differences. In every CPU review I've done, the data has proven over and over that testing at 4K is pointless; you have a better argument for 1440p, but not 4K at this time.
This is not entirely accurate. I found out with the second-generation Threadrippers that while their benchmark numbers showed they were good enough for 4K gaming, they actually weren't. They had lower lows and higher highs than other CPUs, leading to a similar average, but actual frametime analysis proved that they were incapable of providing a quality 4K gaming experience. At the time, only the 9900K was capable of maintaining a solid 60+ FPS framerate, and none of the AMD CPUs I tested (which included the Ryzen 3000 series and Threadripper 2000 series) could do that.
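
To illustrate why the averages hid it, here's a quick sketch with made-up frametimes (not my actual Threadripper captures): two runs can land on nearly the same average FPS while the 1% lows tell a very different story.

```python
# Made-up frametimes (ms), purely to illustrate averages vs. 1% lows;
# these are not captured benchmark data.
def fps_stats(frametimes_ms):
    frames = sorted(frametimes_ms)                     # slowest frames at the end
    avg_fps = 1000.0 * len(frames) / sum(frames)       # average FPS over the run
    worst = frames[-max(1, len(frames) // 100):]       # slowest 1% of frames
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)    # "1% low" FPS
    return avg_fps, low_1pct_fps

smooth = [16.7] * 990 + [18.0] * 10    # consistent frame pacing
spiky  = [14.0] * 950 + [70.0] * 50    # similar average, big stutter spikes

for name, run in (("smooth", smooth), ("spiky", spiky)):
    avg, low = fps_stats(run)
    print(f"{name}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")
# smooth: avg 60 FPS, 1% low 56 FPS
# spiky: avg 60 FPS, 1% low 14 FPS
```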

Granted, these aren't the CPUs being sold today, and trying to build a 4K rig out of parts that old would be a waste of time and money, but I think the point still stands. These types of issues are why I think testing at 4K in CPU articles makes sense: internal latencies or other failings could be an issue for gamers or professionals who might also use their workstations for gaming.

But I agree with you that for the purpose of a 20-game roundup, 1920x1080 is what makes sense. I would tend to think that the best CPUs at 1080p are going to be the best CPUs at 1440p and 4K as well. (Data over time has usually confirmed this.)
 
I expected the performance gap to favor Intel by a much larger margin. I mean, it has more cores and a higher clock speed to begin with. Great showing from the 7800X3D.
 
This is not entirely accurate. I found out with the second-generation Threadrippers that while their benchmark numbers showed they were good enough for 4K gaming, they actually weren't. They had lower lows and higher highs than other CPUs, leading to a similar average, but actual frametime analysis proved that they were incapable of providing a quality 4K gaming experience. At the time, only the 9900K was capable of maintaining a solid 60+ FPS framerate, and none of the AMD CPUs I tested (which included the Ryzen 3000 series and Threadripper 2000 series) could do that.

Granted, these aren't the CPUs being sold today, and trying to build a 4K rig out of parts that old would be a waste of time and money, but I think the point still stands. These types of issues are why I think testing at 4K in CPU articles makes sense: internal latencies or other failings could be an issue for gamers or professionals who might also use their workstations for gaming.

But I agree with you that for the purpose of a 20-game roundup, 1920x1080 is what makes sense. I would tend to think that the best CPUs at 1080p are going to be the best CPUs at 1440p and 4K as well. (Data over time has usually confirmed this.)

I disagree, since the performance difference will pretty much vanish at 4K. A CPU that is faster at 1080p, even if the gap is bigger there, won't translate into better performance at 4K.

That said, I prefer a real-world scenario, even though 1440p and up will probably end up being boring.
 
I disagree, since the performance difference will pretty much vanish at 4K. A CPU that is faster at 1080p, even if the gap is bigger there, won't translate into better performance at 4K.
This is one possible extrapolation of 4K performance from a 1080p test, but not the only one.

Another is that interruptions in frame delivery - which affect game 'feel' - will be just as pronounced but harder to catch and document, while still affecting end-user experience.

Of course, the real story of the 7800X3D vs. the world (meaning Intel, but also other AMD CPUs) is one of pure gaming tuning versus great gaming plus better to far better content-creation performance.

If gaming is the only concern, there's only one 'best' choice, unless there's a specific scenario that leans one way or the other versus the mean. And if content creation is a significant concern alongside gaming, then there's also really only one choice. Which is why you see folks like J2C switching back to the 14900K.

Things only really get more interesting when a budget is applied.
 
I disagree, since the performance difference will pretty much vanish at 4K. A CPU that is faster at 1080p, even if the gap is bigger there, won't translate into better performance at 4K.

That said, I prefer a real-world scenario, even though 1440p and up will probably end up being boring.

That's just it. In that scenario at 4K, you are no longer benchmarking the CPU; you are benchmarking the GPU, so it is completely useless data.


Just run all CPU benchmarks at high settings and the lowest resolution possible to see what the CPU is capable of. Then look at GPU reviews at the resolution you are interested in to see if you will be CPU or GPU limited.

You don't have to get all of your information spoon-fed to you in one review. And you can't: it is impossible for reviewers to test every permutation and combination of CPU, settings, and GPU.
 
That's just it. In that scenario at 4K, you are no longer benchmarking the CPU; you are benchmarking the GPU, so it is completely useless data.
You're still benchmarking the system, which is what determines the user experience.

And you do want to have higher-resolution benchmarks to compare to, to make sure that there are no relative deviations from 1080p - though this is more important for GPUs than CPUs, of course.
 
When you throw DLSS or any other scaling software into the mix, you're no longer reviewing the GPU but the driver/software features of a specific line. You might as well only compare that 'chipset's' GPUs against each other to see which is better.
 
This is not entirely accurate. I found out with the second-generation Threadrippers that while their benchmark numbers showed they were good enough for 4K gaming, they actually weren't. They had lower lows and higher highs than other CPUs, leading to a similar average, but actual frametime analysis proved that they were incapable of providing a quality 4K gaming experience. At the time, only the 9900K was capable of maintaining a solid 60+ FPS framerate, and none of the AMD CPUs I tested (which included the Ryzen 3000 series and Threadripper 2000 series) could do that.
That's why minimum FPS is more important than average.
 
I disagree, since the performance difference will pretty much vanish at 4K. A CPU that is faster at 1080p, even if the gap is bigger there, won't translate into better performance at 4K.
Again, back in the day the testing I did showed otherwise. While the numbers looked good for Threadripper, it wasn't capable of delivering a consistent and smooth gameplay experience at 4K in some games. The minimum frame rate often fell below 30, with huge spikes well beyond the other CPUs', leading to misleading averages.

I have not tried to repeat that testing with the current crop of CPUs. And frankly, I don't think the issues would be as egregious with a modern Threadripper or any other currently available CPU. We know the modern AMD Ryzen 7000 series isn't as bad at gaming as the 2000 and 3000 series were, and all the current Intel and AMD CPUs are a lot better for games. The only thing I would say is that I'd bet there isn't "no difference" at 4K; I just don't know how big the difference between these CPUs would be.
That's why minimum FPS is more important than average.
I know. My take on this is simply that while the difference between CPUs at 4K shrinks considerably compared to other resolutions, my previous testing leads me to believe there is probably a bigger gap between CPUs at 4K than we might like to think. All the data I've seen and gathered myself over several years has led me to that conclusion.

But I could be wrong in regards to all the currently available CPU's.
 
Hi, did you run Counter-Strike 2 on the Intel 14900K with the E-cores disabled or with commands like "-threads 9" in the launch options? Thank you.
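
For reference, one per-game way to approximate disabling the E-cores (without touching the BIOS) is pinning the game process to the P-cores only. A rough sketch, assuming psutil is installed and that the P-cores enumerate as the first 16 logical CPUs on the 14900K, which is worth verifying on your own board:

```python
# Pin a running game process to the P-cores as a per-game stand-in for
# disabling E-cores in the BIOS. Assumes the 14900K's 8 P-cores (16 threads)
# enumerate as logical CPUs 0-15; verify with your own topology first.
import psutil

P_CORE_CPUS = list(range(16))   # assumed P-core logical CPU IDs

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == "cs2.exe":   # assumed process name
        proc.cpu_affinity(P_CORE_CPUS)
        print(f"Pinned PID {proc.pid} to CPUs {P_CORE_CPUS}")
```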
 