AMD Ryzen 7 9850X3D CPU Review: Fastest 8-Core

Brent_Justice

Introduction: The AMD Ryzen 7 9850X3D was announced at CES 2026, and now it is launching with availability on January 29th, 2026, at an MSRP of $499. The AMD Ryzen 7 9850X3D is AMD's top-end single-CCD part with 3D V-Cache, intended to directly provide the best gaming experience, and is marketed as a […]

See full article...
 
That is an impressive 8-core single-CCD CPU. Very nice...

A single-CCD 12-core 3D V-Cache CPU would be awesome. :)
 
This is the only review showing less power use vs. the 9800X3D; all other tests show the 9850X3D consuming more power.

 
This is the only review showing less power use vs. the 9800X3D; all other tests show the 9850X3D consuming more power.
Brent can probably speak to this better, but there are a ton of variables that go into power consumption, including the game, the resolution, and the settings selected for testing. We don't typically do far-ranging power analysis. Tom's, for example, seems to be running all their standard tests through Benchlab, which makes it easy to pull together a ton of power graphs that correlate to the benchmarks run. Our power testing tends to focus on games.

There can also be differences in where the power is measured: Benchlab (which they used on the power-testing page) is an external pass-through that doesn't directly measure CPU package power, whereas we pull our power measurements from HWiNFO64. Though, really, I can't tell what method they used to measure CPU power in their gaming benchmarks, as it isn't disclosed...
 
This is the only review showing less power use vs. the 9800X3D; all other tests show the 9850X3D consuming more power.

As stated in the review: "On this page, we are going to test the power draw on the CPUs, testing multi-core performance in Cinebench 2026 running for 15 minutes."

The method I used (and always have used) is a simple Cinebench multi-core, full-load power test. I run the Cinebench multi-core benchmark for at least 15 minutes, looping. I have HWiNFO up on the screen and monitor the reported "Package Power". I then graph the maximum, or peak, power number.
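Since the method boils down to "watch HWiNFO and record the peak," here is a minimal sketch of automating that last step by parsing an HWiNFO sensor-log CSV instead of eyeballing the screen. The log file name and the exact column header are assumptions; they vary by HWiNFO version and platform:

```python
# Minimal sketch: pull the peak package power out of an HWiNFO64 CSV
# sensor log captured during the Cinebench loop. The log path and the
# "CPU Package Power [W]" column name are assumptions that may differ
# by HWiNFO version and CPU.
import csv

LOG_PATH = "hwinfo_log.csv"          # hypothetical log file name
COLUMN = "CPU Package Power [W]"     # sensor column name may vary

peak = 0.0
with open(LOG_PATH, newline="", encoding="utf-8", errors="ignore") as f:
    for row in csv.DictReader(f):
        value = (row.get(COLUMN) or "").strip()
        try:
            peak = max(peak, float(value))
        except ValueError:
            continue  # skip blank samples or repeated header rows

print(f"Peak package power: {peak:.1f} W")
```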

I do not claim that this is a thorough or detailed power test. It is meant to validate all-core power loads against the quoted or rated TDP from AMD and Intel. I am aware that TDP is the thermal design power recommendation for cooling.

I do claim this is a valid measurement, and a real-world scenario that an end user would realistically encounter. For example, if you are editing, encoding, or transcoding video, running 3D renders, or doing anything that pushes all cores to the max, this is the power you would experience in real-world usage.

This is different from testing power while gaming; it is not showing gaming power. I have done that in the past on certain CPUs, but this was not that. You will note that Tom's testing was in games, which presents a very different load from what I was doing; I was using productivity-based testing.

In addition, I am aware there are more accurate ways to capture power usage, but our reviews don't focus on that aspect. This method shows the 'cap' the motherboard BIOS sets for package power: we can see that power runs above the rated "TDP" number, and we can see where it caps out. I encourage you to look at other reviews for more detailed power testing and analysis.
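For readers who want to sanity-check that cap themselves, here is a tiny hedged sketch comparing a rated TDP to AMD's socket power limit (PPT), which on desktop parts has historically defaulted to about 1.35x TDP. The observed-peak figure below is a placeholder, not a measurement from this review:

```python
# Why peak package power lands above the rated "TDP": on AMD desktop
# CPUs the actual BIOS-enforced socket limit is PPT, historically about
# 1.35x TDP (e.g. 120 W TDP -> 162 W PPT). The ratio and the observed
# peak below are illustrative assumptions, not review measurements.
TDP_W = 120.0             # rated TDP
PPT_RATIO = 1.35          # typical AMD desktop PPT/TDP default
OBSERVED_PEAK_W = 160.0   # placeholder HWiNFO peak from the test

ppt_w = TDP_W * PPT_RATIO
print(f"Expected package-power cap (PPT): {ppt_w:.0f} W")
print(f"Observed peak over TDP: +{OBSERVED_PEAK_W - TDP_W:.0f} W")
print(f"Headroom left under PPT: {ppt_w - OBSERVED_PEAK_W:.0f} W")
```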

I hope that helps.
 
One thing I'll also add to my statement: testing power while gaming will be very different from all-core, full-load testing because, while gaming, the CPU cores boost to higher frequencies and voltages, since fewer of them are in use and the workloads are lighter. Therefore, the peak power registered on such things as the EPS rails will peak higher than when pushing all cores to full load at the same time; under an all-core load, clock frequencies and voltages are lower, and thus it peaks lower overall.
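To make that concrete, here is a toy steady-state model using the classic dynamic-power relation P ~ C·V²·f per active core. The clocks and voltages are illustrative assumptions, not 9850X3D measurements, and millisecond transients on the EPS rails are outside what this simple model captures:

```python
# Toy model of the boost-vs-all-core effect: per-core dynamic power
# scales roughly with C * V^2 * f. Clocks and voltages below are
# illustrative assumptions, not measurements of any specific CPU.
def per_core_power(volts: float, ghz: float, c: float = 10.0) -> float:
    """Relative dynamic power per core: C * V^2 * f (arbitrary units)."""
    return c * volts**2 * ghz

gaming = per_core_power(volts=1.25, ghz=5.6)    # few cores busy, high boost
all_core = per_core_power(volts=1.10, ghz=5.0)  # all cores busy, lower clocks

print(f"Per-core power, light gaming-style load: {gaming:.1f} (a.u.)")
print(f"Per-core power, all-core full load:      {all_core:.1f} (a.u.)")
print(f"Boosted cores draw ~{100 * (gaming / all_core - 1):.0f}% more each")
```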
 
That makes sense to me. You're not investing in power-measurement devices like others are, as that's not your intended level of testing. This is more of a user-experience, repeatable test than a specialized breakdown. It's clear the technical chops are here, and you're in essence providing the tools and guidance for us more... savvy, I dare say... readers to replicate the testing locally, letting us more easily identify anomalous behaviors that may or may not exist. (Looks cautiously at ASRock motherboards.)
 
Nice CPU for sure, but I'll be waiting for one (maybe the 9950X3D2) or something else down the road to replace the 5800X3D in my spare rig; the main one with a 9800X3D is plenty for now.

As always, thanks for the review!
 
Why all the hoopla on a CPU that's not worth buying?

It's only a little bit faster than the 9800X3D, but it's more expensive, draws more power, and generates more heat.

I don't see any scenario where getting a 9850X3D over a 9800X3D makes sense.
 
Why all the hoopla on a CPU that's not worth buying?

It's only a little bit faster than the 9800X3D, but it's more expensive, draws more power, and generates more heat.

I don't see any scenario where getting a 9850X3D over a 9800X3D makes sense.
The folks who have to have "the best"
 
Why all the hoopla on a CPU that's not worth buying?

It's only a little bit faster than the 9800X3D, but it's more expensive, draws more power, and generates more heat.

I don't see any scenario where getting a 9850X3D over a 9800X3D makes sense.

Should depend on pricing and availability more than anything. My guess is that we'll see the 9800X3D chips start drying up, leaving the 9850X3D as the only choice, and/or the price point will be so close that the extra $10-20 might as well be spent.

My original theory was that the 9800X3D would go away and be replaced by a lower-binned 9700X3D to provide some separation, but AMD has stated that 9800X3D production will continue (of course, that may mean just for one day...).
 
My guess is that we'll see the 9800X3D chips start drying up, leaving the 9850X3D as the only choice, and/or the price point will be so close that the extra $10-20 might as well be spent.
I've seen this suggested by others. I think they have contacts too. Starting to think this is a leak. :p
 
If the rumors of the 9700X3D are true, that's one more kick to Arrow Lake. The 265K will lose its appeal quickly, since the 9800X3D's $449 price was making it turn up in some people's shopping carts. Now those people could opt for the 9700X3D if they can get it at something like $389.
 
Using Canadian pricing as an example, I don't see an imaginary CPU causing any issues for Intel.

265KF is $399
9700X is $460
7800X3D is $550
14900K is $620
9800X3D is $640
9850X3D is $720
285K is $780

Intel is the value leader right now, given the price of RAM: a full $240 difference between a 265K and a 9800X3D. I won't even talk about the 9850X3D, because the pricing makes zero sense; CAD $80 more is just terrible.
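For the curious, a quick sketch tallying those gaps from the prices quoted above; all figures are CAD from this post and will drift with retailers and time:

```python
# CPU prices in CAD as quoted in this post; they will drift over time.
prices_cad = {
    "265KF": 399, "9700X": 460, "7800X3D": 550, "14900K": 620,
    "9800X3D": 640, "9850X3D": 720, "285K": 780,
}

print(f"265KF -> 9800X3D gap:   ${prices_cad['9800X3D'] - prices_cad['265KF']}")
print(f"9800X3D -> 9850X3D gap: ${prices_cad['9850X3D'] - prices_cad['9800X3D']}")
```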
 
The 7500X3D is available locally for me for less than a 265KF. Great move, as it should work decently even with DDR5-4800.
You read that ComputerBase article, right? X3D can use rock-bottom JEDEC memory, and in single channel no less, while losing very little gaming performance. It is reflected in retail sales positions: LGA 1700 easily outsells LGA 1851 here in the States because of DDR4. AMD dominates sales because of a number of factors: Intel's DIY mindshare is at an all-time low, AM5 has the fastest gaming CPUs and offers an upgrade path, and AM4 is cheap, ubiquitous, and DDR4.

The Arrow Lake refresh looks to be as big a snoozer as this SKU. Nothing exciting is going to happen until Nova Lake. Where are my Intel desktop APUs that can game like the Arc 390? You know how much I like APUs. Where is my Bartlett Lake drop-in for Z790? Everyone is ignoring us for the AI gold rush, especially Red and Green.
 
Where are my Intel desktop APUs that can game like the Arc 390?
Probably never gonna happen. A performant iGPU like that needs gobs of RAM bandwidth, and we know high-speed RAM is now the baby that was thrown out with the bathwater. I used to wonder why AMD never had good APUs (better than the 8700G), and the answer seems to be that they knew they would be bottlenecked by low-speed RAM support, hence Strix Halo shipping with 8000 MT/s RAM to make sense as a real product.
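As a back-of-envelope check on that bandwidth argument, here is a small sketch of peak theoretical DDR5 throughput at various speeds. Dual-channel desktop DDR5 is a 128-bit bus; Strix Halo widens that to a 256-bit LPDDR5X interface, which is the real trick alongside the 8000 MT/s:

```python
# Peak theoretical DDR5 bandwidth: transfers/s * bytes per transfer.
# Speeds and bus widths below are illustrative; real-world throughput
# is lower due to efficiency losses.
def ddr5_bandwidth_gbs(mt_per_s: int, bus_bits: int = 128) -> float:
    """Bandwidth in GB/s for a given transfer rate and bus width."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

for speed in (4800, 6000, 8000, 9600):
    print(f"Dual-channel DDR5-{speed}: {ddr5_bandwidth_gbs(speed):.0f} GB/s")

# Strix Halo-style 256-bit LPDDR5X-8000 for comparison:
print(f"256-bit LPDDR5X-8000: {ddr5_bandwidth_gbs(8000, 256):.0f} GB/s")
```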

It's gonna be fun reading the Ryzen AI desktop APU reviews when they finally launch. Hopefully someone (maybe ComputerBase) will test all the way up to 9600 MT/s to see exactly at which speed the memory bottleneck is alleviated and the tiny GPU itself becomes the problem.

Where is my Bartlett Lake drop-in for Z790?
That's Pat all the way; he just didn't want anything competing with Arrow Lake. LBT is a "science" or engineering guy, not a PC guy. He will listen to whatever he is told by the people Pat hired or placed to lead the existing teams. He is not gonna go, "Hey, Raptor Lake beats Arrow Lake, and Bartlett Lake is just rebranded Raptor Lake, so why can't we make a Bartlett Lake Gaming Limited Edition CPU to help sales along?"
 