Intel Core Ultra 9 285K & Ultra 5 245K CPU Review

I can see how some people may want it for their use cases (it does really well in Python 3).
If I had to get a pure content-creation or otherwise compute-intensive build that wasn't a Mac... the 285K is where I'd be. I guess with the caveat that Threadripper wasn't on the table either, lol!
 
Threadrippers are more or less out of the picture now unless you want to run one with a single stick of RAM. The price of populating a quad/octa-channel platform really hurts when it comes to RAM, and that was BEFORE RAM prices went crazy!
 
Weird, someone other than myself did some testing and found Arrow Lake performing decently in games at higher resolutions.

I guess I wasn't the paid Intel shill everyone accused me of being. I didn't see a lot of "niche situations", unless of course we now consider gaming at 4K "niche".
 
What was that, gorilla marketer? I tease, I tease...
 
[attached screenshots]
I can "feel" the jealousy written all over that!

So voting Intel is not favoritism, but paying AMD more for X3D is "supporting" the billion-dollar brand (AMD, the billion-dollar gorilla! WOW!), and apparently people need "convincing" by AMD fanboys to buy AMD CPUs :D

Good old loserbench :D
 
You're so sensitive. :) I hold no brand allegiance. Never have. Yours is telling. But it's all good. Boys be fans. :) Some even market... in the jungles.

I want Intel to come out swinging and the pendulum to shift. It's good for all consumers when that happens. :)
 
Don't get me wrong, I want the same to happen too, but it's Intel that has made any half-decent resurgence of their former glory extremely difficult for themselves.
 
If we take loserbench data without a grain of salt (yeah, I know!), the 9800X3D-to-285K sales ratio is a staggering 28 to 1 (!!!).

Other data:

[attached screenshots]


The crazy outliers seem to be:

[attached screenshots]

One must ask, how many of these users are on their 2nd or 3rd or even nth Raptor chip???
 
Yeah, between the failure rates due to, let's call them... bad configs... and where we are now, it kind of reminds me of the Xbox 360 and its insane shipment numbers... I always wondered how many of those were due to RROD replacements.
 
I was working for a shipper at the time... literally loaded trailers for both 360s and PS3s, exclusively, that went straight to their respective repair depot addresses in south Texas (whereas others might go to a part of a state, or even groups of states depending on distance). At the time we knew how many could actually fit in each size of trailer lol.

That first mass run of lead-free solder really screwed a lot of consumer electronics.
 
My PS3 Slim was destroyed by cockroaches, of all things (out of warranty obviously). Called up Sony and they were like, sure, we will fix it. Only around $200!

I simply cursed the cockroaches and killed them with a vengeance from then on. And that's how I wasn't able to finish Catherine. I was more than two thirds through it.
 
Both our launch-era 360s got the RROD. MS had streamlined the process by then and taken the $1 billion-plus write-off, so it was at least a fast and easy process. They gave us a newer version with HDMI each time. Those both eventually failed too. I knocked one off the AV shelf trying to unplug it right after it threw up the RROD; it crashed three feet to the floor and the USB door on the front broke off. I was thinking, that's great, now they won't repair it. I put it back, plugged it in, and it ran for another 5 years. Hypothesize for yourselves why that fixed it better than the X-clamp mod. My first IT instructor told us the number one rule of repair is: if it doesn't work, use a bigger hammer. We all laughed, but I can't count how many times "the Fonzie" has worked for me.

Ended up selling the other one as nerfed since it was no longer covered. Replaced it with the S model for my son, and that model finally ended the issues; it still worked the last time I fired it up a year or so ago.
 
Intel broke the latency on their chiplet configuration out of the gate on Arrow Lake.

Like, it's possible that a new 'bin' could suck... less? But the basic problem is that the very best they'll ever do is catch up to Raptor Lake, unless they're willing to sacrifice some die space for a big chunk of additional cache.
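
For anyone curious what "broke the latency" looks like as a number, below is a minimal pointer-chasing sketch of the kind of microbenchmark reviewers use to compare DRAM round-trip latency between parts. Assumptions: Linux/glibc and gcc; the 64 MiB buffer size and step count are arbitrary illustration choices, not anything from Intel or a specific review. Each load's address comes from the previous load, so prefetchers and out-of-order execution can't hide the trip through the chiplet fabric to memory.

/*
 * Pointer-chasing memory-latency sketch.
 * Build: gcc -O2 chase.c -o chase   (Linux/glibc assumed)
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N     (64UL * 1024 * 1024 / sizeof(size_t)) /* 64 MiB, well past the LLC */
#define STEPS 20000000L

static double now_sec(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

int main(void)
{
    size_t *buf = malloc(N * sizeof(size_t));
    if (!buf)
        return 1;

    /* Sattolo's shuffle: builds one big random cycle, so every load
     * depends on the previous one and hardware prefetchers can't help. */
    for (size_t i = 0; i < N; i++)
        buf[i] = i;
    srand(1);
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;
        size_t tmp = buf[i]; buf[i] = buf[j]; buf[j] = tmp;
    }

    size_t idx = 0;
    double t0 = now_sec();
    for (long s = 0; s < STEPS; s++)
        idx = buf[idx];                  /* dependent load chain */
    double t1 = now_sec();

    /* Printing idx keeps the compiler from optimizing the loop away. */
    printf("~%.1f ns per dependent load (end index %zu)\n",
           (t1 - t0) / STEPS * 1e9, idx);
    free(buf);
    return 0;
}

Run it on a Raptor Lake box and an Arrow Lake box with comparable RAM and the per-load figure is roughly the latency gap being discussed; a big slab of extra cache would only help workloads whose hot data then fits on-die.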
 
If only someone at Intel had been man enough to fight for eDRAM. With current fab process tech, they could've included 512MB of eDRAM in the same space the 128MB of eDRAM needed on Broadwell chips.

You need crazy people to fight for crazy stuff at companies. The mavericks. Just like the people who fought for the Ryzen AI Max+ 395 and made 96GB of VRAM possible on an iGPU. Sure, it doesn't matter much now, but even the 64GB SKU will play future games for years and years with that much VRAM. You now have HANDHELDS with 128GB of RAM. That would never have been possible without a fight. Certainly, Intel and Nvidia would never have greenlit something like that, because they think consumers are born to be squeezed and exploited for their pleasure.
 
eDRAM lost relevance as DDR4 speeds and latencies improved. Unless current eDRAM implementations were significantly faster and lower-latency than aftermarket DDR5, it wouldn't provide much benefit. Certainly not compared to stacked V-Cache.
 
512MB of eDRAM would've had much better latencies AND bandwidth than at least 80% of worldwide DDR5 RAM sales. They just needed to iterate on the technology with incremental generational refinements. They failed at that. They failed with Optane. And people blame Krzanich for Intel's issues. Pat is the one who could've at least changed the company's downward trajectory with CPU-performance-focused decisions. But he acted like a fool who had forgotten all about CPU technology and architecture, as if he had studied it in kindergarten and could no longer recall the timeless lessons. And oh, who killed Royal Core and Beast Lake?
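
For a rough sense of the bandwidth side of that argument, here's the back-of-the-envelope math. Assumptions: DDR5-5600 as a stand-in for a typical mainstream dual-channel kit, and the Crystal Well figures Intel quoted back in the Haswell/Broadwell era; a modern 512MB part is pure speculation.

Dual-channel DDR5-5600: 2 channels × 8 bytes × 5600 MT/s ≈ 89.6 GB/s peak, shared between reads and writes.
Broadwell-era Crystal Well eDRAM: roughly 50 GB/s read + 50 GB/s write over its dedicated link, ≈ 100 GB/s aggregate, at noticeably lower latency than a trip out to DRAM.

So a scaled-up on-package eDRAM could plausibly beat a commodity DDR5 kit on paper; whether it beats stacked V-Cache is a different question, since V-Cache sits much closer to the cores at far lower latency.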
 
Taken from Reddit:

DDR3-2600 and a 4.2 GHz OC is about the best it will get for that CPU.

I have one that does 4.1 GHz and has DDR3-2133.


[AIDA64 screenshot: i7-5775C]
I built a 5775C system just for fun; I didn't have the means to buy one when the chip first came out, and it was impossible to find (and expensive) anyway.

The motherboard I have (ASRock Z97) has a Gen 3 x4 NVMe slot and a Gen 2 slot, so it can run relatively fast storage.

It's a fun little system that I put my 4090 and 5090 in to mess around with.

[Screenshot: i7-5775C + RTX 5090, 3DMark first run]
All that said, it really does show its age, but it's not completely useless.
 