AMD Is Rumored to Be Launching Ryzen 7000 X3D Series in 2023

Peter_Brosdahl

AMD is rumored to be launching X3D versions of its new Ryzen 7000 series processors in 2023 following a leaked slide showing a current roadmap. The slide titled "2022-2023 Channel...

 
Neat, I guess. I'm thinking 2024 before I look at updating my CPU, motherboard, RAM, and storage again. And storage is a maybe.
 
I'm like Grim: I'm good for now and wasn't considering upgrading. But unless I were in dire need, given how much the 5800X3D improved things, I'd strongly consider holding off for the 7800X3D for a new gaming rig. Not that the 7000 series looks bad; it's just that the extra cache helped gaming a good deal more than I expected.

OTOH - it should be a drop-in upgrade on this platform... so you could bite the bullet now and just upgrade the CPU later on.
 
OTOH - it should be a drop-in upgrade on this platform... so you could bite the bullet now and just upgrade the CPU later on.
If you have a good outlet for used hardware, you could go with the 7600 now, then sell it and upgrade to the 7800X3D in the summer. It's actually a pretty good plan if you have that outlet.

The hint of a Threadripper is interesting - but only if it's early access to a 9000 series architecture. I'm completely uninterested in buying a 7000 series Threadripper right before the next gen drops, but if I could buy into the next gen right away, I could see myself going that route.
 
Calling Dr. Obvious... :rolleyes::rolleyes:
Well, I did that because an earlier leaked image, which looked like something from a 1960s Batman cam shot, didn't include the whole slide. VideoCardz later updated its own story with a new image from a different source showing the full slide title, and indicated this was likely a presentation for channel partners. We all have to be really careful with leaks due to takedown requests, though. I admit we all pretty much know what AMD will likely do, but it's still a tricky dance deciding how to say it.
 
If you have a good outlet for used hardware, you could go with the 7600 now, then sell it and upgrade to the 7800X3D in the summer. It's actually a pretty good plan if you have that outlet.
It's not that the 7600X isn't a fast CPU; it's that it's still a US$300 6-core CPU that requires DDR5, and the first wave of motherboards and CPUs is having a hard time even hitting DDR5-6400 speeds.

Now, we know AM5 is going to be around for a while, and that future AM5 CPUs might push memory speeds significantly further on current boards. There's just no guarantee.

The hint of a Threadripper is interesting
Honestly, yeah. Most X670E and even many Z690 (and probably many upcoming Z790) boards are still connectivity compromises. But AMD's track record with TR is pretty disappointing, probably worse than Intel's with their HEDT line, if only just.



And to the point of the expected Zen 4 AM5 CPUs with stacked L3 cache: the fact that the 7700X is about as fast as the 5800X3D across the board in games, in both averages and 1% lows, shows that more L3 cache helps, but that it doesn't have to be that much more. And with Intel also leaning on bigger caches with 13th-gen / Raptor Lake, we might be at the point where pretty much all mid-range and higher CPUs are outrunning available GPUs.

Looking forward to seeing how the 5800X3D, the AM5 Zen 4 CPUs, and Raptor Lake all compare here. With B650 boards imminent, the entry price for AM5 will drop too, and buyers will have quite the range of options!
 
And it really needs to.
Does make me wonder what the longest-serving architecture is/was.

Vega is only ~5 years old (Aug 2017), although it feels so much older since that was Raja's first over-hyped baby and COVID basically lasted forever.

A quick and probably completely inaccurate Google Search of "Longest Produced Graphics Chipset" points me to the Matrox MGA, which started production in 1978, and was used in some form or fashion clear through the Matrox Mystique and Millennium -- the last iteration of the MGA was 2001, when it was finally replaced by the Parhelia.
 
And it really needs to.

Eventually, someday, we will indeed get the APU revolution and completely kill off low-end dedicated cards. Totally, someday... maybe tomorrow, or in ten years, but totally, the low-end VGA market is dead as a dodo. Soon.
 
Eventually, someday, we will indeed get the APU revolution and completely kill off low-end dedicated cards. Totally, someday... maybe tomorrow, or in ten years, but totally, the low-end VGA market is dead as a dodo. Soon.
Pretty sure the last generation did that for us. No one is really playing in the <$300 market any more, except Intel trying.

AMD brought out a couple of half-hearted attempts with the 6400/6500, and nVidia rolled out some old rehashes (2060 and 1050 Ti) -- but all of it was pretty poorly received, generally priced much too high for the relative performance, and plagued by availability issues.

Granted, I haven't gone back to look since the mining boom came crashing down, but I also haven't really heard of anyone except Intel aggressively going after that low-end segment. AMD used to chase it hard, even just a generation back, and nVidia at least gave it token service when they split the RTX branding off. But as of this past generation (nVidia 3000 / AMD 6000), I think it's dead.
 
That is because the card makers were chasing mining profits. What I was really referring to, though, was the 4-5-year-long cry that the time of APUs was nigh, that we just had to wait 12 more months or something. Where are the APUs that can run 1080p games effectively? The 5600G/5700G sort of can, but you'd likely be better off with the aforementioned 6400/6500 or a 3-5-year-old low-end dedicated card like a 1060 3/6GB or an RX 570/580.

The mining crash has so far led to $300 6650 XT cards, which are at least 20% faster than a 2060 Super... for about $100 less, 3 years later.
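
Putting rough numbers on that (a quick back-of-the-envelope sketch in Python, using only the figures above; the $400 for the 2060 Super is just $300 plus the ~$100 difference):

    # Rough perf-per-dollar comparison, RX 6650 XT vs. RTX 2060 Super,
    # using only the figures quoted above.
    old_price = 400.0      # 2060 Super: ~$300 + ~$100 more at the time
    new_price = 300.0      # 6650 XT post-crash price
    relative_perf = 1.20   # 6650 XT is "at least 20% faster"

    gain = (relative_perf / new_price) / (1.0 / old_price) - 1.0
    print(f"Perf-per-dollar improvement: {gain:.0%}")  # -> 60%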
 
That is because the card makers were chasing mining profits. What I was really referring to, though, was the 4-5-year-long cry that the time of APUs was nigh, that we just had to wait 12 more months or something. Where are the APUs that can run 1080p games effectively? The 5600G/5700G sort of can, but you'd likely be better off with the aforementioned 6400/6500 or a 3-5-year-old low-end dedicated card like a 1060 3/6GB or an RX 570/580.

The mining crash has so far led to $300 6650 XT cards, which are at least 20% faster than a 2060 Super... for about $100 less, 3 years later.
It's not a technical issue, it's a marketing one.

AMD is not interested in giving PCs PS5-level graphics (not even PS4-level), simply because it would hurt its Xbox/Sony business.
 
It's not a technical issue, it's a marketing one.

AMD is not interested in giving PCs PS5-level graphics (not even PS4-level), simply because it would hurt its Xbox/Sony business.

I was going to dispute this but I think you're right.
 
AMD is not interested in giving PCs PS5-level graphics (not even PS4-level), simply because it would hurt its Xbox/Sony business.
I was going to dispute this but I think you're right.

Hmm, it's a possibility.

It may also be possible that part of the contract with MS/Sony is that they won't manufacture a part that competes directly.

I can't imagine that AMD makes more money on a bulk contract than they would on retail sales; the margin would be much higher at retail... but then again, there's volume to consider, and AMD doesn't have the best OEM resale partners (not that Intel helps out there).
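
The margin-versus-volume trade-off is easy to sketch; the numbers below are purely hypothetical, just to illustrate how console volume could outweigh a fatter per-unit retail margin:

    # Purely hypothetical figures: a thin-margin bulk contract vs.
    # high-margin retail sales, to show how volume can flip the math.
    retail_margin, retail_units = 80.0, 1_000_000       # hypothetical
    console_margin, console_units = 15.0, 20_000_000    # hypothetical

    print(f"Retail profit:  ${retail_margin * retail_units:,.0f}")    # $80,000,000
    print(f"Console profit: ${console_margin * console_units:,.0f}")  # $300,000,000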
 
With regard to APUs, I see zero reason why they can't make a package with a CPU, GPU, and HBM of very significant muscle. I get it, the low-end dGPU segment used to be well served, but that is not the case any more. A big, fat APU with HBM would, I think, serve the market much better than the current sad state of affairs. AMD could do this; I think they could have for years now. I don't understand why they don't attempt it... Console non-compete, maybe?
 
I don't understand why they don't attempt it... Console non-compete, maybe?
Console non-compete is certainly a possibility, but I don't think that's it. I admit I have no knowledge of such a clause, or the lack of one, so I can't confirm it either way.

But what I suspect instead is AMD's lack of relationships with the OEMs. Most of those chips get sold via pre-built systems; folks like us are a niche market. AMD has lousy relationships with all the various OEM builders -- Dell, HP, Lenovo, etc.

Yeah, you can find ~some~ AMD models, but the vast, vast majority are Intel-based. Part of that is dirty pool on the part of Intel (and they have been caught and convicted of it in the past), but I can't say it's all of that. AMD needs to build up those relationships, get their chips in more pre-built systems, and it will go a long way.

Now - you don't ~have~ to have that avenue to produce a chip like this. But the HBM substrate is going to be pricey, and putting a part like that out into the enthusiast/DIY market does impact your lower-end discrete GPU strategy, which has kept AMD afloat for years. Trying to price it within all of those constraints and keep it profitable could be tricky.

But I bet if Dell or HP or Apple wanted to put out such a system, it would exist. Apple got Intel to do custom chips (Crystalwell), after all, and the console manufacturers got it, obviously. But it's a big risky bet to throw it out to the niche DIY audience that's mostly accustomed to going with a dGPU no matter what's on the CPU.
 
With DDR5, it starts to become possible to build a performant APU.

However, you run into several challenges, the first being price, of course. Not really from the perspective of manufacturing costs, but rather from having too low a volume to keep unit costs down. Low volume is also risky from a yields standpoint.

Then they'd have to make sure that the memory bandwidth is there - which means, more or less, having JEDEC standards to support the bandwidth, as well as memory controllers that can handle consumer DIMMs.
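
For a sense of scale, here's a rough peak-bandwidth estimate (a sketch only: it assumes a standard dual-channel, 128-bit-wide DDR5 setup and ignores real-world efficiency):

    # Peak theoretical bandwidth for dual-channel DDR5 (128-bit bus):
    # transfers per second times bytes per transfer.
    def ddr5_bandwidth_gbs(mt_per_s: int, bus_bits: int = 128) -> float:
        return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

    for speed in (4800, 5200, 6000, 6400):
        print(f"DDR5-{speed}: {ddr5_bandwidth_gbs(speed):6.1f} GB/s")
    # DDR5-6400 -> 102.4 GB/s. For comparison, even the RX 6500 XT's
    # 64-bit GDDR6 delivers ~144 GB/s, which is why a big APU would
    # want faster DIMMs or HBM to keep its iGPU fed.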

And then there's packaging, which brings the issue of heat. AMD has done itself no favors by maintaining 'compatibility' with AM4 coolers (even though backplates cannot be carried over to AM5): the LGA socket sits the CPU dies lower in the package, which necessitated a thicker IHS to preserve the cooler height, and now their cores run 'hot'.
 