Entire Family of AMD Ryzen 7000 Series Processors Could Feature Integrated Graphics

Tsing

The FPS Review
Staff member
Image: AMD



AMD’s entire family of Ryzen 7000 Series processors could feature integrated graphics. This is according to client roadmaps shared by r/AMD’s dudulab and Chiphell’s zhangzhonghao, which suggest that the Zen 4-based Ryzen 7000 desktop CPUs (codenamed “Raphael”) will feature higher-density, 6-nanometer I/O dies and Navi 2 graphics. This has fueled rumors of the red team adding integrated GPUs across the entire lineup of its next-generation chips. “Phoenix,” the Ryzen 7000 mobile variants, will also purportedly flaunt RDNA 3 graphics technology.

What’s new is the (apparent) addition of Navi graphics to Raphael, the Zen 4-based Ryzen 7000 desktop chips...

 
This could be huge. One advantage Intel has in corporate/offices is the integrated graphics with a desktop-class CPU, meaning they can cut the cost of a dedicated GPU, which is not needed for mainstream office work. If AMD can also offer desktop-class CPUs with integrated graphics, those companies have more options, and it could help AMD CPUs become more mainstream in environments where Intel more or less rules right now.

The other interesting thing from this graphic is that it predicts a Zen 3+ refresh on the 6N node. From what I understand, 6N supports the same IP as 7N; it's basically 7N with better density.
 
I personally have few usage scenarios where I would WANT an APU (my server would be one) but it's nice to have options in the frozen tundra wasteland of GPU scalper nightmares....

edit - yeah and also corp desktops like Brent said
 
I personally have few usage scenarios where I would WANT an APU (my server would be one) but it's nice to have options in the frozen tundra wasteland of GPU scalper nightmares....

edit - yeah and also corp desktops like Brent said

More Corp laptops than desktops. My Corp has gone away from 3-year refreshes, without pulling strings, to 4-to-5-year refreshes when parts fail.
 
This could be huge. One advantage Intel has in corporate/offices is the integrated graphics with a desktop-class CPU, meaning they can cut the cost of a dedicated GPU, which is not needed for mainstream office work. If AMD can also offer desktop-class CPUs with integrated graphics, those companies have more options, and it could help AMD CPUs become more mainstream in environments where Intel more or less rules right now.
AMD should technically already be there with eight-core APUs. I've seen desktops still in service running Dozer APUs too, and their biggest issue was the spinners and 8GB of RAM trying to run a modern corporate Windows image. On their own they weren't terrible!

The big advantage here, I think, is that the CPUs will now feature AMD GPU acceleration IP, in particular the transcoders and an ML block for whatever they come up with that winds up competing with RTX Voice / Studio, which is going to be necessary as Intel releases its own rumored version too.

I personally have few usage scenarios where I would WANT an APU (my server would be one) but it's nice to have options in the frozen tundra wasteland of GPU scalper nightmares....
This is the big one for me. I bought an IGP-less Intel part a few years back thinking 'eh, doesn't need the GPU', and I've regretted it since for a number of reasons. Granted I'm an enthusiast so every part is always under consideration for change; for a single-purpose system, this is less of a 'need'.

More Corp laptops than desktops. My Corp has gone away from 3-year refreshes, without pulling strings, to 4-to-5-year refreshes when parts fail.
As above, AMD should already have this covered; though assuming @Brent_Justice's comment about the smaller node being in use is accurate, we could easily see 12- and 16-core CPUs hitting mid-sized laptops. Or more, if there's a TDP ceiling on the table, meaning clock speeds would be limited a bit.

I'm sure AMD could get such a CPU in a laptop like my XPS 15 if no discrete GPU were present; it already runs this Skylake-derived 14nm CPU just fine!

 
Ideally this will eliminate the need for low end graphics.
In general, that's been eliminated. Intel's IGPs have been 'enough' for well over a decade, and AMD's have been better than that for longer.

And if these IP blocks stay enabled for Epyc CPUs? Well, that'd be epic :).

Really. It'd be nice for everything, even servers, to have USB4 ports that also serve as DP / HDMI ports with the right cable, and for servers to have proper GPUs for those times when they're called for, even just for accelerating remote sessions (better compression / higher quality / more responsive) over long hauls, for multi-user desktop environments, and so on. Niche cases to be sure, but since it'd technically be built in, why not?
 
The only use for low-end graphics cards is so retailers can legitimately have something in stock for those who are truly desperate.
 
The only use for low-end graphics cards is so retailers can legitimately have something in stock for those who are truly desperate.
I bought a GT710, and I was truly desperate. I derived pure comedic joy from the simulflection of that purchase. I became the one I swore I'd never be!
 
AMD should technically already be there with eight-core APUs. I've seen desktops still in service running Dozer APUs too, and their biggest issue was the spinners and 8GB of RAM trying to run a modern corporate Windows image. On their own they weren't terrible!

Yeah, I was about to say, AMD already does. I run an AMD Ryzen APU driving 3x24" monitors natively from a USFF PC by a Tier 1 provider (HP). You don't get much more "corporate/offices is the integrated graphics with a desktop-class CPU" than that.
 
Why on earth would I want, or even need, an iGPU wasting die / package space on the next 5950 (or 7950)? I suppose, maybe if the GPU shortage lasts indefinitely and my 1080 Ti eventually fails and I can’t buy a replacement. Even then, I’ve got a 560 backup card that is still likely better than any iGPU they will include.

Let the APUs have the iGPU, leave it off everything else.
 
I am all for it. If they can get good performance AND add a GPU, count me in.
It would make a great Plex box / server.
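For the Plex box / server case, the whole win is handing transcodes to the iGPU's video engine instead of burning CPU cores on them. As a rough sketch only, assuming a Linux box with Mesa's VAAPI driver and an ffmpeg build that includes h264_vaapi (the render-node path and filenames below are placeholders, not anything from the leak), the kind of job a media server kicks off looks like:

```python
# Rough sketch: offload an H.264 transcode to the GPU's video block via
# ffmpeg's VAAPI path. Assumes Linux, Mesa VAAPI drivers, and an ffmpeg build
# with h264_vaapi; /dev/dri/renderD128 and the filenames are placeholders.
import subprocess

RENDER_NODE = "/dev/dri/renderD128"  # first render node on most single-GPU boxes

cmd = [
    "ffmpeg",
    "-vaapi_device", RENDER_NODE,    # open the GPU's render node for VAAPI
    "-i", "input.mkv",               # source file (placeholder)
    "-vf", "format=nv12,hwupload",   # convert and upload frames to GPU memory
    "-c:v", "h264_vaapi",            # encode on the video block, not the CPU
    "-b:v", "8M",                    # ballpark bitrate for a 1080p stream
    "-c:a", "copy",                  # pass the audio through untouched
    "output.mp4",
]
subprocess.run(cmd, check=True)
```

At that point the CPU is mostly just shuffling container data, which is exactly what you'd want out of a little always-present video engine.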
 
Let the APUs have the iGPU, leave it off everything else.
Eh, they don't take up that much space in a minimal implementation, which is what I'd hope this is. It's not about winning a race; it's about providing a common set of base functionality.
 
Eh, they don't take up that much space in a minimal implementation, which is what I'd hope this is. It's not about winning a race; it's about providing a common set of base functionality.
But that base set of functionality is literally worse than leaving it off. It takes up die space and some minimal amount of power for something nearly every person using the chip will turn off.
 
something nearly every person using the chip will turn off.
But will they?

I get not using the video outputs constantly. But the transcoding blocks? Some ML acceleration to clean up VoIP noise in real time with no CPU load, or some other local signal processing? Literally every USB port being a hardware-accelerated video output?

No, it's not needed, but there are enough use cases to argue for the inclusion, or rather, against the wholesale exclusion of a 'GPU' block on every CPU.
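And if the block really is on every part, software can just probe for it and quietly fall back to the CPU when it isn't there, which is the "common base functionality" argument in practice. A hedged, Linux-flavored sketch (the render-node path and encoder names are my assumptions, not anything confirmed):

```python
# Rough sketch: prefer a hardware video encoder when the system exposes a GPU
# render node, otherwise fall back to a CPU encoder. Paths and encoder names
# here are assumptions for illustration, not a documented API.
import os
import shutil

def pick_encoder() -> str:
    """Return an ffmpeg encoder name based on what the machine exposes."""
    dri = "/dev/dri"
    has_render_node = os.path.isdir(dri) and any(
        name.startswith("renderD") for name in os.listdir(dri)
    )
    if has_render_node and shutil.which("ffmpeg"):
        return "h264_vaapi"  # use the iGPU/dGPU video block
    return "libx264"         # CPU fallback: works everywhere, costs cores

if __name__ == "__main__":
    print("Selected encoder:", pick_encoder())
```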
 
But will they?

I get not using the video outputs constantly. But the transcoding blocks? Some ML acceleration to clean up VoIP noise in real time with no CPU load, or some other local signal processing? Literally every USB port being a hardware-accelerated video output?

No, it's not needed, but there are enough use cases to argue for the inclusion, or rather, against the wholesale exclusion of a 'GPU' block on every CPU.
My assumption is that the target audience of the X950 CPUs is enthusiasts who will already have a video card.

Professionals that need CPU or bandwidth will opt for Threadripper. Bang-for-the-buck shoppers will go for the X900 series, but will probably also be running a midrange video card.

I just don’t know who the user is that wants 16 cores but also wants cheap integrated video.
 
My assumption is that the target audience of the X950 CPUs is enthusiasts who will already have a video card.
Why?
Professionals that need CPU or bandwidth will opt for Threadripper. Bang-for-the-buck shoppers will go for the X900 series, but will probably also be running a midrange video card.
Again, why?
I just don’t know who the user is that wants 16 cores but also wants cheap integrated video.
Well, it's integrated, but why does it have to be 'cheap'?

Hard to count the number of i7 systems I find in corporate desktops / laptops that have no discrete GPU. Many are to the point of being little larger than NUCs, supporting half-height, half-length cards if anything at all.

And the way I approach the question is this: are there workloads that use tons of CPU but don't need graphics capability beyond supporting desktop output?

The answer to that is very clearly yes, and at every tier of CPU even.

Granted, most folks don't actually need a whole lot of CPU grunt. See the success of Apple's M1; the CPU is quite inferior as a 'CPU', but as Apple and many others long realized, most of what consumers use CPUs for is actually to fill in for the lack of affordable fixed-function hardware for whatever it is they're trying to do beyond writing an email or entering data into a spreadsheet. Welp, the M1 does all that well, and nearly anything else that a consumer needs to do, in 1/4 the power envelope (or far less, in the case of video editing!).

Point being, if the workload actually calls for a CPU, then a CPU is needed. Not a GPU. So why make folks buy trash GPUs just for video output? Ever? Why bother making them at all? Why not just make sure that every CPU block (whatever the smallest discrete die is in a CPU package) has the ability to run a few displays out of the USB4 ports?

Realistically, 'CPUs' should become a thing of the past. They should just be 'processors'. Whether full SoC as seen in the 'end user' class or just a processing block on a larger package as seen in the workstation- and server-class, a few video cores that can handle output, streaming, cleaning up signals with ML, and perhaps run a low-end or old game aren't going to hurt. If they're ubiquitous, they might even start seeing more use and more standardization!
 