AMD’s “Zen 4” Ryzen Processors Will Feature Integrated Graphics

Tsing

The FPS Review
[AMD Ryzen chip render. Image: AMD]



AMD users who plan to build a new system around the company’s next generation of Ryzen chips may not necessarily have to worry about coupling it with a dedicated graphics solution.



Chips and Cheese has shared documents that suggest all upcoming Ryzen processors based on the Zen 4 architecture will feature iGPUs. (See the “on-chip graphics” row in the compatibility table, which indicates integrated graphics.)



It isn’t clear whether these iGPUs will actually be enabled across all models, but the news is good for Ryzen users who only need modest graphics performance.



The documents stem from the recent GIGABYTE leak, which has revealed plenty of other interesting information such as the specifications for AMD’s upcoming Ryzen Threadripper processors.



A new block diagram shared by Chips and Cheese has created...



 
Hmm..

I thought there was already a lineup with integrated graphics; they just called it an APU.

What's the difference here?
 
Just what I don’t need - integrated graphics on a high-performance CPU. If I want basic graphics, I’ll toss in a 710.

How about using that die space for extra cache instead?
 
Well, these days it will let you at least use the system while you wait your turn in the Newegg shuffle.
lol! That said, I don’t think the 710 ever went out of stock anywhere. Even the RX 560 and GT 1030 were almost always in stock when I was shopping back in Jan.
 
Hmm..

I thought there was already a lineup with integrated graphics; they just called it an APU.

What's the difference here?
An APU should have more grunt, at least in my imagination of what it's supposed to mean. Sadly, Intel has done that better, and mobile SoCs have really embodied the form more than anything else.

A lot of that comes down to software integration, though. AMD continues to make strides here so these may have more 'impact' than we might otherwise expect!

Just what I don’t need - integrated graphics on a high-performance CPU.
If they follow what Intel has done, the highest (and lowest) CPU brackets will have the more limited implementations. And with AMD's chiplet approach, that may mean they can use different dies with different balances of, say, CUs and cache, if I'm correct in assuming that they'd situate the graphics logic on the cache / uncore die.

Outside of that, I'm betting that the lack of integrated graphics by design has hurt them a bit, and the GPU market has likely exacerbated the effect. I'll also venture to say that the graphics logic AMD includes could easily have more features than your average low-end GPU, looking in particular toward things like video encoding and decoding performance: stuff that end users are likely to do that just isn't efficient on CPU cores.
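To make that concrete, here's a rough sketch of what handing a transcode to an iGPU's video block could look like (untested; assumes a Linux box with ffmpeg built against VAAPI, and the render node path is just a common default, not a given):

```python
# Untested sketch: offload an H.264 transcode to the iGPU's fixed-function
# video hardware via VA-API, instead of burning CPU cores on it.
# Assumes ffmpeg was built with VAAPI support; /dev/dri/renderD128 is a common
# default for the iGPU's render node but may differ on your system.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "vaapi",                       # decode on the GPU
    "-hwaccel_device", "/dev/dri/renderD128",  # iGPU render node (check yours)
    "-hwaccel_output_format", "vaapi",         # keep frames in GPU memory
    "-i", "input.mkv",                         # hypothetical input file
    "-c:v", "h264_vaapi",                      # encode on the GPU as well
    "output.mp4",
]
subprocess.run(cmd, check=True)
```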
 
At this point it makes sense. The graphics part is probably a small part of the die.
 
At this point it makes sense. The graphics part is probably a small part of the die.

Well, this is an Ice Lake CPU. I wouldn't call the IGP small exactly.
For AMD though - they can use one (or two) chiplets for graphics, and they don't really need to change their CPU chiplet die at all. Which is what I thought they were doing with APUs.

[Annotated die shot of a quad-core Ice Lake CPU]
 
Well, this is an Ice Lake CPU. I wouldn't call the IGP small exactly.
For AMD though - they can use one (or two) chiplets for graphics, and they don't really need to change their CPU chiplet die at all. Which is what I thought they were doing with APUs.

[Annotated die shot of a quad-core Ice Lake CPU]
Now just think how much better Ice Lake would have been had they dedicated that whole GPU block to L3 cache…
 
AMD is feeling pressure from Intel, which is serious about its iGPU strategy. If AMD doesn't reply, it may have to play catch-up in the near future.
 
AMD is feeling pressure from Intel, which is serious about its iGPU strategy. If AMD doesn't reply, it may have to play catch-up in the near future.
Well, thinking about it strategically --

People who buy an APU (or a CPU with an IGP - not sure there is a distinction) and plan on using the video probably aren't buying a dGPU anytime soon, and AMD is very much in the business of wanting to sell you both.

The market for IGP/APUs has mostly been mobile and business devices - low-margin commodity markets, where you can really only make any money with large volumes - and AMD isn't going to get that overnight.

And Intel really has only been pushing IGP in the past... decade+ because it didn't have a strong dGPU product to push, and IGP allowed it to gain in those market segments and get established. I think IGP is fine, so long as I never ~need~ to use it for anything serious.
 
I'm torn. On one hand, a basic functioning 2D desktop would be nice for stuff where you just need basic video. OTOH, I wouldn't want to give up die space that could be used for better performance, like L3 cache etc.

I miss the old AM3 motherboards that had built-in crappy graphics. You couldn't game on them, but they were perfectly fine to build on or run a server with.
 
I'm torn. On one hand, a basic functioning 2D desktop would be nice for stuff where you just need basic video. OTOH, I wouldn't want to give up die space that could be used for better performance, like L3 cache etc.

I miss the old AM3 motherboards that had built-in crappy graphics. You couldn't game on them, but they were perfectly fine to build on or run a server with.
I kinda agree. Why can't we get video pushed back out onto the chipset controller? I get that it will suck donkey balls since it will be removed from pretty much everything... But we aren't exactly wanting it to drive games - it's just to drive a basic desktop and base-level functionality.
 
I kinda agree. Why can't we get video pushed back out onto the chipset controller? I get that it will suck donkey balls since it will be removed from pretty much everything... But we aren't exactly wanting it to drive games - it's just to drive a basic desktop and base-level functionality.
In this case, just use a GeForce 710 or equivalent. I’m sure there are scenarios where you would really like to use the PCIe slot for something else, but a 1x riser should be fine for basic graphics.

Edit to say: I’m also the guy that is annoyed at the Raspberry Pi for including graphics hardware that I just turn off because I never use it. I’d rather have 8 cores and no iGPU... so I may not be the best person to find value in the iGPU.
 
In this case, just use a GeForce 710 or equivalent. I’m sure there are scenarios where you would really like to use the PCIe slot for something else, but a 1x riser should be fine for basic graphics.
No, its value is in not needing to plug in a card at all - mostly troubleshooting, or Linux boxes that sit at the CLI anyway, or things like that.

Sure, you could use a bottom-tier discrete card, but if there's already some rudimentary video available - why even do that?

As for the RPi - if it didn't have a GPU at all, your only option would be to SSH into the unit; it wouldn't have any local video output capability at all (I guess someone could invent a GPIO GPU of sorts, but it would be complicated) - and that's exactly the type of capability I'm saying is nice to have here. I wouldn't call it essential, but nice in those situations where even a 710 is pretty much overkill.
 
No, its value is in not needing to plug in a card at all - mostly troubleshooting, or Linux boxes that sit at the CLI anyway, or things like that.

Sure, you could use a bottom-tier discrete card, but if there's already some rudimentary video available - why even do that?

As for the RPi - if it didn't have a GPU at all, your only option would be to SSH into the unit; it wouldn't have any local video output capability at all (I guess someone could invent a GPIO GPU of sorts, but it would be complicated) - and that's exactly the type of capability I'm saying is nice to have here. I wouldn't call it essential, but nice in those situations where even a 710 is pretty much overkill.

Yeah, and for those that use a Pi as an emulator, which is quite a few people, having no graphics output would be kind of a problem.
 
No, its value is in not needing to plug in a card at all - mostly troubleshooting, or Linux boxes that sit at the CLI anyway, or things like that.

Sure, you could use a bottom-tier discrete card, but if there's already some rudimentary video available - why even do that?

As for the RPi - if it didn't have a GPU at all, your only option would be to SSH into the unit; it wouldn't have any local video output capability at all (I guess someone could invent a GPIO GPU of sorts, but it would be complicated) - and that's exactly the type of capability I'm saying is nice to have here. I wouldn't call it essential, but nice in those situations where even a 710 is pretty much overkill.
I can use one card for multiple hosts. I can put my 710 in my FreeNAS box tomorrow if I needed the minimal interface it has, and then the day after I could put it in my pfSense box, etc. For the once every 3 years I might need a console, I can toss in a card. The benefit I would have is higher continuous performance during those 3 years where I didn’t need a GPU at all.

As for the Pi, I have a number of Pi 4s at the house - around 20 of them. My Kodi boxes are the only 3 that have ever actually used the video output. The rest just get SSH’d to, including immediately after setup.
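For what it's worth, checking in on the headless ones looks something like this (rough, untested sketch; assumes the paramiko library is installed and key-based auth is already set up - the hostname and user are made up):

```python
# Rough sketch: poking at a headless Pi over SSH instead of ever touching its
# video output. Assumes paramiko is installed and key-based auth is already
# configured; the hostname and username below are made-up examples.
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("pi4-node-01.local", username="pi")  # hypothetical host/user

_, stdout, _ = client.exec_command("uptime")
print(stdout.read().decode().strip())
client.close()
```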
 
I can use one card for multiple hosts. I can put my 710 in my FreeNAS box tomorrow if I needed the minimal interface it has, and then the day after I could put it in my pfSense box, etc. For the once every 3 years I might need a console, I can toss in a card. The benefit I would have is higher continuous performance during those 3 years where I didn’t need a GPU at all.

As for the Pi, I have a number of Pi 4s at the house - around 20 of them. My Kodi boxes are the only 3 that have ever actually used the video output. The rest just get SSH’d to, including immediately after setup.
Well, the other part of my statement was putting the GPU in the chipset controller…

But to each their own. I’d rather not juggle cards at all if I can help it.
 
As long as I get the performance I want and the capabilities I will use, at no real additional cost... I don't care?
 