Entire Family of AMD Ryzen 7000 Series Processors Could Feature Integrated Graphics

Why?

Again, why?

Well, it's integrated, but why does it have to be 'cheap'?

Hard to count the number of i7 systems I find in corporate desktops / laptops that have no GPU. Many are to the point of being little larger than NUCs, supporting half-height, half-length cards if anything at all.

And the way I approach the question is this: are there workloads that use tons of CPU but don't need graphics capability beyond supporting desktop output?

The answer to that is very clearly yes, and at every tier of CPU even.

Granted most folks don't actually need a whole lot of CPU grunt. See the success of Apple's M1; the CPU is quite inferior as a 'CPU', but as Apple and many others long realized, most of what consumers use CPUs for is actually to fill in for the lack of affordable fixed-function hardware for whatever it is they're trying to do beyond writing an email or entering data into a spreadsheet. Welp, the M1 does all that well, and nearly anything else that a consumer needs to do, in 1/4 the power envelope (or far less, in the case of video editing!).

Point being, if the workload actually calls for a CPU, then a CPU is needed. Not a GPU. So why make folks buy trash GPUs just for video output? Ever? Why bother making them at all? Why not just make sure that every CPU block (whatever the smallest discrete die is in a CPU package) has the ability to run a few displays out of the USB4 ports?

Realistically, 'CPUs' should become a thing of the past. They should just be 'processors'. Whether full SoC as seen in the 'end user' class or just a processing block on a larger package as seen in the workstation- and server-class, a few video cores that can handle output, streaming, cleaning up signals with ML, and perhaps run a low-end or old game aren't going to hurt. If they're ubiquitous, they might even start seeing more use and more standardization!
The business desktop use case is covered by the APU - AMD has a lot of those. On the Intel business side it’s i3s and i5s. In 18 years of working for Fortune 100s it was only recently that I got to a high enough level to be able to order higher end hardware.

I think we can rule out the 5900 and 5950 from ever being used as a desktop in data entry or call center positions.

If we want to talk programmers, sure, we have a use case for more compute power. For example, earlier today at work I needed to generate 10 billion data points for a test - I just ssh’d to one of my 32 core / 64 thread Rome-based VMs and launched my test. If I had a specific reason to run it locally, I would just fill out a form requesting a Xeon or a Threadripper 3990X, but I really have no justification for it - the 32 core VMs work just fine for what I need, and the Threadrippers come with a 3090 assuming you’re going to do some local ML work.
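
For what it's worth, that sort of bulk data generation parallelizes trivially across however many cores the VM exposes. A minimal sketch in Python, assuming the points can be generated independently; the chunk size, value format, and output paths are placeholders, not the actual test harness:

```python
# Rough sketch: generate N synthetic data points across all hardware threads.
# The point count matches the "10 billion" above; everything else is hypothetical.
import os
import random
from multiprocessing import Pool

TOTAL_POINTS = 10_000_000_000   # 10 billion data points
CHUNK = 10_000_000              # points generated per worker task

def generate_chunk(chunk_id):
    path = f"/tmp/datapoints_{chunk_id:05d}.csv"   # placeholder output location
    rng = random.Random(chunk_id)                  # per-chunk seed, reproducible
    with open(path, "w") as f:
        for _ in range(CHUNK):
            f.write(f"{rng.random()}\n")
    return path

if __name__ == "__main__":
    # One worker per hardware thread; on a 32 core / 64 thread Rome VM that's 64.
    with Pool(os.cpu_count()) as pool:
        for _ in pool.imap_unordered(generate_chunk, range(TOTAL_POINTS // CHUNK)):
            pass  # each chunk lands on disk as it finishes
```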

So if most business cases are handled by an APU or using cloud-based VMs, who is using the 5950? If it’s an enthusiast gaming, doing video work or ML, they are going to have a video card. So what’s left? Who is seeking out a 5800/5900/5950 over an APU that would use integrated video if it were included?
 
The business desktop use case is covered by the APU - AMD has a lot of those. On the Intel business side it’s i3s and i5s.
These are covered by whatever Dell / HP / Lenovo have too much of at the time of order, really. I've seen them all and with no apparent reason for the difference in configuration. And they all perform the same.

Funny thing is, for actual business desktop use, they all have overpowered GPUs!
In 18 years of working for Fortune 100s it was only recently that I got to a high enough level to be able to order higher end hardware.
If I asked one of our PMs over the last decade what 'higher end' meant, they'd come up with something along the lines of whatever Xeon workstation the vendor of the month was peddling.

Literally nothing to do with what's inside, which is why we have i7s with 16GB of RAM booting off of spinners. PMs have never had a clue. Neither do software engineers really, which I find absolutely hysterical.

They all just order what 'sounds good'.
If we want to talk programmers, sure, we have a use case for more compute power. For example, earlier today at work I needed to generate 10 billion data points for a test - I just ssh’d to one of my 32 core / 64 thread Rome-based VMs and launched my test. If I had a specific reason to run it locally, I would just fill out a form requesting a Xeon or a Threadripper 3990X, but I really have no justification for it - the 32 core VMs work just fine for what I need,
32-core VMs should, but that's domain specific.
and the Threadrippers come with a 3090 assuming you’re going to do some local ML work.
Which is... wild. Because those GPUs don't come cheap and, unlike CPUs, require specially-formatted workloads to be of any use, which means that unless they're part of a farm with a dedicated workload, they're mostly just idling. Which is kind of the overall point: why buy GPUs for everything when the most common need is just video output?
So if most business cases are handled by an APU or using cloud-based VMs, who is using the 5950? If it’s an enthusiast gaming, doing video work or ML, they are going to have a video card.
Gaming is literally the only straight consumer / professional workload that could require more CPU. It doesn't really, outside of sloppy / poorly optimized code, though that's probably not going away until IDEs get significantly smarter.

Doing video work on a GPU is more a function of convenience; GPUs are not the right tool for the job, they've just been the most accessible tool thus far.

A big part of that specific use case has been that the codecs have evolved rapidly in the last decade. A bigger part is that today that evolution is more or less at a standstill, given that there's very little further utility to be gained by upping the resolution of output devices (TVs, monitors, phones...). The codecs are now 'good enough', which means that the fixed-function hardware available is also good enough. See the Apple M1 running circles around less optimized hardware with many times the power draw. By many, I mean 50x or more (what's a TR + 3090 draw under load?).
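
As a back-of-the-envelope check on that multiplier (rated TDPs, not measured draw under this workload, and the M1 figure is a rough assumption):

```python
# Rough sanity check of the "50x or more" claim above, using rated power
# figures rather than measured draw under a specific workload.
threadripper_w = 280   # Threadripper 3990X rated TDP
rtx_3090_w = 350       # RTX 3090 rated board power
m1_w = 12              # rough assumption for M1 package power during accelerated video encode

ratio = (threadripper_w + rtx_3090_w) / m1_w
print(f"~{ratio:.0f}x")   # roughly 50x
```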
So what’s left? Who is seeking out a 5800/5900/5950 over an APU that would use integrated video if it were included?
Well, by your own admission, you :)

If you do need actual CPU grunt, and that implies a workload that's either novel or so uncommon as to not be worth building custom circuitry for (anything from AVX up to a GPU-sized ASIC), then you're the target market. Even as a gamer, very little of what I do outside of gaming actually needs any real GPU grunt at all. As above, even the APUs and IGPs available today are massively overpowered for what we're typically using our computers for. CPUs too, really.

Not that I'd advocate 'going backward', but one way to think about it is this: the next iteration of the Raspberry Pi will almost certainly be fast enough for common desktop usage, business or otherwise. The current Pi 4 is almost enough, and its biggest hurdles have been software support for the hardware!
 
Well, by your own admission, you :)

If you do need actual CPU grunt, and that implies a workload that's either novel or so uncommon as to not be worth building custom circuitry for (anything from AVX up to a GPU-sized ASIC), then you're the target market. Even as a gamer, very little of what I do outside of gaming actually needs any real GPU grunt at all. As above, even the APUs and IGPs available today are massively overpowered for what we're typically using our computers for. CPUs too, really.

Not that I'd advocate 'going backward', but one way to think about it is this: the next iteration of the Raspberry Pi will almost certainly be fast enough for common desktop usage, business or otherwise. The current Pi 4 is almost enough, and its biggest hurdles have been software support for the hardware!
Huge snip - if I were writing from a desktop I’d quote point by point.

I’m not the target audience of a GPU-integrated 5950 - I have discrete GPUs in all my systems and would just turn off the integrated video. When I’m not gaming on my top-end systems they are running BOINC or Folding@home, and from last year’s experience, you don’t want to run F@H on an APU GPU. My A10-7870K burned about 300 watts with 4 Rosetta@home BOINC tasks and the integrated GPU running Folding@home. My 1080 Ti is orders of magnitude more efficient in points per watt. In eco mode, the 5950 is more power efficient in points per watt than my power-optimized Raspberry Pi 4s as well.

For “high end” hardware, I now have the option of ordering a Mac. I can either order a ~$750 laptop that is basically a lottery from the vendor of the week, or a MacBook Pro that is basically equipped with the highest-end option offered on the MacBook Pro across the board. No middle ground or anything like that - just choose between the two. It’s kind of funny in a not so amusing way.
 
I’m not the target audience of a GPU-integrated 5950 - I have discrete GPUs in all my systems and would just turn off the integrated video. When I’m not gaming on my top-end systems they are running BOINC or Folding@home
You specifically aren't, but you have to admit you're a bit of an edge case there!
and from last year’s experience, you don’t want to run F@H on an APU GPU. My A10-7870K burned about 300 watts with 4 Rosetta@home BOINC tasks and the integrated GPU running Folding@home.
Those were inefficient when they were still on the drawing board...
My 1080 Ti is orders of magnitude more efficient in points per watt. In eco mode, the 5950 is more power efficient in points per watt than my power-optimized Raspberry Pi 4s as well.
By the gods I'd hope so; the Pi 4 is using some ancient technology!
For “high end” hardware, I now have the option of ordering a Mac. I can either order a ~$750 laptop that is basically a lottery from the vendor of the week, or a MacBook Pro that is basically equipped with the highest-end option offered on the MacBook Pro across the board. No middle ground or anything like that - just choose between the two. It’s kind of funny in a not so amusing way.
Ah, management, and all the ways it's done poorly to the point of bordering on legal negligence.

I swear that there's an actual computing revolution just waiting to happen that will take much of the potential mismanagement seen today out of the loop. But as I've been waiting for about thirty years, I no longer find myself qualified to posit as to the immediacy of such a revolution :D
 
To deviate a bit off topic, a power-optimized Pi 4 (all LEDs off, NIC set to 100 Mbit, CPU voltage reduced as far as you can, HDMI / local video off, and a few other tweaks) on a trimmed version of Raspbian Lite is more power efficient in points per watt than basically any Intel processor I’ve tested (9900K, 7600K, 6100, and older junk), and they are more power efficient than a 2700X. The biggest problem / weakness is that 5V power bricks are fairly inefficient - good luck finding anything better than 75% efficient. Additionally, you need a dozen or more Pi 4s to generate the points of the newer systems, making management an issue.
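
Putting that comparison into a formula: points per watt only means much if it's measured at the wall, so the power-brick penalty has to be folded in. A minimal sketch, where the 75% supply efficiency is the figure from the post above and the PPD and wattage numbers are purely hypothetical placeholders:

```python
# Points per watt at the wall, folding in PSU / power-brick efficiency.
# Only the 0.75 brick efficiency comes from the discussion; the PPD and
# DC wattage figures below are made-up placeholders for illustration.
def points_per_wall_watt(points_per_day: float, dc_watts: float, supply_efficiency: float) -> float:
    wall_watts = dc_watts / supply_efficiency   # what the outlet actually has to deliver
    return points_per_day / wall_watts

pi4 = points_per_wall_watt(points_per_day=5_000, dc_watts=4.0, supply_efficiency=0.75)
desktop = points_per_wall_watt(points_per_day=60_000, dc_watts=100.0, supply_efficiency=0.90)
print(f"Pi 4: {pi4:.0f} pts/W at the wall, desktop: {desktop:.0f} pts/W at the wall")
```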
 
Corporate desktops, small form factor PCs, multi-monitor output controllers, etc.
Corporate desktops are fine with 2 cores - hyper-threading optional. I shared an office with the PC repair guy back in 2015 - for accounting, sales, HR and the call center people, they decided to stop buying new desktops in 2009 and stuck with the Core 2 Duos they bought in 2006. In 2010-11 he started upgrading them to 16GB of RAM with a small SSD. They were still using them in 2018, and I wouldn’t be shocked if they were still in use today.

What on earth would you need 16 cores for, for Outlook, Word and Excel?

For workstations that needed multiple monitors, he was using some HP USB breakout box - it had a DVI and a DisplayPort for connectivity, and you could use both at once, yielding a 3-monitor setup.
 
Corporate desktops are fine with 2 cores - hyper-threading optional. I shared an office with the PC repair guy back in 2015 - for accounting, sales, HR and the call center people, they decided to stop buying new desktops in 2009 and stuck with the Core 2 Duos they bought in 2006. In 2010-11 he started upgrading them to 16GB of RAM with a small SSD. They were still using them in 2018, and I wouldn’t be shocked if they were still in use today.

What on earth would you need 16 cores for, for Outlook, Word and Excel?

Our users rarely have a single program running. Sure, a dual-core system CAN run Excel today. But not well. Even beyond that, when they have Outlook open, Chrome/Edge with a dozen tabs, a Teams meeting running, two or three Excel files, a Word doc, a couple of PPT files, and SPSS/SAS, even today's quad and hexa cores drag.
 
Our users rarely have a single program running. Sure, a dual-core system CAN run Excel today. But not well. Even beyond that, when they have Outlook open, Chrome/Edge with a dozen tabs, a Teams meeting running, two or three Excel files, a Word doc, a couple of PPT files, and SPSS/SAS, even today's quad and hexa cores drag.
That sounds like too little RAM as opposed to too little CPU.
 
Our default configuration deployed right now is an i5-10600K with 32GB RAM and a 512GB SSD. The i5 is choking.
Sounds like something is wrong with the image? Doesn’t match my experience at all. Zoom, Word, Excel, IntelliJ, SQL Developer, PowerPoint, a dozen SSH sessions, Chrome, Firefox, and CrashPlan all running is like 30% CPU.
 
Sounds like something is wrong with the image? Doesn’t match my experience at all. Zoom, Word, Excel, IntelliJ, SQL Developer, PowerPoint, a dozen SSH sessions, Chrome, Firefox, and CrashPlan all running is like 30% CPU.

I can watch SPSS go higher than that. LOL.
 
I just checked with my buddy back at the last job. In 2019 they replaced all the Core 2 Duos with i5-9400s w/ 16GB RAM and 256GB SSDs. He said it’s a total waste, as they average about 2% CPU.
 
I just checked with my buddy back at the last job. In 2019 they replaced all the Core 2 Duos with i5-9400s w/ 16GB RAM and 256GB SSDs. He said it’s a total waste, as they average about 2% CPU.

Just goes to show that different people have different use cases. My office PC would benefit greatly from a 7000-series 16-core chip and the associated infrastructure that comes with it.
 
Just goes to show that different people have different use cases. My office PC would benefit greatly from a 7000-series 16-core chip and the associated infrastructure that comes with it.
What do you think the percentage of business use cases is that could use that much power locally and wouldn’t also have a video card? Once you account for “could run on a Core 2 Duo” roles like secretary, data entry, call center, accounting, HR, and manager usage, we are probably talking 75%+ of the market? Another 20% that needs high-power machines can probably more cost-effectively run a Core 2 Duo and then RDP or SSH to a VM on a dense hypervisor. That leaves a market slice of probably less than 5%?
 
What do you think the percentage of business use cases is that could use that much power locally and wouldn’t also have a video card? Once you account for “could run on a Core 2 Duo” roles like secretary, data entry, call center, accounting, HR, and manager usage, we are probably talking 75%+ of the market? Another 20% that needs high-power machines can probably more cost-effectively run a Core 2 Duo and then RDP or SSH to a VM on a dense hypervisor. That leaves a market slice of probably less than 5%?

Most in fact don't require anything better than an APU. Looking at our org, they do benefit from higher core counts and more system memory, but the last thing they need is a dedicated GPU. We are not that unusual either, and we have ~8,000 systems at this one location out of 5 locations.
 
I just checked with my buddy back at the last job. In 2019 they replaced all the Core 2 Duos with i5-9400s w/ 16GB RAM and 256GB SSDs. He said it’s a total waste, as they average about 2% CPU.
Those are going to last another five to ten years though, and there's very little reason to think that they won't continue to perform (beyond hardware failures). Not a bad investment really, since parts seem to wear out less and less; at worst you'd see power supplies degrade or fail, or fans fail, and those are all generally easy fixes.
What do you think the percentage of business use cases is that could use that much power locally and wouldn’t also have a video card?
Same question again, perhaps rephrased: why does everyone need GPU grunt? CPU-intensive workloads and GPU-intensive workloads don't have much crossover. Yeah, I'm not going to argue against the point that such systems would probably come with a GPU, but it's not because they're actually needed; it's either a logistical reason, a contractual reason, or a mismanagement reason as mentioned above :)
 
Those are going to last another five to ten years though, and there's very little reason to think that they won't continue to perform (beyond hardware failures). Not a bad investment really, since parts seem to wear out less and less; at worst you'd see power supplies degrade or fail, or fans fail, and those are all generally easy fixes.

Same question again, perhaps rephrased: why does everyone need GPU grunt? CPU-intensive workloads and GPU-intensive workloads don't have much crossover. Yeah, I'm not going to argue against the point that such systems would probably come with a GPU, but it's not because they're actually needed; it's either a logistical reason, a contractual reason, or a mismanagement reason as mentioned above :)
My postulation for the AMD stack:

The Venn diagram of “CPU load requires 8-16 unvirtualized cores” and “GPU workload exceeds integrated video” is a near-perfect circle.

If you need all the CPU you can get, and it has to be local, you’re probably thinking Threadripper 59X0. If you’re needing all the memory you can get, it’s Threadripper Pro time. In both those cases, I could see value in integrated video.

This is actually likely to continue for the Ryzen stack at the bottom end - if someone only needs 8 cores or fewer with minimal video, there is an APU for that.

The market segment that needs integrated video and more CPU than the top-end APU, but less CPU than the bottom-tier Threadripper, has to be so small it must be near zero.
 