Endgame
The business desktop use case is covered by the APU - AMD has a lot of those. On the Intel business side it's i3s and i5s. In 18 years of working for Fortune 100s, it was only recently that I got to a high enough level to be able to order higher-end hardware.

Why?
Again, why?
Well, it's integrated, but why does it have to be 'cheap'?
Hard to count the number of i7 systems I find in corporate desktops / laptops that have no discrete GPU. Many are little larger than NUCs, supporting half-height, half-length cards if anything at all.
And the way I approach the question is this: are there workloads that use tons of CPU but don't need graphics capability beyond supporting desktop output?
The answer to that is very clearly yes, and at every tier of CPU even.
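To make that concrete, here's a minimal Python sketch (the function name and iteration counts are mine, purely illustrative) of a workload that will happily peg every core without ever touching a GPU - all the machine owes the user beyond this is basic desktop output:

```python
# Purely CPU-bound work: brute-force SHA-256 hashing fanned out
# across every core. No GPU is involved at any point.
import hashlib
import os
from multiprocessing import Pool

def hash_chain(seed: int) -> str:
    """Repeatedly hash a small buffer; pure integer/ALU work."""
    digest = seed.to_bytes(8, "little")
    for _ in range(1_000_000):
        digest = hashlib.sha256(digest).digest()
    return digest.hex()

if __name__ == "__main__":
    cores = os.cpu_count() or 4          # one worker per logical core
    with Pool(cores) as pool:
        results = pool.map(hash_chain, range(cores))
    print(results[0][:16])
```

Compiles, compression, CI runners, and simulation jobs all look like this from the scheduler's point of view.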
Granted, most folks don't actually need a whole lot of CPU grunt. See the success of Apple's M1: the CPU is quite inferior as a 'CPU', but as Apple and many others realized long ago, most of what consumers use CPUs for is actually filling in for the lack of affordable fixed-function hardware for whatever they're trying to do beyond writing an email or entering data into a spreadsheet. Welp, the M1 does all of that well, and nearly anything else a consumer needs to do, in 1/4 the power envelope (or far less, in the case of video editing!).
Point being, if the workload actually calls for a CPU, then a CPU is needed. Not a GPU. So why make folks buy trash GPUs just for video output? Ever? Why bother making them at all? Why not just make sure that every CPU block (whatever the smallest discrete die is in a CPU package) has the ability to run a few displays out of the USB4 ports?
Realistically, 'CPUs' should become a thing of the past. They should just be 'processors'. Whether full SoC as seen in the 'end user' class or just a processing block on a larger package as seen in the workstation- and server-class, a few video cores that can handle output, streaming, cleaning up signals with ML, and perhaps run a low-end or old game aren't going to hurt. If they're ubiquitous, they might even start seeing more use and more standardization!
I think we can rule out the 5900 and 5950 from ever being used as desktops in data entry or call center positions.
If we want to talk programmers, sure, there's a use case for more compute power. For example, earlier today at work I needed to generate 10 billion data points for a test - I just ssh'd into one of my 32-core / 64-thread Rome-based VMs and launched it. If I had a specific reason to run it locally, I would just fill out a form requesting a Xeon or a Threadripper 3990X, but I really have no justification for it - the 32-core VMs work just fine for what I need, and the Threadrippers come with a 3090, assuming you're going to do some local ML work.
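A job like that parallelizes trivially, by the way. Here's a rough Python sketch of the kind of generator I mean (the paths, shard layout, and data distribution are made up for illustration, not the actual test harness):

```python
# Rough sketch: generate a large synthetic dataset in parallel,
# one shard per worker. At 10 billion float64 values this is
# roughly 80 GB - scale NUM_POINTS down to actually try it.
import os
import numpy as np
from multiprocessing import Pool

NUM_POINTS = 10_000_000_000        # 10 billion points total
WORKERS = os.cpu_count() or 64     # e.g. 64 threads on a 32c/64t Rome VM
CHUNK = NUM_POINTS // WORKERS

def generate(worker_id: int) -> str:
    """Each worker writes its own shard with an independent RNG stream."""
    rng = np.random.default_rng(seed=worker_id)
    path = f"/tmp/points_{worker_id}.npy"  # hypothetical output location
    np.save(path, rng.standard_normal(CHUNK))
    return path

if __name__ == "__main__":
    with Pool(WORKERS) as pool:
        shards = pool.map(generate, range(WORKERS))
    print(f"wrote {len(shards)} shards")
```

Note it's all integer/FP and I/O throughput, not one frame of graphics - which is exactly why the VM doesn't need a GPU either.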
So if most business cases are handled by an APU or by cloud-based VMs, who is using the 5950? If it's an enthusiast gaming, doing video work, or doing ML, they're going to have a video card. So what's left? Who is seeking out a 5800/5900/5950 over an APU but would use integrated video if it were included?