My assumption is that the target audience of the X950 CPUs is enthusiasts who will already have a video card.
Why?
Professionals that need CPU cores or bandwidth will opt for Threadripper. Bang-for-the-buck shoppers will go for the X900 series, but will probably also be running a mid-range video card.
Again, why?
I just don’t know who the user is that wants 16 cores but also wants cheap integrated video.
Well, it's integrated, but why does it have to be 'cheap'?
Hard to count the number of i7 systems I find in corporate desktops and laptops that have no discrete GPU. Many are to the point of being little larger than NUCs, supporting half-height, half-length cards if anything at all.
And the way I approach the question is this: are there workloads that use tons of CPU but don't need graphics capability beyond supporting desktop output?
The answer to that is very clearly yes, and at every tier of CPU even.
Granted, most folks don't actually need a whole lot of CPU grunt. See the success of Apple's M1: the CPU is quite inferior as a 'CPU', but as Apple and many others long realized, most of what consumers use CPUs for is actually filling in for the lack of affordable fixed-function hardware for whatever they're trying to do beyond writing an email or entering data into a spreadsheet. Welp, the M1 does all that well, and nearly anything else a consumer needs to do, in 1/4 the power envelope (or far less, in the case of video editing!).
Point being, if the workload actually calls for a CPU, then a CPU is needed. Not a GPU. So why make folks buy trash GPUs just for video output? Ever? Why bother making them at all? Why not just make sure that every CPU block (whatever the smallest discrete die is in a CPU package) has the ability to run a few displays out of the USB4 ports?
Realistically, 'CPUs' should become a thing of the past. They should just be 'processors'. Whether a full SoC, as seen in the 'end user' class, or just a processing block on a larger package, as seen in the workstation and server classes, a few video cores that can handle output, streaming, cleaning up signals with ML, and perhaps running a low-end or older game aren't going to hurt. If they're ubiquitous, they might even start seeing more use and more standardization!