Right now, many segments of computing, GPU computing included, are in their turbocharged big-block era. A lot more effort needs to go into efficiency. But I guess when these segments just keep chewing through processing power, you keep chasing more of one than the other.
They're not necessarily inefficient. The target CPUs are likely the big server packages with many individual compute dies, each of which is more efficient at the tasks it was designed for. Intel is putting discrete application accelerators on separate dies inside their CPUs, alongside the compute dies, which is more efficient than placing them in separate sockets or across the PCIe bus on an expansion card.
What this means is that Intel is scaling up and scaling out effectively, more so than getting less efficient.
2000-watt PCs. Ten years ago I would have said never.
2000-watt servers... those I've already seen.
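For a sense of scale, here is a quick back-of-envelope sketch of what running a 2000-watt machine around the clock would cost. The electricity price ($0.15/kWh) is an assumed figure for illustration, not from the thread:

```python
# Back-of-envelope: annual energy use and cost of a 2000 W machine
# running 24/7. The $0.15/kWh price is an assumption.
power_w = 2000
price_per_kwh = 0.15
hours_per_year = 24 * 365

kwh_per_year = power_w / 1000 * hours_per_year  # watts -> kilowatt-hours
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.0f} kWh/year, ~${cost_per_year:.0f}/year")
# -> 17520 kWh/year, ~$2628/year
```

At that rate the electricity bill alone rivals the hardware cost, which is part of why these power budgets only make sense in server rooms today.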
Exactly
When I think about '2000W CPUs', I tend to start looking at what the future of computing might be. One of the bigger questions over the last decade has been: why would we need powerful computers at home? Aside from gaming and content creation, both of which already have cloud counterexamples (GeForce Now, online media editors), what do we need 'computing resources' for at home?
The emergence of ChatGPT seems to answer that a bit. Assuming compute resources continue to expand, and the software behind these machine-learning systems itself becomes more effective and more efficient, there does seem to be a place for at-home compute.
If you could run a 'personal ML matrix' at home, something secure enough that you could feed it all of your personal and family information and use it as a genuine online assistant, wouldn't that have significant value?