Global PC Shipments Expected to Recover in 2024 after “Unprecedented Slump” in PC Demand: IDC

Tsing

The FPS Review
Staff member
With shipments totaling 68.5 million units in the third quarter of 2023, the global PC market managed to outperform expectations. But that volume was still down 7.2% compared to the same quarter in 2022, according to the International Data Corporation (IDC) Worldwide Quarterly Personal Computing Device Tracker.

See full article...
 
It's because for 90% of users the use case for a PC is to check email and browse the web. Phones and tablets have supplanted that for that market. The remaining market is decidedly tiny but spends a **** ton of money on its compute. Desktops worth 3k, laptops 2k or more... AND cellphones and in many cases tablets... SCREENS GALORE!

If you added in consoles... the market would be much bigger. ;)
 
There was unprecedented demand for PC hardware for gaming during the COVID-19 lockdowns. Naturally, demand will pick up in 2024 and 2025 as the hardware purchased during that time frame ages out. So this isn't a surprise to me. I don't think demand will reach what it did during the early days of the lockdowns, but it's going to be stronger than it has been since 2021 or so.
 
If you bought, say, a 5900 in late 2020 / early 21, there really isn’t a super big reason to upgrade. A lot of the people I helped upgrade were coming from Sandy Bridge type platforms. They are set till like 2030+
 
They are set till like 2030+
Nonsense. That's really not something you can predict. Not only that, but the market isn't what it was during the Sandy Bridge days, which in itself is an outlier in the 30-year slice of computing history I've spent dealing with these things. We've never seen that level of stagnation for such a protracted period of time, and with AMD actually competing now, things have been more exciting than they were in the Sandy Bridge era.
 
Nonsense. That's really not something you can predict. Not only that, but the market isn't what it was during the Sandy Bridge days, which in itself is an outlier in the 30-year slice of computing history I've spent dealing with these things. We've never seen that level of stagnation for such a protracted period of time, and with AMD actually competing now, things have been more exciting than they were in the Sandy Bridge era.
Are you expecting a new engine for WoW? Counterstrike? Is Zoom going to need more power than a 5900 (DnD Beyond over Zoom is a thing)? Are PS6 ports going to need more than a 5900? I'd be shocked if any of those guys need a performance bump to play the stuff they do.
 
I'd be shocked if any of those guys need a performance bump to play the stuff they do.
If you're only playing games from this generation or older, sure. But with newer generations of games, hardware, and programs in general, you are going to need more compute, and that will bring more support for whatever drives that compute demand.

Guess what that is going to be? It will be AI. AI is going to supplant gaming as the driver for high-end home compute.
 
Are you expecting a new engine for WoW? Counterstrike? Is Zoom going to need more power than a 5900 (DnD Beyond over Zoom is a thing)? Are PS6 ports going to need more than a 5900? I'd be shocked if any of those guys need a performance bump to play the stuff they do.
Evidently, you are unaware that other games have been released over the last 20 years and that new games are released throughout any given year. It should then come as a surprise to you that more demanding games are in development that will require more than the hardware we have available to us today to be enjoyed at maximum settings.

People play other games besides WoW and Counterstrike.
 
People play other games besides WoW and Counterstrike.
Some people do, yes. The guys I helped build machines for in late 20 / early 21 don't usually play anything else. Something like Baldur's Gate 3 is right up their alley though. And 10 to 1, those old 2700Ks that I replaced would play BG3 just fine. I don't know what you expect to come out in the next 7 years, but I just don't see the minimum requirements eclipsing a 5900.
 
Guess what that is going to be? It will be AI. AI is going to supplant gaming as the driver for high-end home compute.
I am skeptical on this.

It's the hot buzzword right now - but how much local AI computing will the typical household do?

AI computing, apart from minor tasks like biometrics and other "fuzzy" algorithms, is going to stay in the arena of data centers, I think. The main thing with AI isn't in running the algorithm - that doesn't take much at all - it's in the training of the models. And there, the bigger the better, and even on huge datacenters today that takes weeks/months/years.

The quality of training is what will differentiate products and drive competition. To try to compete with someone with global datacenters on training quality is ... not going to happen with local resources. The only reasons I can think for anyone to even attempt to try local training is if they want to ensure a pure dataset or are experimenting - and then they need to be willing to accept weeks/months/years of time invested at significant expense.
 
To try to compete with someone with global datacenters on training quality is ... not going to happen with local resources
Couldn't you do something akin to folding@home to get big datasets from smaller users?
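Something like federated averaging is probably the closest real-world analogue - each machine trains on whatever data it holds locally and only ships its weight updates back to be pooled. A very rough sketch of the idea (plain NumPy, every name and number below is made up for illustration):

```python
# Rough illustration of a folding@home-style split: each "home" node fits a tiny
# model on its own private data and only the fitted weights leave the machine.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # the signal every node is independently estimating

def local_update(n_samples=100):
    """One home PC: hold local data, train locally, return only the weights."""
    X = rng.normal(size=(n_samples, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n_samples)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)  # the local "training" step
    return w

# The coordinator never sees raw data; it just averages the incoming updates.
node_weights = [local_update() for _ in range(20)]
global_w = np.mean(node_weights, axis=0)
print(global_w)  # ~ [ 2.0, -1.0 ]
```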
 
I am skeptical on this.

It's the hot buzzword right now - but how much local AI computing will the typical household do?

...

I'm generally skeptical about the latest buzzword and predictions of unparalleled success in whatever market segment. I toss "AI" in the same metaphorical box as "VR" and "Linux"; sure, some people do it, but despite all the proclamations that they will be the next big thing, they persistently remain edge cases.
 
I'm generally skeptical about the latest buzzword and predictions of unparalleled success in whatever market segment. I toss "AI" in the same metaphorical box as "VR" and "Linux"; sure, some people do it, but despite all the proclamations that they will be the next big thing, they persistently remain edge cases.
AI integration is growing, and offloading a trained AI model's compute onto local resources will be a better experience for end users. The use case is already here, it's just not readily visible.
 
AI integration is growing, and offloading a trained AI model's compute onto local resources will be a better experience for end users. The use case is already here, it's just not readily visible.
When AI is fully and seamlessly integrated, and not used as a marketing badge, then my skepticism will relax. As long as it continues to be used to sell and ship more units, I will remain skeptical. No amount of fanboy-ism (aka grassroots marketing) will change that.

YMMV
 
Look at what Office 365 does with Copilot. Also, the Windows 11 23H2 update includes Copilot for Windows.
 
Some people do, yes. The guys I helped build machines for in late 20 / early 21 don't usually play anything else. Something like Baldur's Gate 3 is right up their alley though. And 10 to 1, those old 2700Ks that I replaced would play BG3 just fine. I don't know what you expect to come out in the next 7 years, but I just don't see the minimum requirements eclipsing a 5900.
So because you built machines to play World of Warcraft and Counterstrike, and because you think a 2700K could technically play BG3, you can't see any reason to upgrade past a Ryzen 9 5900X? Got it. First off, Baldur's Gate 3's minimum requirements specify a Core i5-4690. That's already a couple of generations beyond your Core i7-2700K example.

In any case, you seem to have a myopic viewpoint on this. Speaking of minimum requirements, playing games on systems only meeting minimum requirements is something you can do, but not something most people want to do if they don't have to. It doesn't usually provide a good gaming experience. Though someone who only plays 20-year-old games probably isn't going to understand that, I guess. Newer games will come out over the next 7 years that may not specify more than a 5900X as a minimum requirement, but that doesn't mean that more powerful processors aren't desirable or even necessary.

We already have games out today that run faster on newer CPUs than the 5900X. There are people with wants and needs that are different from your own.
 
So because you built machines to play World of Warcraft and Counterstrike, and because you think a 2700K could technically play BG3, you can't see any reason to upgrade past a Ryzen 9 5900X? Got it. First off, Baldur's Gate 3's minimum requirements specify a Core i5-4690. That's already a couple of generations beyond your Core i7-2700K example.

In any case, you seem to have a myopic viewpoint on this. Speaking of minimum requirements, playing games on systems only meeting minimum requirements is something you can do, but not something most people want to do if they don't have to. It doesn't usually provide a good gaming experience. Though someone who only plays 20-year-old games probably isn't going to understand that, I guess. Newer games will come out over the next 7 years that may not specify more than a 5900X as a minimum requirement, but that doesn't mean that more powerful processors aren't desirable or even necessary.

We already have games out today that run faster on newer CPUs than the 5900X. There are people with wants and needs that are different from your own.
I can’t tell if you’re intentionally missing the larger point or just so focused on the enthusiast aspect you can’t see it.

Expecting to see a large recovery in 2024 due to upgrades that occurred in 2020 / 2021 is misrepresenting the upgrade cycle of a lot of people. Of the 9 builds I did during that time frame, only one other guy and I upgrade every release cycle or every other cycle.

The builds I did were for my college buds. After we wrapped up school and started our first jobs, the bulk of the crew had me build Athlon 64s so they had fresh builds and we could all play WoW. Three upgraded to i7 920s, but most kept those Athlon 64s (with an intermediate video card upgrade) until Sandy Bridge. And the Sandy Bridge systems got video card updates to things like the 970 and got SSDs, but otherwise lived until the 5900s. Maybe after everyone's kids are in college the upgrade cycles will speed up again, but I'm just not convinced that 95% of people upgrade more frequently than 8+ years.
 
I can’t tell if you’re intentionally missing the larger point or just so focused on the enthusiast aspect you can’t see it.

Expecting to see a large recovery in 2024 due to upgrades that occurred in 2020 / 2021 is misrepresenting the upgrade cycle of a lot of people. Of the 9 builds I did during that time frame, only one other guy and I upgrade every release cycle or every other cycle.
I simply stated that it was a factor. Industry analysts predict an increase in demand in 2024. I simply provided one of the reasons I believe this is the case. I am not saying that the reason I gave is the only reason why, but a lot of enthusiasts made purchases during 2020 and 2021 as we saw availability of hardware dry up and scalping become a bigger problem than it's ever been. We even saw enthusiasts purchasing entire machines just to get GPUs (and part out the rest), which definitely inflated the global PC sales statistics. In 2024, those machines will be three to four years old. For enthusiasts, that's a long time.

The growth of PC shipments is increasing due to an increase in the popularity of PC gaming. Here is an article about that showing how dedicated gamers replace their hardware every 3.3 years on average. Now, for the sake of argument I'll give you a 3-5 year window on replacing systems. (I'll talk about that a bit more later on.) If you look at the sales by quarter figures I posted above, you'll note that 2019 and 2021 were bigger years for sales than normal, meaning that whether people upgrade at 3 or even 5 year intervals, 2024 is the time to do it (2021 + 3 and 2019 + 5 both land on 2024). Furthermore, the idea that you should upgrade every three to five years is decades old and generally holds true if you want to run the most up to date and modern software and run it well. Emphasis on running it well.

If you ever ask anyone in retail how often you should buy a new computer, that's generally the answer nearly anyone will give you. Businesses have been using 3-5 year cycles for depreciation calculations for decades. That's not necessarily how fast they replace systems, but it often is, as business customers generally like everything to be under warranty while it's in service. 1-3 year warranties (extended warranties from retailers) are also common for consumer PCs as well. That fact is key as most people don't turn their own screwdrivers when servicing systems.

Not only that, but when laptops outpaced desktop shipments, it compounded issues with repairs, leading to early replacements of entire systems. Outside of warranty, after 3 years, if a laptop has a major failure such as a bad screen or motherboard, it's often more cost effective to simply replace the machine outright. Not every upgrade or new PC purchase comes about as a result of customers being unsatisfied with how their system performs.

This does happen with desktops too, as sometimes customers will be steered towards replacing their systems if repairs cost more than a certain percentage of what a new system does. This is especially common when customers are spending around $1,200 to $1,500 on gaming machines. If you see a $600+ repair bill on a nearly three year old system that only cost you $1,500 to begin with, chances are you'll spend more and get a whole new machine. I've had just about every variation of this discussion with customers that you can imagine. While this is a lot less likely with DIY and boutique systems, more conventional OEMs fall prey to this a lot as parts availability dries up and out-of-warranty parts become extremely costly. It's also not just parts, but labor that comes into play. Retailers such as Best Buy with tech shops (such as they are) are even incentivized to push customers towards new systems in some cases. This is especially true when systems are older than three years.

Keep in mind you could be paying $60 to $100 an hour in labor charges. Again, people who can build and repair their own systems are not the norm. $3,000-plus systems are not the norm. If you build a super expensive system, chances are it will last longer and you can go longer in between upgrades and still get good performance for years. However, that's not what the average consumer does. They buy low to mid-range, and those systems typically have shorter service lives than higher-end systems do.
The builds I did were for my college buds. After we wrapped up school and started our first jobs, the bulk of the crew had me build Athlon 64s so they had fresh builds and we could all play WoW. Three upgraded to i7 920s, but most kept those Athlon 64s (with an intermediate video card upgrade) until Sandy Bridge. And the Sandy Bridge systems got video card updates to things like the 970 and got SSDs, but otherwise lived until the 5900s. Maybe after everyone's kids are in college the upgrade cycles will speed up again, but I'm just not convinced that 95% of people upgrade more frequently than 8+ years.
I disagree. I base this on the data posted above, but also on my experience in the industry. I've had conversations with motherboard manufacturers and CPU makers, and I've even worked in retail stores and repair centers. I've been building systems professionally, personally, and for friends for 30 years. I've built hundreds of systems. I'll go out on a limb and say I probably have a much larger sampling of data to go on than most people do. I've discussed the market with these companies and talked with them about what they've seen. They've spent small fortunes in research on market trends and customer behavior.

The Sandy Bridge era was unique in that it was so stagnant as to keep people on their older hardware for longer because it made little sense to upgrade. You can't use that period of time as a basis for saying that people only upgrade every 8+ years because that's what your friends do or because those 2700Ks were viable for so long. If you look at the larger market in the years that bracket this era, you'll quickly realize what an anomaly that time frame was. It shouldn't be the basis for any predictions on how the market will perform or how often people upgrade or replace their computers. It's the exception, not the rule. That's not to say it couldn't happen again. But when you look at the factors that contributed towards that era, we aren't there right now.

Since you brought up economics, I'll address that. College-aged people who recently graduated or even graduated a couple of years ago are just getting started in their professional lives. They aren't likely to have the disposable income to replace entire computer systems every 3 years or so. People also don't necessarily wait until their kids are grown, either. Typically, you will have more disposable income at age 30 and up than you do in your early 20s. Again, you and your circle of friends are not representative of the market as a whole.

Your comments indicate a very narrow viewpoint, and as you've continued to clarify your position, I can't help but think you are basing your thoughts on your own circle of friends, which isn't indicative of the larger market, as I've shown. Sure, there are people that will only play old games or only upgrade every 8 to 10 years, but that's not the cycle for the average gamer. Like Moore's law, there are exceptions, but the fact is that over the past 3 decades, most people have replaced their computers every three to five years.

There are many reasons for this, as I've outlined above. I'll go on to add that people who play games that aren't two decades old will start to find that their rigs struggle to max out settings or hold decent frame rates after 3 years or so. Again, most people don't want to simply meet the minimum system requirements of their games. The fact that your friends might not have broad tastes in games or upgrade very often isn't enough of a basis to make a prediction about the entire industry or what is or is not average for the market.
 