I can’t tell if you’re intentionally missing the larger point or just so focused on the enthusiast aspect you can’t see it.
Expecting to see a large recovery in 2024 due to upgrades that occurred in 2020/2021 is misrepresenting the upgrade cycle of a lot of people. Of the 9 builds I did during that time frame, only I and one other person upgrade every release cycle or every other cycle.
I simply stated that it was a factor. Industry analysts predict an increase in demand in 2024, and I provided one of the reasons I believe this is the case. I'm not saying the reason I gave is the only one, but a lot of enthusiasts made purchases during 2020 and 2021 as we saw the availability of hardware dry up and scalping become a bigger problem than it's ever been. We even saw enthusiasts buying entire machines just to get the GPUs (and part out the rest), which definitely inflated the global PC sales statistics. In 2024, those machines will be three to four years old. For enthusiasts, that's a long time.
PC shipments are growing due to the increasing popularity of PC gaming.
Here is an article about that showing how dedicated gamers replace their hardware every 3.3 years on average. Now, for the sake of argument, I'll give you a 3-5 year window on replacing systems. (I'll talk about that a bit more later on.) If you look at the sales-by-quarter figures I posted above, you'll note that 2019 and 2021 were bigger years for sales than normal, meaning that whether people upgrade at 3-year or even 5-year intervals, 2024 is the time to do it: 2019 plus five years and 2021 plus three years both land on 2024. Furthermore, the idea that you should upgrade every three to five years is decades old and generally holds true if you want to run the most up-to-date, modern software and run it well. Emphasis on running it well.
If you ever ask anyone in retail how often you should buy a new computer, that's generally the answer nearly anyone will give you. Businesses have been using 3-5 year cycles for depreciation calculations for decades. That's not necessarily how fast they replace systems, but it often is, as business customers generally like everything to be under warranty while it's in service. 1-3 year warranties (extended warranties from retailers) are common for consumer PCs as well. That fact is key, as most people don't turn their own screwdrivers when servicing systems.
Not only that, but when laptop shipments outpaced desktop shipments, it compounded issues with repairs, leading to early replacement of entire systems. Outside of warranty, if a laptop more than three years old has a major failure such as a bad screen or motherboard, it's often more cost-effective to simply replace the machine outright. Not every upgrade or new PC purchase comes about as a result of customers being unsatisfied with how their system performs.
This does happen with desktops too, as customers will sometimes be steered towards replacing their systems if repairs cost more than a certain percentage of what a new system does. This is especially common when customers are spending around $1,200 to $1,500 on gaming machines. If you see a $600+ repair bill on a nearly three-year-old system that only cost you $1,500 to begin with, chances are you'll spend a bit more and get a whole new machine. I've had just about every variation of this discussion with customers that you can imagine. While this is a lot less likely with DIY and boutique systems, more conventional OEMs fall prey to this a lot as parts availability dries up and out-of-warranty parts become extremely expensive. It's also not just parts, but labor that comes into play. Retailers such as Best Buy with tech shops (such as they are) are even incentivized to push customers towards new systems in some cases, especially when systems are older than three years.
Keep in mind you could be paying $60 to $100 an hour in labor charges. Again, people who can build and repair their own systems are not the norm. $3,000-plus systems are not the norm. If you build a super expensive system, chances are it will last longer and you can go longer between upgrades and still get good performance for years. However, that's not what the average consumer does. They buy low- to mid-range, and those systems typically have shorter service lives than higher-end systems do.
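To put rough numbers on that repair-versus-replace rule of thumb, here's a minimal sketch of the math a shop (or customer) ends up running. The 40% threshold and the example figures are purely illustrative assumptions, not numbers from any particular retailer:

```python
# Rough repair-vs-replace math, as an illustration only.
# The 40% threshold is an assumption, not an industry figure.

def should_replace(repair_parts: float, labor_hours: float,
                   hourly_rate: float, new_system_price: float,
                   threshold: float = 0.40) -> bool:
    """Return True if the repair bill exceeds the given fraction
    of what a comparable new system costs."""
    repair_total = repair_parts + labor_hours * hourly_rate
    return repair_total >= threshold * new_system_price

# Example: a $450 motherboard plus 2 hours of bench time at $80/hr
# on a machine that originally cost $1,500.
print(should_replace(450, 2, 80, 1500))  # 610 / 1500 ≈ 41% -> True
```

Once the repair bill crosses that kind of fraction of a new machine's price, most customers opt for the replacement, which is exactly the conversation described above.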
The builds I did were for my college buds. After we wrapped up school and started our first jobs, the bulk of the crew had me build Athlon 64s so we'd all have fresh builds to play WoW on. Three upgraded to i7 920s, but most kept those Athlon 64s (with an intermediate video card upgrade) until Sandy Bridge. And the Sandy Bridge systems got video card updates to things like the 970 and got SSDs, but otherwise lived until the 5900s. Maybe after everyone's kids are in college the upgrade cycles will speed up again, but I'm just not convinced that 95% of people upgrade more frequently than 8+ years.
I disagree. I base this on the data posted above, but also on my experience in the industry. I've had conversations with motherboard manufacturers and CPU makers, and I've worked in retail stores and repair centers. I've been building systems professionally, personally, and for friends for 30 years. I've built hundreds of systems. I'll go out on a limb and say I probably have a much larger sample of data to go on than most people do. I've discussed the market with these companies and talked with them about what they've seen. They've spent small fortunes on research into market trends and customer behavior.
The Sandy Bridge era was unique in that it was stagnant enough to keep people on their older hardware for longer, because it made little sense to upgrade. You can't use that period as a basis for saying people only upgrade every 8+ years just because that's what your friends do or because those 2700Ks were viable for so long. If you look at the larger market in the years that bracket this era, you'll quickly realize what an anomaly that time frame was. It shouldn't be the basis for any predictions about how the market will perform or how often people upgrade or replace their computers. It's the exception, not the rule. That's not to say it couldn't happen again, but when you look at the factors that contributed to that era, we aren't there right now.
Since you brought up economics, I'll address that. People who recently graduated college, or even graduated a couple of years ago, are just getting started in their professional lives. They aren't likely to have the disposable income to replace entire computer systems every 3 years or so. People also don't necessarily wait until their kids are grown, either. Typically, you have more disposable income from age 30 onward than you do in your early 20s. Again, you and your circle of friends are not representative of the market as a whole.
Your comments indicate a very narrow viewpoint, and as you've continued to clarify your position, I can't help but think you're basing your thoughts on your own circle of friends, which isn't indicative of the larger market, as I've shown. Sure, there are people who will only play old games or who only upgrade every 8 to 10 years, but that's not the cycle for the average gamer. As with Moore's Law, there are exceptions, but the fact is that over the past three decades, most people have replaced their computers every three to five years.
There are many reasons for this, as I've outlined above. I'll add that people who play games that aren't two decades old will start to find that their rigs struggle to max out settings or deliver decent frame rates after 3 years or so. Again, most people don't want to simply meet the minimum system requirements of their games. The fact that your friends might not have broad tastes in games or upgrade very often isn't enough of a basis to make a prediction about the entire industry or about what is or isn't average for the market.