GPU Prices Are Seemingly Dropping

David_Schroth

Image: NVIDIA



Tech outlets have shared some hopium for enthusiasts who are still waiting for graphics cards to fall back to sane levels.



One dose comes from Tom’s Hardware, which has shared data pointing to positive price changes for both NVIDIA and AMD graphics cards on auction sites. Comparing one week’s worth of eBay pricing data from late December against the past week, the numbers suggest that selling prices for GeForce RTX and Radeon RX cards have declined by as much as 11.6 percent on average.



Late December vs. Late January GPU Pricing (eBay)



GPU Model     | December 20th-27th Avg Price | GPUs Sold | January 17th-24th Avg Price | GPUs Sold | Price Change | Quantity Change
RTX 3090      | $2,829.39                    | 312       | $2,550.86                   | 336       | -9.8%        | 7.7%
RTX 3080 Ti   | $1,951.33                    | 247       | $1,863.94                   | 241       | -4.5%        | -2.4%
RTX 3080 12GB | N/A                          | 0         | $1,681.22                   | 9         | N/A          | N/A
RTX 3080 10GB | $1,804.28                    | 416       | $1,595.41                   | 311       | -11.6%       | -25.2%
RTX 3070 ...
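
The percentage columns are simply the relative change between the two weekly averages, which is easy to sanity-check. A quick sketch using the figures above (just arithmetic, no assumptions beyond the table itself):

```python
# Sanity-checking the table's percentage columns; numbers copied from the
# eBay data above ("N/A" rows skipped).
rows = {
    # model: (dec_avg_price, dec_sold, jan_avg_price, jan_sold)
    "RTX 3090":      (2829.39, 312, 2550.86, 336),
    "RTX 3080 Ti":   (1951.33, 247, 1863.94, 241),
    "RTX 3080 10GB": (1804.28, 416, 1595.41, 311),
}

for model, (dec_price, dec_qty, jan_price, jan_qty) in rows.items():
    price_change = (jan_price - dec_price) / dec_price * 100
    qty_change = (jan_qty - dec_qty) / dec_qty * 100
    print(f"{model}: price {price_change:+.1f}%, quantity {qty_change:+.1f}%")

# RTX 3090: price -9.8%, quantity +7.7%
# RTX 3080 Ti: price -4.5%, quantity -2.4%
# RTX 3080 10GB: price -11.6%, quantity -25.2%
```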

Continue reading...


 
"And Leon's getting larrrrggggerrrrr"

Sorry... That was the first thought that came to my mind. Wake me when prices get close to reasonable...
 
I noticed this just the other day.

Ever since GPU prices went crazy, my barometer card for comparison has been the 2060 Super I bought my stepson for $399 in 2019.

Every now and then I poke around on eBay and search sold listings to see what they are actually selling for.

At the height of the crazy prices, I saw 2060 Supers like this one going for $850. A couple of days ago, it looked like the average of the last several sales was in the $600-$650 range.

So, prices have started coming down, but they still have a long, long way to go. It is still selling for significantly more than it did when it was new 3 years ago. At this time, the 2060 Super should probably be selling for about $200. Until we get there, we have not recovered.

And I don't think we will ever fully recover. At least not unless Intel's entrance into the market with their Arc Alchemist cards results in a price war.

Over the last two years, the GPU companies and AIBs have learned that gamers are willing to pay WAY more for video cards than they had previously ever imagined.

The best we can hope for is mid-range boards becoming more affordable, while high-end cards (especially limited edition, super-binned, high-OC versions) continue to demand increasingly crazy sums of money from those nutjob gamers who just have to have the best no matter what, and will fight others over it, because they think it impresses their subscribers.
 
Over the last two years, the GPU companies and AIBs have learned that gamers are willing to pay WAY more for video cards than they had previously ever imagined.
If the market were just gamers, then I'd agree, but it's much bigger than that before we even discuss miners. For example, when JHH did the RTX 3090 reveal, they did talk about gaming - and they also talked quite a bit about making use of the VRAM for content creation.

Thing is, once manufacturers get used to meeting demand, whatever it is, prices will adjust.

while high-end cards (especially limited edition, super-binned, high-OC versions) continue to demand increasingly crazy sums of money from those nutjob gamers who just have to have the best no matter what, and will fight others over it, because they think it impresses their subscribers.
Honestly I just want a higher power limit. That seems to be the real performance ceiling.


Also - I was able to pick up a 3080 on my second Newegg Shuffle attempt. I could have just been stupidly lucky, but for it to have been that easy, perhaps supply really is picking up.
 
Wake me when GPU prices get to proper MSRP levels. MSRPs were already too high before crypto-plagues and chip shortages f*cked everything up. I had trouble accepting the MSRPs ($500 for a 2070 Super, $400 for a 5700 XT, $650 for Fury X/980 Ti, they have lost their g4wd-**** minds), so this current crazy market doesn't even seem like real life. Unfortunately, it is. And to think, I used to complain about $400-$450 for flagship cards (like Radeon 9700 Pro).
 
Wake me when GPU prices get to proper MSRP levels. MSRPs were already too high before crypto-plagues and chip shortages f*cked everything up. I had trouble accepting the MSRPs ($500 for a 2070 Super, $400 for a 5700 XT, $650 for Fury X/980 Ti, they have lost their g4wd-**** minds), so this current crazy market doesn't even seem like real life. Unfortunately, it is. And to think, I used to complain about $400-$450 for flagship cards (like Radeon 9700 Pro).
You say that, but it's not true. Factoring in inflation and everything else, I don't think $1,000-$1,200 is unreasonable at all for a flagship card. Don't get me wrong, if I could get them for $600 or less I'd be absolutely thrilled by that. But that hasn't been realistic, nor the way things have worked, for a very long time.

When you look at GPU pricing objectively, GPU MSRPs aren't really all that different than they were 5 to 7 years ago. The Maxwell-based Titan X cost $1,299 or $1,399. That's the Titan-class card, which is the entry-level workstation/prosumer segment while still being more or less at the top of the gaming stack. Later Pascal-based Titan Xs were $1,200 MSRP. Today, the RTX 3090 FE has an MSRP of $1,499.99 in the United States. AIB cards and reference designs with various coolers are priced higher, but after 7 years, a $200-$300 increase at the time the RTX 3090 FE was released really wasn't all that outlandish.

It was only outrageous compared to the base RTX 3080, which wasn't all that much slower but had an MSRP of around half the RTX 3090's. But again, that's not really all that strange, as cards like the 980 Ti etc. were always considerably less than the Titan-class cards. The one real anomaly here was the RTX 20 series, which was insanely priced and remains the most insanely priced series by MSRP. The Titan V was priced at $2,499 while the RTX 2080 Ti was $1,199 or something like that.

Guys like me used to buy Titans or Ti cards in pairs, doubling the cost of our GPU solutions. Honestly, the price of the RTX 3090 FE wasn't that big of a deal for what we were getting. It was certainly priced far better than its predecessor, the Titan V.

Price scalping aside, if you ask me, the more egregious thing in the computer industry isn't GPUs but rather motherboard pricing. In 2015, the most expensive Z170 motherboard was $400 to $450. Now, we had HEDT boards upwards of around $500-$600, but you got a lot more for your money in terms of the platform. Today the most expensive mainstream motherboard is the ASUS ROG Maximus Z690 Extreme at $1,099. That's not getting into the limited-production offerings with factory monoblocks, which are understandably more expensive than that. However, the Maximus Z690 Extreme is still a mainstream part, and it's double the price of the most expensive part from 7 years ago.

When the X570 chipset released, we saw a massive price increase in the mainstream segment with the top end boards reaching highs of around $750 to $800. For the previous generation, we only saw pricing like that on the HEDT side. Now Z690 boards can cost upwards of $1,100. Monoblock boards are almost double that with the ASUS ROG Maximus Z690 Extreme Glacial costing $1,999.99. The midrange is creeping up as well.

Again, going by MSRP, GPUs aren't really that much more than they've been for several years. Now, I can't speak for some of the mid-range and lower-end offerings, as I honestly don't look at them. I never have. Of course, actually getting GPUs at MSRP is another matter, with actual street prices being all over the place. But motherboards aren't being scalped. They are literally twice what they used to be a few years ago. Sure, there are economic and supply reasons for some of that increase. However, that doesn't account for the vast majority of the price hike.
 
You say that, but it's not true. Factoring in inflation and everything else, I don't think $1,000-$1,200 is unreasonable at all for a flagship card. Don't get me wrong, if I could get them for $600 or less I'd be absolutely thrilled by that. But that hasn't been realistic, nor the way things have worked, for a very long time.

When you look at GPU pricing objectively, GPU MSRPs aren't really all that different than they were 5 to 7 years ago. The Maxwell-based Titan X cost $1,299 or $1,399. That's the Titan-class card, which is the entry-level workstation/prosumer segment while still being more or less at the top of the gaming stack. Later Pascal-based Titan Xs were $1,200 MSRP. Today, the RTX 3090 FE has an MSRP of $1,499.99 in the United States. AIB cards and reference designs with various coolers are priced higher, but after 7 years, a $200-$300 increase at the time the RTX 3090 FE was released really wasn't all that outlandish.

It was only outrageous compared to the base RTX 3080, which wasn't all that much slower but had an MSRP of around half the RTX 3090's. But again, that's not really all that strange, as cards like the 980 Ti etc. were always considerably less than the Titan-class cards. The one real anomaly here was the RTX 20 series, which was insanely priced and remains the most insanely priced series by MSRP. The Titan V was priced at $2,499 while the RTX 2080 Ti was $1,199 or something like that.

Guys like me used to buy Titans or Ti cards in pairs, doubling the cost of our GPU solutions. Honestly, the price of the RTX 3090 FE wasn't that big of a deal for what we were getting. It was certainly priced far better than its predecessor, the Titan V.

Price scalping aside, if you ask me, the more egregious thing in the computer industry isn't GPUs but rather motherboard pricing. In 2015, the most expensive Z170 motherboard was $400 to $450. Now, we had HEDT boards upwards of around $500-$600, but you got a lot more for your money in terms of the platform. Today the most expensive mainstream motherboard is the ASUS ROG Maximus Z690 Extreme at $1,099. That's not getting into the limited-production offerings with factory monoblocks, which are understandably more expensive than that. However, the Maximus Z690 Extreme is still a mainstream part, and it's double the price of the most expensive part from 7 years ago.

When the X570 chipset released, we saw a massive price increase in the mainstream segment with the top end boards reaching highs of around $750 to $800. For the previous generation, we only saw pricing like that on the HEDT side. Now Z690 boards can cost upwards of $1,100. Monoblock boards are almost double that with the ASUS ROG Maximus Z690 Extreme Glacial costing $1,999.99. The midrange is creeping up as well.

Again, going by MSRP, GPUs aren't really that much more than they've been for several years. Now, I can't speak for some of the mid-range and lower-end offerings, as I honestly don't look at them. I never have. Of course, actually getting GPUs at MSRP is another matter, with actual street prices being all over the place. But motherboards aren't being scalped. They are literally twice what they used to be a few years ago. Sure, there are economic and supply reasons for some of that increase. However, that doesn't account for the vast majority of the price hike.
I was critical of the motherboard price increases back then, and I'm still sour on it now. That's mostly the reason why I'm still on the X470 platform to this day. There was no real tangible difference I would feel in my day-to-day usage between X470 and X570 motherboards. PCIe 4.0 would have been nice to have, but it's not the end of the world that I didn't have it. The 5000 series CPU will be my last upgrade for a while until I see where my money will be best spent on the next platform migration.
 
Well, the ASUS Hero is 729€ atm. That is scalping territory, as they were around 200€ cheaper at launch (and still a lot more expensive than previous-gen Hero boards).

Guess they were less popular when there was no DDR5 RAM available yet. GPUs are still crazy at over 2x MSRP.

I hope that the next gen brings some relief. Why did I have to buy a 1440p ultrawide? Silly me.
 
You say that, but it's not true. Factoring in inflation and everything else, I don't think $1,000-$1,200 is unreasonable at all for a flagship card.
A) Inflation lately has been absolutely crazy, and is affecting everything. So there is that. But I don't think that accounts for price increases over the past couple of generations that GPUs have seen. Maybe you could make the case for Moore's Second Law coming into play.

B) I think price has to be relative to performance. Are today's flagships that much faster, relative to the rest of the product line, than previous generations' flagships were? I'm not comparing previous to next generation (generational/tech increase), I'm comparing inside of a given generation. You can make a separate case for generational price/performance deltas, but that has to consider the entire lineup, not just the flagship (more on that below).

C) Do you consider a card like the Titan to be a flagship in a given line or a separate-but-equal prosumer product line? And to follow up on that, is a card like the 3090 a replacement for the Titan, or a flagship the way the 2080 Ti was for Turing?

I don't know the answer to B - it would be interesting to see some data on, say, how a 2080Ti stacks up relative to the rest of its (Turing) lineup versus how, say, a 1080Ti or a 980Ti did in theirs.

Same with C; it would be kinda the same as the previous comparison, only with Titans thrown into the mix.

Just some initial data:
780Ti - $699 MSRP
980Ti - $649 MSRP
1080Ti - $699 MSRP
2080Ti - $799 MSRP
3080Ti - $1,199 MSRP

I think I'm ok with Flagships going up as high as whatever people are willing to pay. I always have a budget and I look for the best I can that fits inside my budget. I don't necessarily need "Flagship", so top-tier cards and their prices are only relevant if they happen to fit inside that budget. But I do look for a certain amount of performance increase - I don't just upgrade because something is new: it has to be better than what I have, by a big enough margin, and fit inside my budget. Usually I look for cards that roughly double the current performance, and I wait until I can get that inside of about a $500 (+/-) budget. If that takes years, it takes years. At current pace, it may take decades... I have noticed it takes longer and longer to hit those metrics. It was 3 years. Then 4 years. Last one was 6 years and a big budget increase that I probably shouldn't have done.

What concerns me more is that we aren't seeing generational value increase. When I say value, I mean performance relative to price. Value in today's GPU market has actually been trending negatively -- it costs more to get the same performance today than it did 3-4 years ago. We have to reverse that trendline and get back to positive value, and then we have to sustain that positive value trend for a while just to get back to where we were with respect to performance per dollar spent.
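
To put numbers on that idea: value here is just performance divided by price. A tiny sketch where the FPS and price figures are made-up placeholders (not benchmarks), purely to illustrate the metric:

```python
# Illustrative performance-per-dollar check. The FPS and price figures are
# made-up placeholders, not benchmark results; the point is only the metric.
def perf_per_dollar(fps: float, price: float) -> float:
    """Frames per second per dollar spent: higher is better."""
    return fps / price

old_gen = perf_per_dollar(fps=100, price=500)   # hypothetical 2017-era card
new_gen = perf_per_dollar(fps=130, price=900)   # hypothetical 2021-era card

trend = (new_gen - old_gen) / old_gen * 100
print(f"value trend: {trend:+.1f}%")  # negative means paying more per frame
```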

And we could blame that on the current economy, except we started to see the value line degrade well before that -- when Turing came out, the prices started to edge up, but apart from RT/DLSS, rasterizing performance didn't move nearly as much. You could argue for the all-in value considering everything added, but you could also point to the number of gaming titles and such that supported it at the time and make a strong case that it didn't add much, if anything at all (and you can still make that case with a straight face) -- the added value of the new tech is very much dependent on your willingness to futureproof and believe that the tech would be utilized efficiently in the future. I'd also point out that during this time AMD deprioritized graphics, particularly high-end graphics, and that was very much a contributing factor: I don't hold either company blameless.

I would equate the value proposition of RT/DLSS/FSR to buying a game and hoping they fix it in the patches -- some people were happy with Cyberpunk when it shipped, some people think it's fine now after a few patches, and others are still waiting for it to be fixed: I see RT/DLSS in the same vein. Just like some people think RT/DLSS was totally worth it when Turing shipped, some people think there are enough titles out today to justify it, and others think it still isn't worth anything extra yet.
 
Wake me when GPU prices get to proper MSRP levels. MSRPs were already too high before crypto-plagues and chip shortages f*cked everything up. I had trouble accepting the MSRPs ($500 for a 2070 Super, $400 for a 5700 XT, $650 for Fury X/980 Ti, they have lost their g4wd-**** minds), so this current crazy market doesn't even seem like real life. Unfortunately, it is. And to think, I used to complain about $400-$450 for flagship cards (like Radeon 9700 Pro).

I hear you, but you also have to consider inflation.

That $400 - $450 in 2002 is equivalent to $630-$710 today.

Most years inflation is very low, only 2% or so, but year over year it adds up, and while it feels like only yesterday, 2002 was 20 years ago...
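
To make the compounding concrete, here's a rough sketch. The ~2.3% average annual rate is my assumption, picked to roughly match the cumulative CPI change over those 20 years; use actual BLS CPI data for precise numbers:

```python
# How modest annual inflation compounds over 20 years. The ~2.3% average
# rate is an assumption chosen to roughly match the cumulative CPI change
# from 2002 to 2022 (~1.57x), not an official figure.
avg_annual_inflation = 0.023
years = 20

factor = (1 + avg_annual_inflation) ** years
print(f"cumulative factor: {factor:.2f}x")   # ~1.58x

for price_2002 in (400, 450):
    print(f"${price_2002} in 2002 is roughly ${price_2002 * factor:,.0f} today")
# $400 in 2002 is roughly $630 today
# $450 in 2002 is roughly $709 today
```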
 
Price scalping aside, if you ask me, the more egregious thing in the computer industry isn't GPUs but rather motherboard pricing. In 2015, the most expensive Z170 motherboard was $400 to $450. Now, we had HEDT boards upwards of around $500-$600, but you got a lot more for your money in terms of the platform. Today the most expensive mainstream motherboard is the ASUS ROG Maximus Z690 Extreme at $1,099. That's not getting into the limited-production offerings with factory monoblocks, which are understandably more expensive than that. However, the Maximus Z690 Extreme is still a mainstream part, and it's double the price of the most expensive part from 7 years ago.

When the X570 chipset released, we saw a massive price increase in the mainstream segment with the top end boards reaching highs of around $750 to $800. For the previous generation, we only saw pricing like that on the HEDT side. Now Z690 boards can cost upwards of $1,100. Monoblock boards are almost double that with the ASUS ROG Maximus Z690 Extreme Glacial costing $1,999.99. The midrange is creeping up as well.

...motherboards aren't being scalped. They are literally twice what they used to be a few years ago. Sure, there are economic and supply reasons for some of that increase. However, that doesn't account for the vast majority of the price hike.
Now that is most definitely true. Absolute insanity. I paid $400 or $500 for my HEDT X99 board in 2014, which I never would've imagined paying for a motherboard. Back in the nForce 4 days, $150-$200 (or even $130) would get you a high-end board.


When you look at GPU pricing objectively, GPU MSRPs aren't really all that different than they were 5 to 7 years ago.
HAHA I didn't like those prices either!


The Maxwell-based Titan X cost $1,299 or $1,399. That's the Titan-class card, which is the entry-level workstation/prosumer segment while still being more or less at the top of the gaming stack. Later Pascal-based Titan Xs were $1,200 MSRP. Today, the RTX 3090 FE has an MSRP of $1,499.99 in the United States.
Yeah I guess I don't count entry-level prosumer/workstation cards as flagships. Or maybe what I should have said in my first post was "gaming flagships." Cuz when you start talking about Titans and 3090s and sh1t, I think you've left the realm of pure gaming cards. My beef is with the prices of pure gaming cards. Wasn't the 6800 Ultra like $600? I thought that sh1t was insane too. 2080 Ti, that was another one where I was like "what the fuuuuuuuuuuuck" with the price. The 3090 actually bothers me waaaaaay less than that.


The one real anomaly here was the RTX 20 series, which was insanely priced and remains the most insanely priced series by MSRP. The Titan V was priced at $2,499 while the RTX 2080 Ti was $1,199 or something like that.
Yeah, those things. Titan V's price was whatever, but that 2080 Ti price, yeah that sh1t was f*cking crazy.


Now, I can't speak for some of the mid-range and lower-end offerings, as I honestly don't look at them. I never have.
I always mainly looked at high-end/flagship and mid-range cards. I would usually grab the card that was just below the flagship. So like when the Maxwell v2 series came out, I got the GTX 970 instead of the 980. I got the Radeon 9500 Pro instead of the 9700 Pro. Got the GeForce 2 Pro instead of the Ultra. Got the 8800 GT instead of the GTX (don't recall if the 8800 Ultra also launched at the same time, cuz then I guess I was 2 levels down instead of 1). Got the 7900 GT (later upgraded to a 7950 GT courtesy of an EVGA RMA to fix the VRAM issue with the 7900 GT) instead of the 7900 GTX. Stuff like that. But yeah, I compared the GTX 970 at $330 to the Radeon 9500 Pro at $200, and yeah, those prices have been creeping up over time (of course I wasn't accounting for inflation, so yeah, you make some good points, f*ck inflation though).


I was critical of the motherboard price increases back then, and I'm still sour on it now.
As you very well should be.


I think I'm ok with Flagships going up as high as whatever people are willing to pay.
I don't think graphics cards in any class should have their prices increase from generation to generation. If the GTX 970 was $330, the 1070 shoulda been around the same price, not $400 or $450 or whatever the f*ck it was.


I hear you, but you also have to consider inflation.

That $400 - $450 in 2002 is equivalent to $630-$710 today.
Ugh. That's how sh1t works though.


...while it feels like only yesterday, 2002 was 20 years ago
GOOD GAWD. You don't have to put it so bluntly. I get it, I'm getting old. Sheesh.


780Ti - $699 MSRP
980Ti - $649 MSRP
Oh sh1t, an actual price drop!!! Wow, way to make me feel better about the 980 Ti's price, hahahahahaha!


2080Ti - $799 MSRP
Are you sure that was the MSRP? Dan mentioned it was something like $1200 or so, and honestly that's what I remember it being. One of my friends paid $1300 for his.
 
A) Inflation lately has been absolutely crazy, and is affecting everything. So there is that. But I don't think that accounts for price increases over the past couple of generations that GPUs have seen. Maybe you could make the case for Moore's Second Law coming into play.

B) I think price has to be relative to performance. Are today's flagships that much faster, relative to the rest of the product line, than previous generations' flagships were? I'm not comparing previous to next generation (generational/tech increase), I'm comparing inside of a given generation. You can make a separate case for generational price/performance deltas, but that has to consider the entire lineup, not just the flagship (more on that below).

C) Do you consider a card like the Titan to be a flagship in a given line or a separate-but-equal prosumer product line? And to follow up on that, is a card like the 3090 a replacement for the Titan, or a flagship the way the 2080 Ti was for Turing?

I don't know the answer to B - it would be interesting to see some data on, say, how a 2080Ti stacks up relative to the rest of its (Turing) lineup versus how, say, a 1080Ti or a 980Ti did in theirs.

Same with C; it would be kinda the same as the previous comparison, only with Titans thrown into the mix.

Just some initial data:
780Ti - $699 MSRP
980Ti - $649 MSRP
1080Ti - $699 MSRP
2080Ti - $799 MSRP
3080Ti - $1,199 MSRP

I think I'm ok with Flagships going up as high as whatever people are willing to pay. I always have a budget and I look for the best I can that fits inside my budget. I don't necessarily need "Flagship", so top-tier cards and their prices are only relevant if they happen to fit inside that budget. But I do look for a certain amount of performance increase - I don't just upgrade because something is new: it has to be better than what I have, by a big enough margin, and fit inside my budget. Usually I look for cards that roughly double the current performance, and I wait until I can get that inside of about a $500 (+/-) budget. If that takes years, it takes years. At current pace, it may take decades... I have noticed it takes longer and longer to hit those metrics. It was 3 years. Then 4 years. Last one was 6 years and a big budget increase that I probably shouldn't have done.

What concerns me more is that we aren't seeing generational value increase. When I say value, I mean performance relative to price. Value in today's GPU market has actually been trending negatively -- it costs more to get the same performance today than it did 3-4 years ago. We have to reverse that trendline and get back to positive value, and then we have to sustain that positive value trend for a while just to get back to where we were with respect to performance per dollar spent.

And we could blame that on the current economy, except we started to see the value line degrade well before that -- when Turing came out, the prices started to edge up, but apart from RT/DLSS, rasterizing performance didn't move nearly as much. You could argue for the all-in value considering everything added, but you could also point to the number of gaming titles and such that supported it at the time and make a strong case that it didn't add much, if anything at all (and you can still make that case with a straight face) -- the added value of the new tech is very much dependent on your willingness to futureproof and believe that the tech would be utilized efficiently in the future. I'd also point out that during this time AMD deprioritized graphics, particularly high-end graphics, and that was very much a contributing factor: I don't hold either company blameless.

I would equate the value proposition of RT/DLSS/FSR to buying a game and hoping they fix it in the patches -- some people were happy with Cyberpunk when it shipped, some people think it's fine now after a few patches, and others are still waiting for it to be fixed: I see RT/DLSS in the same vein. Just like some people think RT/DLSS was totally worth it when Turing shipped, some people think there are enough titles out today to justify it, and others think it still isn't worth anything extra yet.

This is a huge part of what makes it difficult too. What are we comparing exactly?

My favorite example: the GeForce 3 series. I bought the fastest GPU on the planet, the GeForce 3 Ti500, for $349.

In total, that generation only got three models though: the GeForce 3 Ti200, the GeForce 3, and the GeForce 3 Ti500. The Ti500 was only ~37% faster than the Ti200, so 37% was the total performance range for the generation.

There are so many more entries in a generation these days, and the range from bottom card to top card is much larger.

What are the Ti200 and Ti500 the equivalents of in the 3000 series?

If the Ti200 is equivalent to a 3050, then the Ti500 is equivalent to something ~37% faster than that. The closest thing we have is the 3060.

If the 3050 and 3060 cards are the equivalent comparisons, then the pricing doesn't look terrible.

The GeForce 3 Ti200 cost $149 at launch in 2001. Adjusted for inflation, that's $236.
The GeForce 3 Ti500 cost $349 at launch in 2001. Adjusted for inflation, that's $553.

MSRP for the 3050 is $250 and the MSRP for the 3060 is $330. In that case, the entry card is about the same price, and the high-end card is actually much cheaper than it was back then.

If we do the other extreme instead, and consider the Ti500 to be the equivalent of the 3090, and pick a card that is ~37% slower than the 3090 as the equivalent of the Ti200, then we wind up comparing the 3090 and the 3070 Ti.

The 3090 is hugely more expensive at an MSRP of $1,499 vs. the inflation-adjusted $553 for the Ti500.
The 3070 Ti is also more expensive at $599 vs. the inflation-adjusted $236 for the Ti200.
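
Laying both mappings out side by side makes the contrast obvious. A rough sketch, assuming a flat ~1.584x multiplier for 2001-to-2021 inflation (consistent with the adjusted figures above):

```python
# Both mappings from the post, side by side. The 1.584x factor is an
# approximate 2001 -> late-2021 CPI multiplier, consistent with the
# inflation-adjusted figures above.
INFLATION_2001 = 1.584

gf3 = {"Ti200": 149, "Ti500": 349}  # 2001 launch MSRPs

mappings = {
    "bottom-anchored (Ti200 = 3050)": {"Ti200": ("RTX 3050", 249), "Ti500": ("RTX 3060", 330)},
    "top-anchored (Ti500 = 3090)":    {"Ti200": ("RTX 3070 Ti", 599), "Ti500": ("RTX 3090", 1499)},
}

for name, mapping in mappings.items():
    print(f"-- {name} --")
    for old_card, old_price in gf3.items():
        adjusted = old_price * INFLATION_2001          # 2001 MSRP in 2021 dollars
        new_card, new_price = mapping[old_card]
        delta = (new_price - adjusted) / adjusted * 100
        print(f"  {old_card} (${adjusted:.0f} adjusted) -> {new_card} (${new_price}): {delta:+.0f}%")
```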

The truth is that there is no gold standard or specification for what makes a high-end card or a low-end card in any generation. These things are defined by the results of each generation's silicon once it is taped out, how that compares against the previous gen, and then marketing gets involved and starts labeling things with names and prices. It's not completely arbitrary, but there is no set way to compare a card from generation to generation. Is the 2070 the appropriate thing to compare a 3070 against in the next generation? Who knows?

The fact that very few are successful in buying GPUs at MSRP this generation also adds complexity to the equation.

I think it is pretty clear that both MSRPs and actual prices of available product have gone up over the years, but the exact comparisons to make wind up being very complicated. There are no hard and fast answers. It's various shades of grey.
 
And we could blame that on the current economy, except we started to see the value line degrade well before that -- when Turing came out, the prices started to edge up, but apart from RT/DLSS, rasterizing performance didn't move nearly as much. You could argue for the all-in value considering everything added, but you could also point to the number of gaming titles and such that supported it at the time and make a strong case that it didn't add much, if anything at all (and you can still make that case with a straight face) -- the added value of the new tech is very much dependent on your willingness to futureproof and believe that the tech would be utilized efficiently in the future. I'd also point out that during this time AMD deprioritized graphics, particularly high-end graphics, and that was very much a contributing factor: I don't hold either company blameless.
Turing is an outlier, in my opinion. Putting raytracing on an aging node was Nvidia deciding to take the hit and crack the chicken-and-egg problem. Raster performance was obviously sacrificed for features that to this day aren't broadly useful on most of the cards that debuted with it. RT on most of those cards is definitely just not good enough, and I'd bet that most folks aren't using the tensor cores for much of anything since DLSS no longer needs them.

And really, all shaken out, DLSS is the feature that most sells RTX - because it's almost necessary for raytracing, and because it's just so very useful whether you're trying to push 4K to higher framerates or get reasonable performance out of a mobile GPU. Or even if you're trying to watch your power consumption / noise output, etc.


Thing is, DLSS, FSR as an alternative, RT... none of that would be mainstream today if Nvidia hadn't bitten that first bullet and put out what was otherwise a lackluster generational release with Turing.
 
Oh sh1t, an actual price drop!!! Wow, way to make me feel better about the 980 Ti's price, hahahahahaha!
Yeah, there you are seeing pressure from AMD and a good bit of the first mining bubble - AMD was selling pretty much every GCN card they could make. Amazing what competition can do.
 
Price scalping aside, if you ask me, the more egregious thing in the computer industry isn't GPUs but rather motherboard pricing. In 2015, the most expensive Z170 motherboard was $400 to $450. Now, we had HEDT boards upwards of around $500-$600, but you got a lot more for your money in terms of the platform. Today the most expensive mainstream motherboard is the ASUS ROG Maximus Z690 Extreme at $1,099. That's not getting into the limited-production offerings with factory monoblocks, which are understandably more expensive than that. However, the Maximus Z690 Extreme is still a mainstream part, and it's double the price of the most expensive part from 7 years ago.

I'm with you on this, but even your old 2015 comparison prices were a little bit on the crazy side.

It doesn't seem like so long ago that a typical motherboard cost you between $79 and $129. The most expensive one ever was probably ~$250 for an over-the-top board. As recently as 2010, a $100 motherboard was a pretty **** good motherboard.

When I bought my ASUS P9X79 WS in 2011 for $399, it was an absolutely unheard-of, ridiculously high price for a motherboard. I justified it based on the fact that it was a "workstation board" (whatever the hell that really means anymore) and that it would probably last me longer than ones I had had in the past. Between my $600 i7-3930K and $400 motherboard, I had buyer's remorse for months. I couldn't believe I had just dropped a grand on only a CPU and a motherboard, a combination that used to cost me ~$350.

Little did I know what was coming.

What I paid for my ASUS ROG Zenith II Extreme Alpha makes me cringe a little.

Now granted, they have been adding more features to them (most of which I don't want or need) and the chips they put on them have been going up in price, but still. The price increases in motherboards have been ridiculous and unwarranted.

It feels a lot like they added go-fast LEDs and racer-boy-styled, aggressive-looking, non-functional heatsinks (or at least ones with no more function than traditional heatsinks), and then cranked the pricing knob to 11.
 
A better indicator would be the entry-level card prices:

750 - $119
950 - $159
1050 - $105
1650 - $149
3050 - $249

You'd have to adjust those for inflation:
750 - $119 in Feb 2014 is $141.31 in Dec 2021
950 - $159 in Aug 2015 is $186.01 in Dec 2021
1050 - $105 in Oct 2016 is $121.10 in Dec 2021
1650 - $149 in Apr 2019 is $162.56 in Dec 2021
3050 - $249

So, the prices have still trended upwards, but not by as much as the raw numbers make it seem.
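
For anyone who wants to reproduce those adjustments, a quick sketch. The CPI-U index values are approximate BLS figures for each launch month (my assumption; they're close enough to land on the same numbers):

```python
# Reproducing the inflation adjustments above. CPI-U index values are
# approximate BLS figures for each launch month (assumptions, but close
# enough to reproduce the quoted numbers).
CPI = {
    "Feb 2014": 234.78, "Aug 2015": 238.32, "Oct 2016": 241.73,
    "Apr 2019": 255.55, "Dec 2021": 278.80,
}

cards = [  # (model, launch MSRP, launch month)
    ("GTX 750",  119, "Feb 2014"),
    ("GTX 950",  159, "Aug 2015"),
    ("GTX 1050", 105, "Oct 2016"),
    ("GTX 1650", 149, "Apr 2019"),
]

for model, msrp, month in cards:
    adjusted = msrp * CPI["Dec 2021"] / CPI[month]
    print(f"{model}: ${msrp} in {month} is ${adjusted:.2f} in Dec 2021 dollars")
```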

There are two outliers. First, there's an unusual dip with the 1050 in 2016. Was this maybe when the last crypto-crash happened?

And then there is the 3050, which even after adjusting for inflation is much higher than the rest. This is probably just Nvidia trying to cash in on some of the crazy pricing right now instead of giving it all to the AIBs, retailers, and scalpers.

In the end, the final market price is determined by supply and demand. Where Nvidia sets the MSRP just determines how big of a chunk of the profits they get vs. how big of a chunk goes to the downstream supply chain.
 