Latest Rumors Suggest That NVIDIA Is Delaying Its RTX 50 SUPER Series Indefinitely, and the RTX 60 Series May Not Arrive Until Mid-2027

Unfortunately, it makes sense. Prices are surely going to skyrocket soon thanks to memory shortages. This would have been the perfect window to release the SUPERs if not for the current situation. We may not even see the RTX 60 series until late 2027 at best.
 
Yeah, it wouldn't surprise me to see Super get passed over this time around. RTX 5 series cards are still plenty powerful, their chip cuts don't really leave a lot of room for compute improvements, and there is no universe in which consumer cards are getting higher-RAM variants right now.

Not the worst thing in the world if no refresh leads to slowing sales, which leads to discounts and dropping prices.

Frankly I think we would be better served with lower prices for existing performance over 5% more performance for the same (or more) money right now.
 
Sad part is that unless RAM prices go down, which is not happening anytime soon, we'll have the worst scenario: no improvement, plus price increases. At some point, NVIDIA is going to have to raise prices on the current product stack because it can't obtain VRAM for them indefinitely at whatever it's been paying. I'm amazed nobody else is talking about this, and is instead downplaying the factors that may keep these cards from launching at all.
 
I agree.

By the time AMD is ready, who knows how high VRAM costs will be, and now that NV has their fingers in Intel, the likelihood they'll ever make something that can compete is getting closer to nil.

May not be this year or next, but it sure feels like the writing on the wall is getting closer and closer.
 
I think the greater risk is Nvidia abandoning the end-user market, stopping driver development cold, and developing only for cloud solutions.

Nah, consumer is a "small" chunk of their revenue, but that small chunk is still like ~$4B or something a quarter.

All these guys know the AI thing is going to cool, alternate chip designs or AI algos are going to reduce compute requirements, etc.

They're going to keep gaming in the fridge for sure, but not in the trash.
 
Nah, consumer is a "small" chunk of their revenue, but that small chunk is still like ~$4B or something a quarter.

All these guys know the AI thing is going to cool, alternate chip designs or AI algos are going to reduce compute requirements, etc.

They're going to keep gaming in the fridge for sure, but not in the trash.
I hope you are correct. But let's be honest. Let's say they bottom out gaming production, keep making drivers, and trickle out product as they see fit. Gamers will still buy whatever they make when/if they decide to open the taps again.

Hell, if I could get a 5090 right now for $200-300 above MSRP, I would. (Meaning $2,200-2,300.)
 
Hell, if I could get a 5090 right now for $200-300 above MSRP, I would. (Meaning $2,200-2,300.)
Pretty much what I paid for my first one when I got it as part of a boutique PC that became my launch point for getting into AM5. That rig was maybe $100-200 over the cost of its parts, which also included a 9800X3D and 32 GB of DDR5 6000 MT/s. A month or so later I managed to trade in an EVGA 3090 Ti to Newegg and paid under MSRP (~$1,900) for a second one. I do count myself lucky in both instances.
 
I see the end of gaming GPUs.
Why spend time and resources when you simply can't keep up with AI demand?
A couple billion is pocket change to a trillion-dollar company.
Memory is all going to AI, which further complicates the issue. No DDR sticks, no NVMe, no VRAM. So might as well buy a PlayStation before those cost $1,000 too.
It was fun while it lasted, but NVIDIA has moved on.
I say this regrettably, because I review video cards. I've probably seen my last new GPU until 2027... if then... (This opinion is solely my own and not affiliated with FPSReview in any form.)
 
That's pretty much what I'm thinking is happening as well. I wish it were not the case, but too many factors are pointing in that direction, and that's why I went all in on the current flagship offerings. They ought to be good enough for more than a few years to come, and by then I'm pretty sure my gaming interests will have waned further as age sets in.
 
I really don't see the end of gaming GPUs. The revenue is 'small' by comparison, but Nvidia's costs to keep it going are relatively small, too.

Note also that 'desktop' computing includes content creation stuff, for which the gaming GPUs also excel, and a market where Nvidia has a commanding lead just due to driver integrations.

Imagine AMD giving up on Radeon just because their desktop market share has been pummeled almost to irrelevance... AMD isn't giving up the fight.



Instead, while we're certainly in an AI-bubble driven DRAM drought for a year or so, I see AMD and Nvidia both looking for more fab partners. If/when Intel gets their manufacturing house in order, we could easily see Intel fabbing dies for either or both companies, and obviously Samsung is likely to make a return at least on the Nvidia side.

And maybe the spin-up of PRC DDR5 production will help alleviate some of the global DRAM demand; if the quality checks out, it will have no end of customer orders, at least domestically.
 
When the AI market cools, it will be interesting to see if we end up with a supply glut and cratering prices too.

One can dream.
 
The real future of desktop gaming will be SoC solutions from Apple and AMD.
I ain't gonna lie, that's definitely going to be part of it. I'd only contest 'how much'. SoCs are expensive, especially if you want to invest in both GPU horsepower and memory bandwidth.

When the AI market cools, it will be interesting to see if we end up with a supply glut and cratering prices too.

One can dream.
It's definitely a dream!

Unfortunately, DRAM manufacturers are a cartel, and they have been found guilty of price fixing repeatedly. GPU prices might stagnate a bit, but I don't think producers are in a hurry to get DRAM back to where it was before, and the effect of this demand spike is going to have a long economic tail.
 
I really don't see the end of gaming GPUs. The revenue is 'small' by comparison, but Nvidia's costs to keep it going are relatively small, too.
The issue isn't development time, costs, or support.

If you have a finite amount of resources (in this case, fab capacity and RAM), and you can make XX% margin on one product but XX% times five on another…

the first product will just bleed a slow death until you can get everyone some acceptable alternative that somehow leverages the second…

And here it’s cloud-based datacenter gaming using datacenter grade equipment.
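
To put rough numbers on that, here's a minimal back-of-the-envelope sketch of the opportunity cost, assuming purely hypothetical per-wafer profits, wafer counts, and allocation split (none of these figures come from NVIDIA or from this thread):

```python
# Back-of-the-envelope look at the opportunity cost of building consumer GPUs
# when fab/RAM supply is fixed. All numbers below are made up for illustration.

wafers_available = 1_000                      # hypothetical fixed fab allocation
consumer_profit_per_wafer = 50_000            # hypothetical $ profit per wafer of gaming dies
datacenter_profit_per_wafer = consumer_profit_per_wafer * 5   # the "XX% times 5" case

# Option A: every wafer goes to datacenter parts
all_datacenter = wafers_available * datacenter_profit_per_wafer

# Option B: divert 20% of wafers to keep gaming supplied (also hypothetical)
gaming_share = 0.2
mixed = (wafers_available * (1 - gaming_share) * datacenter_profit_per_wafer
         + wafers_available * gaming_share * consumer_profit_per_wafer)

print(f"All datacenter:                   ${all_datacenter:,.0f}")
print(f"80/20 datacenter/gaming split:    ${mixed:,.0f}")
print(f"Profit given up to supply gaming: ${all_datacenter - mixed:,.0f}")
```

However you pick the numbers, as long as that datacenter multiple holds, every wafer routed to gaming looks like money left on the table.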
 
And here it’s cloud-based datacenter gaming using datacenter grade equipment.
I can see some people accepting this. It's 'good enough' to experience higher-end games, and it's portable. I could see myself using it if I ever found myself gaming on the go, since I'm certainly never buying another 'gaming' laptop.

But I don't think that this will even begin to replace local hardware for gaming. Augment, expand the market? Sure!
 
I know it's really a major dumbing-down, but we could see the equivalent of the Atari E.T. crash followed by a future Nintendo/16-bit-style revival out of all of this. By the time the AI craze cools down, someone could step into the vacuum these idiots created and fill the space they left behind.
 