The First Batch of NVIDIA GeForce RTX 50-Series “Blackwell” GPUs Are Rumored to Use the Same 384-bit Bus as the RTX 4090

Peter_Brosdahl

The first batch of NVIDIA's forthcoming "Blackwell" GPUs is rumored to retain the same memory bus width as the company's current flagship card. This aligns with other rumors stating that while GDDR7 is on its way and will be used in some next-gen consumer graphics cards, the first iterations are not expected to utilize its maximum frequency potential, which could be as high as 36 Gbps versus the 24 Gbps of GDDR6X. While GDDR7 could at some point be used in 512-bit bus configurations, that would require new memory controllers, and if this latest rumor is true, NVIDIA may have chosen, for now, to stick with the older spec in order to get the first batch of Blackwell GPUs into production.
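For anyone curious about the napkin math behind those numbers: peak memory bandwidth is just bus width times per-pin data rate. A quick sketch (the 24 and 36 Gbps figures come from the rumors above; the helper function is purely illustrative):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s.

    bus_width_bits / 8 gives bytes transferred per cycle across the bus;
    multiplying by the per-pin data rate (Gbps) yields GB/s.
    """
    return bus_width_bits / 8 * data_rate_gbps

# 4090-style 384-bit bus with GDDR6X at its 24 Gbps ceiling:
print(peak_bandwidth_gbs(384, 24))  # 1152.0 GB/s
# Same 384-bit bus paired with GDDR7 at the rumored 36 Gbps max:
print(peak_bandwidth_gbs(384, 36))  # 1728.0 GB/s
# A hypothetical 512-bit GDDR7 configuration:
print(peak_bandwidth_gbs(512, 36))  # 2304.0 GB/s
```

So even on the same 384-bit bus, GDDR7 at full speed would be a 50% bandwidth jump; the 512-bit case is what would need the new memory controllers.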

See full article...
 
Looking forward to these. I'll take one with >=16GB of memory in FE configuration, thanks!

(cooler will promptly be removed in favor of a waterblock, most likely a Heatkiller from Watercool)
 
Honestly, I wouldn't settle for less than 24 GB today for AI tinkering. For gaming, yeah, 16 GB would be the minimum.
 
Honestly, I wouldn't settle for less than 24 GB today for AI tinkering. For gaming, yeah, 16 GB would be the minimum.
I haven't jumped off the 'AI' cliff just yet, but I agree I'd want quite a bit more if I did
 
Impressive, but the 4090 is the end of the line for me.
 
Impressive, but the 4090 is the end of the line for me.
Kinda thinking the same thing about my 3080. I mean, my 980 lasted me like 7 years….

I love building computers, but the incremental gain versus cost has tilted it so far away from the every-year event it used to be nearly three decades ago to … when something finally breaks.
 
Kinda thinking the same thing about my 3080. I mean, my 980 lasted me like 7 years….

I love building computers, but the incremental gain versus cost has tilted it so far away from the every-year event it used to be nearly three decades ago to … when something finally breaks.
Start tinkering with AI. It's a whole new stratosphere of performance needed. Compute actually serves a purpose again with it. And really once games start implementing AI that runs locally you'll be right there and ready for it. :)
 
Start tinkering with AI. It's a whole new stratosphere of performance needed. Compute actually serves a purpose again with it. And really once games start implementing AI that runs locally you'll be right there and ready for it. :)
I think we are a very long way from games using local AI for anything. It's just too unpredictable. I'd love it if I was able to have unconstrained conversations with NPCs in games. But the risk for controversy is too big for them to let that happen. There are too many snowflakes who can't differentiate between fiction and reality.
 
I love building computers but the incremental gain versus cost has just tilted it so far away from the every year event it used to be nearly three decades ago to … when something finally breaks.
My thing is I'm in my early 50's and find it hard to like many of the new games that come out. Most are set up for online only, which at this point I'm not into anymore. The 4090 should last me a while and when it can't keep up my interest in gaming will have dropped as well.
 
Yeah. If I had a use for it I would, but throwing hardware at it just for the sake of throwing hardware - I've chased that when I was younger, though it was other things, and it just isn't for me.
I mean, I get it. AI is going to have a big impact once people learn how to properly implement and teach it. (Myself included.) You know, right up until it takes over and overfunds people in order to push them into wildly inappropriate behavior that leads them to an early death, thinning the population with nobody complaining about it.
 