NVIDIA GeForce RTX 40 Series Flagship Rumored to Feature 18,176 Cores, 48 GB (24 Gbps) Memory, and 800-Watt TBP

Tsing (The FPS Review staff)
NVIDIA may be prepping an especially monstrous product for PC gamers.

 
I caught a JayzTwoCents video in which he said that, according to one of his sources, these cards could be delayed. This would not surprise me. Also, 800 W is starting to get into space-heater territory.
 
That is truly bonkers. We need someone to make a BIG efficiency leap.
 
That is truly bonkers. We need someone to make a BIG efficiency leap.
I think neither major camp has been ignoring efficiency. It's just that we're in that weird spot where the nodes aren't jumping as quickly as they used to, so the manufacturing side delivers smaller gains than it once did.

That said, the competition hasn't stopped, and our appetite for progress has only gotten larger.

So the only avenue left is to stack on more cores and crank up the power. Eventually that too will hit limits - monolithic chips will get too big, multichip module interposers will get too complicated, data transfer speeds will eventually start to cap things off... something.

But yeah, I'm pretty sure that if you were to look at stock designs over the last few generations, efficiency for a given level of performance has gone up - often by a good bit. It's just that performance alone hasn't risen as fast as our expectations, so they throw on more cores and push the power limits.
 
That is truly bonkers. We need someone to make a BIG efficiency leap.
They get more efficient most generations. What we're looking at is not an efficiency regression, in the sense of getting lower performance per watt, but rather an increase in the maximum power-draw limits that accompanies some of the performance gains.

Rather than an efficiency regression - or even a slowdown in the rate of efficiency improvement - we're just seeing higher-performance options become available for those willing to foot the cost of purchasing and running them.

For those worried about efficiency or just energy thriftiness, lower-powered models are certainly available :)
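To put rough numbers on that distinction (the figures below are made up purely to illustrate the shape of the argument, not taken from any real card):

```python
# Hypothetical generation-over-generation numbers, for illustration only:
# efficiency (fps per watt) can improve even while the power ceiling rises.
old_fps, old_watts = 100, 300
new_fps, new_watts = 160, 450

print(f"old: {old_fps / old_watts:.3f} fps/W")  # 0.333 fps/W
print(f"new: {new_fps / new_watts:.3f} fps/W")  # 0.356 fps/W

# More performance per watt overall, but the flagship now draws 150 W
# more at its limit - a higher ceiling, not worse efficiency.
```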
 
GeForce RTX 4090? Titan?
Great way to turn those unused hydronic baseboard heaters into a central GPU heating system. Not terribly efficient, but MOAR FPS!
 
GeForce RTX 4090? Titan?
Great way to turn those unused hydronic baseboard heaters into a central GPU heating system. Not terribly efficient, but MOAR FPS!
All of this has happened before, and all of it will probably happen again - with the improvements in both wired and wireless interconnects, a move back toward centralized big iron starts making sense for those that need on-site computing power. Even at home!

And why not integrate it into HVAC? If there's a heating need but not a compute need - sell the compute!
 
All of this has happened before, and all of it will probably happen again - with the improvements in both wired and wireless interconnects, a move back toward centralized big iron starts making sense for those that need on-site computing power. Even at home!

And why not integrate it into HVAC? If there's a heating need but not a compute need - sell the compute!
I sure hope there's an implicit smiley face in there. Sometimes it's hard to tell. ;)
 
GeForce RTX 4090? Titan?
They need to bring the Titan branding back. It was never cheap, which should alleviate some of the confusion about pricing and performance. I know the x90 cards that predated them were halo products unto themselves and used to feature two GPUs on the same PCB, but the Titan nomenclature always separated it from the pack. The whole stack is just so confusing now between x80, x90, Ti, and Super, and then the absence of the Titan. From Maxwell to now, NVIDIA has really gotten messy with its branding, especially when the refresh cycles kick in toward the end of a product generation.
 
They need to bring the Titan branding back.
This I agree with; it would have made much more sense for the 3090 / 3090 Ti, and it would have highlighted the main driver for the product - the extra VRAM for content creators. Though I do get the 8K angle from a marketing perspective. I don't agree with it, but it was valid.
The whole stack is just so confusing now between x80, x90, Ti, and Super, and then the absence of the Titan. From Maxwell to now, NVIDIA has really gotten messy with its branding, especially when the refresh cycles kick in toward the end of a product generation.
Makes it confusing for reviewers and consumers - as well as for competitors and competitor marketing teams.

Not sure if that really works for Nvidia, but it does allow for some flexibility for the marketing / branding folks when dealing with competition, reception, and production issues.
 
Nope - serious, if not somewhat far-fetched at the moment.
Not really - there were companies making 'heaters' for offices that were nothing but compute engines. They would pay you to host them, so in colder climates you'd get the heat effectively for free. They even looked like fancy wall-mounted radiators. Not sure if it ever went anywhere. Let's see if my Google-fu can find them...

Nope, I can't find it, but it was a thing... dang.
 
They get more efficient most generations. What we're looking at is not an efficiency regression, in the sense of getting lower performance per watt, but rather an increase in the maximum power-draw limits that accompanies some of the performance gains.

Rather than an efficiency regression - or even a slowdown in the rate of efficiency improvement - we're just seeing higher-performance options become available for those willing to foot the cost of purchasing and running them.

For those worried about efficiency or just energy thriftiness, lower-powered models are certainly available :)
Look, I HAVE the power to spare... it just seems like we need some gains in efficiency. Otherwise the power issue is going to go off the rails... if it hasn't already.

Like we need someone - be it AMD or Nvidia or, heaven forbid... Intel - to come along and say:

Here is our newest top-tier video card. With the current BIOS it will run every triple-A game, even Cyberpunk with RT, at 4K at a 100 Hz refresh. This is a limit we put on the card by design. You can go into the video card BIOS and unlock it for your own purposes, but the design target of this card is to play any current game at 100 Hz. Oh, and it will only use 250 W TDP to do it.

All while sporting a MASSIVE heat sink and power chokes / delivery clearly able to go far beyond that.

Someone needs to draw a line in the sand... and really, I think targeting that value would be appreciated, especially if you can unlock more power as more demanding games come along. Maybe, oooh, a clock profile PER big game. Hmm... yeah, I like that.
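Something in that spirit is already possible today, at least for the power-cap half of the idea: NVML exposes a software power limit (the same one nvidia-smi uses). Below is a minimal sketch using the pynvml Python bindings; the per-game wattage table and process names are made up for illustration, and actually setting the limit typically requires admin rights and a card/driver that allows it.

```python
# Sketch: per-game GPU power profiles via NVML (pynvml bindings).
# The PROFILES_MW table is hypothetical; values are in milliwatts.
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimitConstraints,
    nvmlDeviceSetPowerManagementLimit,
)

PROFILES_MW = {
    "cyberpunk2077.exe": 250_000,   # the 250 W "line in the sand"
    "some_indie_game.exe": 150_000,
}

def apply_profile(game: str, default_mw: int = 250_000) -> None:
    nvmlInit()
    try:
        handle = nvmlDeviceGetHandleByIndex(0)  # first GPU
        # Clamp the requested cap to what the board actually supports.
        lo, hi = nvmlDeviceGetPowerManagementLimitConstraints(handle)
        target = min(max(PROFILES_MW.get(game, default_mw), lo), hi)
        nvmlDeviceSetPowerManagementLimit(handle, target)  # needs admin
        print(f"{game}: capped at {target // 1000} W")
    finally:
        nvmlShutdown()

if __name__ == "__main__":
    apply_profile("cyberpunk2077.exe")
```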
 
I sure hope there's an implicit smiley face in there. Sometimes it's hard to tell. ;)
I remember about three or four years ago someone was selling a crypto-mining space heater. It was just a headless PC for the most part - you connected it to Wi-Fi, and it had a thermostat that would start/stop computations around whatever temperature you set.

I should see if I can find it again.
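If you wanted to roll your own, the control logic is just a thermostat with hysteresis wrapped around the workload. A minimal sketch, with stub functions standing in for the real temperature sensor and mining process:

```python
import random
import time

# Stand-ins for the real hardware hooks; hypothetical, for illustration.
def read_temp_c() -> float:
    return 20.0 + random.random() * 2.0  # pretend room-temperature sensor

def set_compute(on: bool) -> None:
    print("workload on" if on else "workload off")  # start/stop the miner

SETPOINT_C = 21.0   # the temperature the user dials in
HYSTERESIS_C = 0.5  # dead band so it doesn't rapidly toggle

running = False
while True:
    temp = read_temp_c()
    if not running and temp < SETPOINT_C - HYSTERESIS_C:
        set_compute(True)    # too cold: heat the room by computing
        running = True
    elif running and temp > SETPOINT_C + HYSTERESIS_C:
        set_compute(False)   # warm enough: stop burning watts
        running = False
    time.sleep(30)
```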
 
Look, I HAVE the power to spare... it just seems like we need some gains in efficiency. Otherwise the power issue is going to go off the rails... if it hasn't already.
I think one of two things will happen first.

First off, there is a hard limit on how much power most people can pull from the wall without hiring an electrician. Most typical US homes run their bedrooms / offices on 15 A, 120 V circuits. Going beyond that will be a big leap - it would require a disclaimer that you can't just plug the system into any old outlet and have it run - so I think, at least for consumer purposes, that represents a ceiling.
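The back-of-the-envelope math makes that ceiling concrete (this assumes the common US practice of derating a circuit to 80% of its rating for continuous loads):

```python
# Quick circuit-headroom math for a standard US 15 A / 120 V circuit.
circuit_watts = 15 * 120                 # 1800 W at the breaker
continuous_watts = circuit_watts * 0.8   # ~1440 W sustained (80% rule)

system_watts = 800 + 200 + 250           # GPU + CPU + monitor, from above
headroom = continuous_watts - system_watts
print(f"{headroom:.0f} W left")          # ~190 W for everything else
```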

The second will be the heat / noise that a consumer will be willing to put up with. An 800 W card, together with a CPU cranking out 100-200 W and a 150-250 W monitor, plus anything else in the room... you have what amounts to a space heater. A card like that will either be very large or very noisy, and it's going to throw off a lot of heat. Enthusiasts may put up with it just so they can post those sweet screenshots of crazy benchmark numbers, but *most* consumers won't fool with it.

Just like most consumers don't drive exotic sports cars, even among those who could afford them - but we all like to drool over them.
 
I think one of two things will happen first.

First off, there is a hard limit on how much power most people can pull from the wall without hiring an electrician. Most typical US homes run their bedrooms / offices on 15 A, 120 V circuits. Going beyond that will be a big leap - it would require a disclaimer that you can't just plug the system into any old outlet and have it run - so I think, at least for consumer purposes, that represents a ceiling.

The second will be the heat / noise that a consumer will be willing to put up with. An 800 W card, together with a CPU cranking out 100-200 W and a 150-250 W monitor, plus anything else in the room... you have what amounts to a space heater. A card like that will either be very large or very noisy, and it's going to throw off a lot of heat. Enthusiasts may put up with it just so they can post those sweet screenshots of crazy benchmark numbers, but *most* consumers won't fool with it.

Just like most consumers don't drive exotic sports cars, even among those who could afford them - but we all like to drool over them.

All of what you say is true, but on the flip side, MOST consumers won't be buying the flagship card this story is reporting on.

I presume their more mid-level parts will have more conventional power draws.
 