NVIDIA GeForce RTX 40 Series Graphics Cards Could Consume Up to 850 Watts of Power, It’s Claimed

Tsing
The FPS Review Staff member
Joined: May 6, 2019 · Messages: 12,595 · Points: 113
[Image: NVIDIA GeForce RTX 30 Series graphics cards lined up. Credit: NVIDIA]



The rumor mill has begun turning again with purported new insight into NVIDIA’s next-generation graphics cards, the GeForce RTX 40 Series. This time, the rumors indicate that the new family, based on the green team’s Lovelace architecture, could feature models that consume more power than anyone might have imagined.



According to a tweet shared by leaker Greymon55 today, NVIDIA GeForce RTX 40 Series graphics cards that leverage the flagship AD102 GPU currently feature TGP ranges of 450 watts, 650 watts, and 850 watts, substantial increases over the GeForce RTX 3090’s 350-watt TGP. Greymon55 clarified that these specifications aren’t final, but higher power consumption in next-generation GPUs seems like somewhat of a sure thing based on other incoming advancements such as PCIe Gen 5 power supplies, which can deliver up...

Continue reading...


 
LTT has a great video on the next-gen chips: 75% improved performance at half the power consumption, though only for very specific types of tasks. The core chips are similar, but built on a 7nm process as opposed to the 8nm one from Samsung. The next-gen chips should be on TSMC's 5nm process, which might give us even more of a performance improvement.
 
That doesn't seem practical.

I'm already pulling ~850W from the wall with a Threadripper and a 6900xt with a power limit upped to 430W.

Add another 410W to that and my excellent 1200W Seasonic Prime Platinum PSU won't even be enough.

I'd be heating the **** room, and drinking power like a space heater...

I've always been all for extreme performance, but this seems a little much.
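For what it's worth, the napkin math in that post can be sketched out. This is just a rough sanity check using the poster's own numbers (note that PSU ratings are DC-side while wall draw includes conversion losses, so the real margin would be slightly better, but still tight):

```python
# Rough PSU headroom check, using the figures from the post above.
psu_rating_w = 1200      # Seasonic Prime Platinum 1200 W (DC output rating)
current_draw_w = 850     # current draw measured at the wall
extra_gpu_w = 410        # the "another 410 W" jump the poster describes

projected_w = current_draw_w + extra_gpu_w
print(projected_w)                    # 1260
print(projected_w <= psu_rating_w)    # False: over the PSU's rating
```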
 
That doesn't seem practical.
Yeah, it sounds great on paper to the "performance at any cost" crowd, but the practicalities of anything burning this much power outside of some specially constructed venue (like a data center) are going to be way beyond what any consumer is expecting.

That, and 850W, even in a triple slot, is a really small form factor for that much heat. I mean, think about it: a lot of us folks here are using full-sized cases for computers whose entire power draw is 850W, and still complaining that it is hardly enough. Now cram all of that into a triple-slot PCI space. And sure, you ~could~ go quad, but you're already looking at mechanical issues with triple-slot cards given their weight and size.

Every single newegg review will be "It's huge and doesn't fit in my mini case. It runs hot!!! It heats up my room like a space heater!!!"
 
And sure, you ~could~ go quad, but you're already looking at mechanical issues with triple-slot cards given their weight and size.
Support brackets are a thing, for just this reason.

I wound up digging into the problem because the first 5700XT I tried to use for case reviews was physically warped. It didn't even sit right when the case was on its side - lots of display corruption, and the drivers wouldn't load. It would, however, work when held 'up', with pressure pushing it in the direction of the CPU socket.

Bigger issue I think is that there's just been no progress on figuring out how to cool cards with that level of power draw quietly. I'm surprised at how quiet my 3080 FTW3 12GB is, and it can handle 450W - it doesn't even look that big in my Lian Li O11 XL. But at 450W, we're already beyond what your average CLC can handle. You'd need something thicker and made of copper, and with a full-coverage block. You'd probably also want the pump in the radiator to maximize mounting options.

And then you'd need a pile of disclaimers relating to mounting issues!
 
Yeah, it's going to be interesting when these things do hit the market. If this pans out as true, then it's like a finger to the PSU manufacturers who just started rolling out Gen 5 models with only one slot. A card like this will need either 2x Gen 5 plugs or an 8-pin/Gen 5 combo, since a Gen 5 connector is only rated up to 600W. Kind of ironic, since NV was the first to tout the new standard on its cards.

On that note, the 16-pin Gen 5 plug will be a must so the system can work with the PSU to help control the power draw. Otherwise, just imagine what that poor PSU will go through when the card goes from idle to full throttle while gaming and everything in between.
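The connector budget behind that comment can be sketched out. This uses the standard ratings (75 W from the PCIe slot, 600 W max for the 16-pin Gen 5 / 12VHPWR connector, 150 W for a classic PCIe 8-pin); interestingly, by this math a single 8-pin/Gen 5 combo lands at 825 W, still just short of 850 W:

```python
# Power-delivery budget for a hypothetical 850 W card.
SLOT_W = 75          # PCIe x16 slot power limit
GEN5_16PIN_W = 600   # 16-pin (12VHPWR) connector max rating
EIGHT_PIN_W = 150    # classic PCIe 8-pin connector rating

card_tgp_w = 850
single_16pin = SLOT_W + GEN5_16PIN_W                 # 675 W
combo = SLOT_W + GEN5_16PIN_W + EIGHT_PIN_W          # 825 W
dual_16pin = SLOT_W + 2 * GEN5_16PIN_W               # 1275 W

print(single_16pin >= card_tgp_w)   # False: one plug isn't enough
print(combo >= card_tgp_w)          # False: still 25 W short
print(dual_16pin >= card_tgp_w)     # True: dual Gen 5 covers it
```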
 
I'm still thinking that we're unlikely to see anything like 850W. It's hard to imagine a GPU that could use that much power, especially since the next round is broadly expected to be produced on TSMC 5nm.
 
I'm still thinking that we're unlikely to see anything like 850W. It's hard to imagine a GPU that could use that much power, especially since the next round is broadly expected to be produced on TSMC 5nm.
Oh, I don't think it would be that hard to imagine. Crank the clocks up on (pick your favorite GPU) and you can get some eye-popping numbers, even with architectures and form factors designed to get nowhere close to that TDP. Here's an article about a 3090 hitting 630W, and there may have been more in the tank; the cooler just couldn't keep up. Granted, it didn't help performance much, as you're hitting a wall with regard to the architecture and process node there, but still.

If NVIDIA (or anyone, really) wanted to hit 850W+, it wouldn't be hard at all to do. The harder parts would be making efficient use of that power and then putting it in a consumer-friendly package. Those are entirely different engineering challenges, and either of them is probably much larger than just arbitrarily turning up the clocks on a big die until you hit some power draw.
 
I guess I should clarify - yeah, I could see a GPU being cranked to that level of power draw, well out of its efficiency zone and perhaps losing performance.

It's more that there isn't much use for it, not at least until we start getting chiplet designs going, which doesn't seem to be a feature of the 40-series or even necessary just yet.
 
Okay, let's be serious now.

They're gonna include a phase change system with them.
Yeah, you almost would have to. Even an open-loop system will need some beefy radiators and fans to handle that much. A lot of folks feel uncomfortable running 200W on a 360mm radiator.

I stole this graph from EKWB; it shows wattage (thermal performance) as a function of fan speed and radiator type/size. The top-performing radiator does approach 750W in this graph, but you are at full tilt on some nice fans there too.

[Image: EKWB radiator performance chart (EK_Radiator_Performance_Chart_3.png)]
 
Maybe I was absent that day in college, but what is the unit shown on the vertical scale (W/10K) ?
 
Maybe I was absent that day in college, but what is the unit shown on the vertical scale (W/10K) ?
EKWB explains in the article that was referenced:

The fundamental rule of radiator performance testing is to see how well the radiator cools the coolant. So for us, the most widespread way of describing radiator performance is by using W/10°C, or in other words, Watts per 10 Delta T (sometimes K is used instead of ΔT). This relation tells you how many watts the radiator can dissipate when the coolant temperature rises 10°C above the ambient temperature. If you are not familiar with the ΔT term, take a look at the previous “What is Delta T“ blog.

Basically, they are measuring how much thermal energy a given radiator at a given fan speed can dissipate before it can no longer maintain a 10°C temperature rise in the coolant. That gives them a standard of measurement to make heat rejection comparisons.

You could make a case that the radiators would dissipate even more heat if you allowed the ΔT (or K, as they refer to it - presumably Kelvin, which for a temperature difference is identical to degrees centigrade) to increase beyond 10°C, and they definitely would: radiators get more efficient as you allow the coolant temperature to rise further above the ambient air temperature. But you also have to make sure the return coolant is cool enough that the water blocks can sufficiently cool the equipment (they operate under the same principle as a radiator, after all; you're just looking at the temperature of the die underneath the block instead of ambient air).

10°C is a pretty good standard to shoot for in my opinion. This is also using a standard pump and block configuration, so the loop flow is held constant - that would be another variable that would affect this beyond just fan speed.
 
For sure. At this rate, enthusiasts will need to double-check the wiring for a room, and the fuse box, before setting their rig up. Getting a UPS for all that is going to add to the costs as well, and if you're like me and have a 1000W receiver plus a 65" TV in there also, it only gets even crazier.
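The wiring concern is easy to put numbers on. A rough sketch, assuming a typical North American 15 A / 120 V branch circuit and the NEC convention of treating 80% of the breaker rating as the continuous-load limit (the receiver and TV draws are rough assumptions; a 1000 W-rated receiver rarely pulls that in practice):

```python
# Branch-circuit sanity check for an 850 W-GPU rig sharing a room circuit.
BREAKER_A = 15
VOLTS = 120
continuous_limit_w = BREAKER_A * VOLTS * 0.8   # 1440 W (NEC 80% rule)

rig_w = 1260        # 850 W current draw + 410 W more, per an earlier post
receiver_w = 1000   # receiver's rated draw (worst case, assumption)
tv_w = 150          # rough draw for a 65" TV (assumption)

total_w = rig_w + receiver_w + tv_w
print(total_w)                         # 2410
print(total_w <= continuous_limit_w)   # False: needs a second circuit
```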
 
This just doesn't sound practical. CPU wattage is going up, especially on the Intel front, and then there's massive wattage on the GPU side.
It's not just Intel - and it's been going up for a while. It's more a matter of just how inefficient you're willing to go, as you can make higher-end CPUs from both AMD and Intel suck down some juice!
 