NVIDIA Reiterates 750-W PSU Requirement for GeForce RTX 3090 and GeForce RTX 3080

Tsing
The FPS Review, Staff member

Images: NVIDIA



Prospective GeForce RTX 30 Series owners may need to spend additional money on a bigger power supply. NVIDIA has released an infographic confirming that a 750-watt PSU is “required” for its 8K monster, the GeForce RTX 3090, as well as its newest flagship, the GeForce RTX 3080. The fine print suggests that users may be able to get away with a weaker PSU, but a 750-W power supply seems like the way to go for peace of mind.



The infographic also gives us a pretty clear idea of how much space these GPUs will take up. The meaty GeForce RTX 3090 will take up three expansion slots and require 12.3″ (313 mm) x 5.4″ (138 mm) of clearance, while the GeForce RTX 3080 will take up two expansion slots and require 11.2″ (285 mm) x 4.4″ (112...

Continue reading...


 
That seems kinda steep. I've been running a 750 watt PSU for many years now (Corsair, EVGA G2, G3) but not out of necessity. Even when I was running an overclocked Bulldozer and AMD 290X video card the PSU fan hardly ever even turned on. Guess we'll have to wait and see when the reviews start dropping but I'll be surprised if a 750 watt PSU is actually "required" for a single 3080. But I've been wrong before.
 
It depends on the rest of the system. If the 3090 is 350 W stock, an overclocked card will likely be able to draw 20-30% more, so the GPU alone could be pulling up to ~450 W in that scenario.
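The overclock math above can be sketched out quickly. A minimal back-of-the-envelope calculation, assuming the rumored 350 W stock figure and the 20-30% headroom guess from this thread (none of these are confirmed spec values):

```python
# Back-of-the-envelope GPU power estimate, using the numbers from this thread.
# The stock wattage and OC headroom are assumptions, not measured values.

STOCK_GPU_W = 350    # rumored RTX 3090 stock board power
OC_HEADROOM = 0.30   # assume ~20-30% extra draw when overclocked

def oc_gpu_draw(stock_w, headroom=OC_HEADROOM):
    """Estimated GPU draw (watts) with an overclock applied."""
    return round(stock_w * (1 + headroom))

print(oc_gpu_draw(STOCK_GPU_W))        # -> 455, in line with the ~450 W figure above
print(oc_gpu_draw(STOCK_GPU_W, 0.2))   # -> 420, the low end of the headroom guess
```

Drop in your own card's board power and a headroom estimate to get a rough worst-case number for PSU sizing.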
 
Not a problem for me, with a Seasonic Prime Platinum 1200W. I know it's overkill but I just couldn't pass on the deal I got on it. I think it was $130 brand new.
Ha! I went the other way recently. My Corsair HX850 was giving me issues in the past year, so I downsized to a Seasonic 750W unit that was on sale. With my overclocked 9900K and 2080 Ti it's pulling around 550W from the wall while gaming. Add another 100W from the 3090 and I'll be starting to sweat. Starting to think I should have stayed with 850W on the new unit...
 
Keep in mind that folks here probably know a quality PSU from a POS.

nVidia is dealing with the general population - most of whom don't know a PSU from ... well, pick a noun.

So they have to pick a safe wattage that a typical, run-of-the-mill PSU can safely support, also assuming some standard PC equipment, and that's before taking into account all the auto-one-click overclocking that can occur today. That 750 W should be a very, very conservative number with a good deal of headroom when you're looking at a stock card in a vanilla PC with no OCs anywhere in the system, but you know there are idiots out there who are going to stick one of these in a Packard Bell with a 276 W stock PSU that requires four PSU adapter cables and wonder why it doesn't work right.

Start throwing OCs in there though, and all those wattage numbers go right out the window.
 
I haven't heard it as much as I used to, but I used to read that when picking a PSU it was best to double up, or at least go higher than what you planned to use. Expecting your rig to pull around 500 W+? Then get 750 or more. Back then it was because PSUs operate at their optimal efficiency at around half load. I'm not sure if that's still true, though. In either case, I'm glad I don't have anything less than 1000 W in the house, and all are either 80+ Gold or Platinum rated Corsairs (not the best, but they've been good for me).

Not too sure about what the 3080 will do, but if it draws more than a heavily OC'd 2080 Ti, coupled with a 6-, 8-, or more-core processor that is also running at high clock speeds, I can easily see a rig pulling over 500 W. I think their recommendation is probably spot on for the 3090, since most people are not going to drop $1,500-$2,000 on a GPU and put it in a low-speed/low-powered rig that will probably bottleneck it.
 
The theory is that power supplies operate most efficiently within a certain load range. While that might be true on paper, in reality the efficiency of a quality power supply doesn't fall off much on either side of the peak, which is around 70-80% load. As with memory, unused capacity is basically wasted, but it doesn't hurt to have a buffer or reserves.
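The "buy headroom" rule of thumb from the posts above can be sketched as a quick check: does your expected system draw land in the PSU's efficient load band? The 50-80% band here is an assumption based on the discussion, not a published efficiency curve, and the example wattages are the rough figures mentioned earlier in the thread:

```python
# Rough PSU sizing check based on the rule of thumb discussed above:
# pick a PSU so the expected system draw falls in its efficient load band.
# The 0.5-0.8 band is an assumption, not a measured efficiency curve.

def psu_in_sweet_spot(psu_watts, system_draw_w, band=(0.5, 0.8)):
    """True if the expected draw falls inside the PSU's efficient load range."""
    load = system_draw_w / psu_watts
    return band[0] <= load <= band[1]

# A ~550 W gaming load (the 9900K + 2080 Ti figure mentioned earlier):
print(psu_in_sweet_spot(750, 550))    # ~73% load -> True
print(psu_in_sweet_spot(1200, 550))   # ~46% load -> False, below the band
```

As the reply notes, falling below the band costs very little efficiency on a quality unit, so the check is best read as "enough headroom" rather than "too big."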
 
Ooofs, this generation is really putting the hurt on the extreme ITX system crowd... I can see myself getting a 3090 in my system if I dropped my OC a tier or two, didn't OC the 3090, and got lucky that they make one that even fits.
 
Honestly, I'd just get a 3080 for an ITX system; it's good enough for years to come. And if it suddenly becomes obsolete, that would make the 3090 almost as obsolete as well.
 
Logically you're right, but I typically act quite irrationally when upgrading my GPU!
4K144 with the 3090 seems rather possible if everything pans out.
I'd wager we'll likely see a game like RDR2 running at 4K 70-80 FPS on a 3090, which would be a nice experience.
 