World’s Fastest Gaming CPU Could Be Holding Back the NVIDIA RTX 5090 at 4K

All very interesting.

Anyone have a spare $2k and an extra... hmm, how much would my energy bill go up, 7900 XTX to 5090?

I asked Copilot. Not sure it got the idle and light to medium power usage right, though. Anyone have a better model? Or perhaps a better question (that makes me think of The Hitchhiker's Guide to the Galaxy).

Asking Copilot:

what is the tdp of the rx 7900 xtx?

The AMD Radeon RX 7900 XTX has a maximum power draw (TDP) of 355 watts. However, some variants like the Sapphire 7900 XTX Nitro+ have a higher TDP of 420 watts.
Is there anything specific you'd like to know about this graphics card?

Assuming 350, and that I am in Reno Nevada and game 4 hours a day and have a light to medium load 10 hours on a computer with a 670e chipset and ryzen 7950x3d cpu, just changing the gpu from the 7900 xtx to an NVidia 5090, please estimate for me, how much I can expect my energy bill to increase?

Sure, let's break it down step by step:
  1. Power Consumption:
    • AMD RX 7900 XTX: 350 watts
    • NVIDIA RTX 5090: 575 watts
  2. Daily Usage:
    • Gaming: 4 hours/day
    • Light to Medium Load: 10 hours/day
  3. Electricity Cost in Reno, NV: 11.05 cents/kWh
  4. Monthly Usage Calculation:
    • Gaming: 4 hours/day * 30 days * 350 watts = 42 kWh/month
    • Light to Medium Load: 10 hours/day * 30 days * 350 watts = 105 kWh/month
    • Total for RX 7900 XTX: 147 kWh/month
    • Gaming: 4 hours/day * 30 days * 575 watts = 69 kWh/month
    • Light to Medium Load: 10 hours/day * 30 days * 575 watts = 172.5 kWh/month
    • Total for RTX 5090: 241.5 kWh/month
  5. Monthly Cost Calculation:
    • RX 7900 XTX: 147 kWh * 0.1105 = $16.23/month
    • RTX 5090: 241.5 kWh * 0.1105 = $26.68/month
Estimated Increase: $26.68 - $16.23 = $10.45/month
So, you can expect your energy bill to increase by approximately $10.45 per month if you switch from the RX 7900 XTX to the NVIDIA RTX 5090.
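
If anyone wants to redo that math with their own numbers, here's a rough Python sketch of the same arithmetic. The only thing it adds to Copilot's version is a second scenario with a guessed light-load draw instead of full board power - every wattage in it is an assumption, not a measurement:

# Rough monthly GPU energy-cost comparison. Wattages are assumptions,
# not measurements - swap in your own HWiNFO / wall-meter readings.

KWH_RATE = 0.1105        # $/kWh, the Reno figure from the Copilot answer
DAYS_PER_MONTH = 30

def monthly_cost(gaming_w, light_w, gaming_hrs=4, light_hrs=10):
    """Estimated monthly energy cost in dollars for one GPU."""
    kwh = (gaming_w * gaming_hrs + light_w * light_hrs) * DAYS_PER_MONTH / 1000
    return kwh * KWH_RATE

# Copilot's assumption: full board power for gaming AND light use.
full_tdp = monthly_cost(575, 575) - monthly_cost(350, 350)

# Guessed alternative: light desktop use is nowhere near TDP (60 W is a
# placeholder, not a measured idle/light-load number for either card).
light_guess = monthly_cost(575, 60) - monthly_cost(350, 60)

print(f"Full-TDP estimate:  ${full_tdp:.2f}/month increase")
print(f"Light-load guess:   ${light_guess:.2f}/month increase")

With the full-TDP assumption it lands on roughly the $10/month Copilot quoted; with the same guessed light-load draw on both cards, the light hours cancel out and only the 4 gaming hours matter, which comes out closer to $3/month.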
 
I envy your 4hr a day availability. I consider it a good week if I get 4hrs total on a weekend.

Haven't crunched those numbers, but given daily usage it seems plausible. Thankfully we were able to get solar installed a few years back, so electricity isn't normally an issue any more. Paid off a while back, and now I just smile when the bill comes. BTW my 4090 pulls between 420-520 W when I do game, and sometimes slightly higher but that's pretty rare, with most of the titles I like to play at 4K, so not that far off from a 5090.

Edit: Side note, the 4080 SUPER I have is probably the most energy-efficient card I have ever seen. Between ~260 W and ~330 W (rare and extreme cases) for 4K w/ DLSS Quality at 60-120 FPS. It's almost more fun watching the power draw than gaming.
 
How do you check your watts?

I can check at the PSU level, using iCUE, to see how much it is taking from the wall and providing to my system, but I don't get any kind of subsystem breakdown. Can Windows 11 see that? I don't see anything in Resource Monitor, or is there an app?

Thanks :)

Oh me at light use:

[screenshot: power draw at light use]
Me with POE 2 running - moderate to heavy use:

[screenshot: power draw with POE 2 running]

I've seen it get up to 600-650 W, but just spikes, not sustained - and that includes the 7950X3D, 3 NVMe drives, a SATA drive, etc.
 
How do you check your watts?
Mostly just MSI AB, but occasionally I'll take a look at what the UPS says for actual from-the-wall draw. It's rare for my 4090 rig to pull more than 600 W, but I think the most I've seen has been upwards of 750ish with real-world gaming, not just benchmarking to see something maxed out.
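
If you want per-GPU numbers rather than wall draw, tools like HWiNFO or GPU-Z will show the board power sensor. On NVIDIA cards you can also poll it yourself through NVML; here's a minimal Python sketch, assuming the nvidia-ml-py (pynvml) package is installed - NVIDIA-only, so it won't read an RX 7900 XTX:

# Polls NVIDIA board power via NVML once per second.
# Requires nvidia-ml-py (pip install nvidia-ml-py); won't work on AMD cards.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):                          # older pynvml versions return bytes
    name = name.decode()

try:
    while True:
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000   # NVML reports milliwatts
        print(f"{name}: {watts:.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()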
 
I asked Copilot. Not sure it got the idle and light to medium power usage right, though. Anyone have a better model? Or perhaps a better question (that makes me think of The Hitchhiker's Guide to the Galaxy).
Pretty interesting results. I'd say it's not bad as just a differential between the two cards. But it's assuming full TDP for "light to medium use" on both cards. Not sure what the idle draw on these cards is - that number is harder to find sometimes.
 
Small correction:

Per TechSpot:


RE: Marvel’s Spider-Man 2 Remastered: “At 4K, the CPU bottleneck is mostly removed, but the RTX 5090 still appears slightly limited, averaging 212 fps. As a result, the RTX 5090 is just 26% faster than the RTX 4090.”

TechSpot/HUB tested Spider-Man, not Spider-Man 2. ;)
 
Pretty interesting results. I'd say it's not bad as just a differential between the two cards. But it's assuming full TDP for "light to medium use" on both cards. Not sure what the idle draw on these cards is - that number is harder to find sometimes.
J2C and Gamers Nexus use devices that specifically measure power draw and have posted numbers.

Still seems off that LTT did their initial review with 7800X3Ds.
 
Still seems off that LTT did their initial review with 7800X3Ds.
They used those because they have a dozen of them, sourced to be similar in performance, to allow parallel testing.
 
Is it PhysX causing that? :)
Funny you should mention that. I was doing some DLSS 4 transformer model ray reconstruction testing this weekend with CP2077 and Hogwarts and noticed NV listed it in their copyright notices before one of those games. It was a blink-and-miss-it moment, but I was a little surprised.
 