Z's 4090 Thread

Zarathustra

Hey,

I figured I'd start my own thread rather than pull the 4070TI review even further off topic.

If anyone cares about my progress with this GPU, I'll post it here.

First, I'll quote the history from the other thread:

Makes the $1900 I just spent on my 4090 seem cheap. This sucks. I don't mind the top-tier cards costing a chunk, though I'm not thrilled about it, but the mid-range cards should not be priced like this at all. I remember being jealous of the folks who got Titans back in the day while I scrambled together SLI with 560 Tis and 980s, but this is just wrong.

I was thinking the same thing.

I have an MSI Gaming X Trio 24G and an EK waterblock incoming.

I'm sure you're going to love it. The 4090 is everything they make it out to be. I was hesitant at first and originally planned to wait it out for the Ti/Titan or whatever comes next, but Brent's review won me over, and I really don't want to wait 18+ months in hopes of getting one. If I hadn't gotten lucky in getting mine I probably would've just held out, but I am happy. Not only does it significantly outperform the 3090 Ti, it manages to do so at the same or less power in a lot of the testing I've done.

For giggles, I've been using mild OC settings (core at 2860-2885 MHz, memory at 11,000 MHz, power +107%), and in most games the Liquid X keeps it at 60 °C or under with 30-50% fans while Afterburner says it's drawing 385-425 W. Just to hammer it, I tested RE3 Remastered with IQ cranked to 200%, saw it eat 18 GB of VRAM, and watched it pull 500-530 W with the same settings while holding 50-55 FPS at 4K. The game looked incredible at that point too. I did a lot of testing with it, and once you slap that block on, it'll take an already impressive card to another level. I tested Crysis Remastered and that was fun too.

It really is the best 4K card yet, and based on all these power comparisons against the Ampere cards I am impressed with the design, but it really sucks what is happening with supplies.

Yeah, I was thinking similarly. I only very recently spent money on a rather expensive 6900xt, but I knew that was a risk.

The 4090 appears to be a no-holds-barred approach to getting the most out of this gen, and because of that I think it might just have some staying power. I'm cautiously optimistic for a similar experience to the Pascal Titan X I had from 2016 to 2021.

Admittedly I held on to that Titan a little bit longer than I had planned. I balked at the 2080 Ti when the "space invaders" problem reared its ugly head, and by the time it was solved, I thought it was way too late in the product cycle to spend big bucks on a top-end card.

With the 4090 - however - I'm hoping I will be happy for at least 2-3 years.

Well, after some more research, I may have bought the wrong version of the 4090 for my purposes. :/

Apparently the MSI Gaming X has a power limit of ~450 W, which is pretty wimpy as 4090s go.

Beggars can't be choosers though. The only ones I could find in stock without feeding scalpers too much were this one, and a Zotac model, and I'm not sure I trust Zotac for high end boards.

I'm still putting a water block on it. Hopefully I can achieve some decent overclocks by undervolting :/

Pretty sure you'll be able to hang in the 2800s still. From what I've read, anything over 500 W is mostly wasted anyway. If you trust it enough you might be able to flash a custom BIOS for more, but I wouldn't recommend it. Yeah, when I started researching these I was really shocked by all the wattage variants out there, and even then there are other tradeoffs between them. The Gigabyte with its AIO has one of the highest limits but came in 2nd in some tests. Either way, at stock these things are still beasts; the OC gains are minimal at best and the extra power draw usually isn't worth it. I'd say the settings I'm using are close to 1:1, but probably still using a bit more power than needed vs. just keeping it at stock.

I hope so.

Some reviews I read had one up at ~2930 MHz pulling about 430 W, so I feel like I ought to be able to at least match that under water, but I don't have a good feel yet for how power-limited these are, or whether the limit lies elsewhere.

Zero regrets buying mine also. It's an awesome card.

Unless you're racing benchmarks, ~400W is probably the efficient limit. I'll note that even my 3080 12GB, not even a Ti card, has 3x8pin power and can pull 450W with power limits raised. It's still never going to outrun a 3090.

I'm starting to wonder if I am ever going to get this 4090.

My UPS tracking number was created on Tuesday (01/03) and it is still sitting in "label created" status and listed as "on the way" with delivery due on Monday, but no detailed steps in the progress since then :/

I have Paypal buyer protection on it, so I know one way or another I'm not going to be out any money, but I have an EKWB water block on the slow boat from Slowvenia. If this 4090 winds up resulting in a refund, there's no guarantee the next one I find is the same model that fits the same waterblock :/ International waterblock returns are likely not happening. I could probably sell it, but still, that would be a pain in the ***.

Let's hope it's one of those borked UPS things. I've seen occasions with them and USPS/FedEx (a lot with both of these) where tracking doesn't get updated for one reason or another until the day of delivery, and sometimes not even until after it's delivered. Fingers crossed for ya!

Was having the issue where the label was created but sat for days without any sign of pick up or anything then all of a sudden it was out for delivery. I think this time of year really messes with the tracking system. I had my Maximus motherboard's tracking do the same thing recently.

Yeah, I've had those too, but I've also had ones where they just disappeared en route. Fingers crossed!

Aaahhh, so you decided to move on from the 6900 XT already huh?


I know you game at 4K, but I feel like a 4090 should last you decently longer than that.

Alright, with that out of the way, I got a surprise knock at the door today. It was UPS.

As @Niner51 suggested, without any tracking history it showed up at my door, two days earlier than predicted at that, so of course I had to play with it a little.

I don't have my water block yet, so I am not sticking it in my main rig, but this is what I have a testbench machine for: testing new hardware offline before I take the server or desktop down for an upgrade. And I definitely want to make sure the GPU is working properly before I take the cooler off and install the water block when it arrives.


I just noticed I didn't take any pictures of the GPU out of the system. I will have to remedy that when the waterblock arrives.

Anyway, pictures do not do the size of these 4090s justice. This thing is HUGE.

The Phanteks Enthoo Pro my test bench machine resides in is a large case, and it still wasn't entirely straightforward to get it in.

01.jpg

I had to remove the drive bays, and slide the GPU in at an angle into the space they used to sit in, then press it into the slot.

02.jpg

03.jpg


So, I bet this is one of the more unusual CPU/GPU combinations that is now in the TimeSpy Database:

04.png
 
This illustrates just how dated Sandy Bridge-E has become:

05.png

The CPU score is pretty bad, lol. It's even worse than it would have been back in 2011 (if Time Spy had existed then), because back then there were no Spectre/Meltdown mitigations, and I had this thing overclocked to 4.8 GHz. Now it is at stock, and the turbo maxes out at 3.8 GHz.

The GPU score isn't great either, but I suspect it is being held back quite a lot by the CPU. I expect this will improve once I transfer it over to my main desktop.

Just for ****s and giggles I ran a time spy extreme as well.

06.png

I had hoped to load it up more to give it a better stress test and verify good working order, but it seems the aging i7-3930K just isn't up to the task :p

One thing I did notice was how astonishingly quiet this thing is. I guess that is one of the benefits of this enormous heatsink. Still, I was expecting it to be much worse, as the MSI Gaming X version I have is one of the few 4090s on the market that doesn't have a vapor chamber, using a traditional heatsink instead.

When I ran the regular Time Spy, the fans didn't even turn on until about halfway through the first graphics test; the card made it through the entire intro demo fanless.

Now, this might in part be because the CPU just isn't allowing the GPU to load itself up enough to need it.

For reference, this is at just under 70 °F ambient (69.8 °F, i.e. 21 °C, according to my not terribly accurate desktop thermometer).

After this discovery, I noticed this little switch on the side of the GPU:

07.jpg

Oh, maybe that's why it's so quiet. Out of the box, the default seems to be "silent".

I switched it to "gaming" and ran some more benchmarks.

This time to try to minimize the CPU limitation, I set the 2560x1440 screen to run in 4x DSR, so 5120x2880, and maxed out ALL benchmark settings, and ran some Unigine Heaven, Unigine Valley, and Unigine Superposition.
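As an aside, DSR factors refer to pixel count, so 4x DSR doubles each dimension rather than quadrupling it. A quick sanity check of the arithmetic (plain Python, nothing 4090-specific):

```python
# DSR factors scale the *pixel count*, so each dimension scales by the
# square root of the factor: 4x DSR on 2560x1440 doubles width and height.

def dsr_resolution(width, height, factor):
    scale = factor ** 0.5
    return int(width * scale), int(height * scale)

print(dsr_resolution(2560, 1440, 4))  # → (5120, 2880)
```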

To my ears it was still virtually silent in the "gaming" setting. The max I ever saw the fans run at was 31% and at this speed the fans are very very quiet. Again, grain of salt here, because the CPU may be holding it back to the point where it just doesn't need to spin up the fans any faster.
Max GPU temp I saw in these tests was in the mid-60s °C.
 
I was curious how hot the backplate would get so during the stress tests mentioned in my previous post, I grabbed my little FLIR camera to see if I could get a good image of the back of the board.

I had forgotten the testbench had a big tower air cooler in the way, so the images are not of the flat backplate as I had envisioned, but they still tell the story.

08.jpg09.jpg


High resolution FLIR cameras are very expensive, so mine is only 320x240. I scaled it up a little for a better look though.

This is when I had the switch in gaming mode, but left the fans on auto. The view is looking down on the GPU from the side of the tower cooler. The red blob in the background is the PSU.

For comparison, this is roughly the angle these were taken from:

1674967349073.png

As you can see, the backplate does indeed get pretty hot: 51.9 °C and 51.6 °C according to my cheap FLIR. It felt quite hot to the touch too.

Maybe I should have tried to get an Active Backplate cooler after all?

Then I thought: this level of cooling is not reflective of what it is going to see with a water block on it, so I did another series of tests with the GPU fans manually set to 100%.

I ran all three Unigine benchmarks again in 4x DSR at 5120x2880, with all settings (including AA) maxed out.

This time it looked like this:

10.jpg

Note how the scale is dynamic, not fixed, so the hottest spot has shifted to the PSU, which is ~37.4C. The backplate is now much cooler, though I couldn't tell you the exact temp.

This leads me to believe that an active backplate will not be necessary with watercooling, and won't add much benefit. Most of that backplate heat in the previous pics seems to just be migrating from the GPU itself when the fan speeds are low.

Interestingly enough, with the fans pinned at 100%, the highest GPU core temp I saw recorded was 43C, which is pretty impressive.

Again, maybe only because the CPU sucks and doesn't let it load up more, but still.
 
Congrats @Zarathustra !

I ran Time Spy for the 1st time with mine last night.

1673183018397.png


This is just using those mild overclock settings and stock fan curves

1673183094504.png

Stock CPU and AIO settings. I probably could've gotten a few more points here if I maxed the fan/pump but the settings here for the CPU and GPU are what I use while gaming and the rig is near silent for most things. I almost never see the CPU drop below 4450 MHz while gaming.

1673183258065.png
 
This at least gives you a taste of the CPU bottleneck that can happen with these cards. I started seeing it, even at 4K 60 Hz, with my 4930K when I paired it with a 3090 with games that use more advanced raytracing features, and it only got worse when I put a 3090 Ti with it. With my scores, you can see how well the 5800X3D plays with it and if/when you get a 7000 series 3D I'm sure you'll see a huge jump.

p.s. I had a similar experience installing my 3090 Suprim X in a HAF X case. I couldn't believe it. I was able to wedge it in, but I'm not exaggerating that it was within 1 or 2 mm of the drive cage. I almost had to figure out a way to remove the cage, because it wasn't necessarily designed to be removable like those in other cases.
 
Oh, maybe that's why it's so quiet. Out of the box, the default seems to be "silent".
The last 3 or 4 AIB NV cards I've gotten have come that way (MSI/ASUS/EVGA). I usually just leave it since I'm a bit more focused on silent running these days, but tbh I'm skeptical as to how much difference either setting makes on various cards. At one point with either my Suprim X or Strix 3090, I remember switching it to gaming when I had the rig in a bench-type setup and didn't notice much of a difference in noise or clocks when using my OC settings. That reminds me, here's what I'm using with the Liquid X, and despite the +110% it mostly hangs in the 380-425 W range. I do occasionally see it jump to 480-530 W, but it's very game/test specific. For me these settings have so far been 100% stable in testing with close to a dozen different games using different engines. I had pushed the core a bit higher but some games crashed.

1673184402754.png
 
The last 3 or 4 AIB NV cards I've gotten have come that way (MSI/ASUS/EVGA). I usually just leave it since I'm a bit more focused on silent running these days, but tbh I'm skeptical as to how much difference either setting makes on various cards. At one point with either my Suprim X or Strix 3090, I remember switching it to gaming when I had the rig in a bench-type setup and didn't notice much of a difference in noise or clocks when using my OC settings.

That pretty much matches my experience.

That reminds me, here's what I'm using with the Liquid X, and despite the +110% it mostly hangs in the 380-425 W range. I do occasionally see it jump to 480-530 W, but it's very game/test specific. For me these settings have so far been 100% stable in testing with close to a dozen different games using different engines. I had pushed the core a bit higher but some games crashed.

View attachment 2170

I'm going to hold off on overclocking and tweaking until I have the water block on there and it's in the proper machine with a CPU that can keep up with it.

I think I am going to up the core clock and voltage until it either becomes unstable or runs into the power limit, then back it off a little and see how far I can reduce the voltage.

Is that pretty much the approach you used?

Right now the RivaTuner stats list it as being voltage-limited whenever I'm in a benchmark. Afterburner does not appear to have the ability to boost voltage. I vaguely remember having to change a setting somewhere to allow adjusting voltage, but it has been a really long time since I did this (when I set up and overclocked my Pascal Titan X in 2016), so I can't really remember.

I'll have to read up on it again.
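For what it's worth, the procedure above can be sketched in code. This is just an illustrative Python outline, with `is_stable` standing in for a real stress test (a benchmark run at the candidate clock/voltage); the step sizes and limits are made-up numbers, not anything specific to the 4090:

```python
# Sketch of the tuning procedure: raise the core clock until a stress
# test fails (or a ceiling is hit), back off one step for margin, then
# walk the voltage down while the card stays stable at that clock.
# `is_stable(clock_mhz, millivolts)` is a stand-in for an actual
# benchmark/stress run, and assumes the base settings already pass.

def tune(is_stable, base_clock, base_mv,
         clock_step=15, mv_step=10, max_clock=3100, min_mv=850):
    clock = base_clock
    while clock + clock_step <= max_clock and is_stable(clock + clock_step, base_mv):
        clock += clock_step
    clock -= clock_step  # back off one step for stability margin

    mv = base_mv
    while mv - mv_step >= min_mv and is_stable(clock, mv - mv_step):
        mv -= mv_step
    return clock, mv
```

In practice each `is_stable` call is a long benchmark run, so coarse steps first and finer steps near the limit save a lot of time.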
 
Also, at least in theory, power use should be lower with lower core temps.

Electrical resistance should be lower the cooler the chip runs.

Because of this, at least in theory, it ought to draw less power under water.

No idea if it is actually significant enough of a difference to be noticeable though.
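The usual explanation for this effect is leakage: static leakage current in CMOS rises roughly exponentially with die temperature. As a purely illustrative toy model (the doubling interval here is made up for the example, not a measured 4090 figure):

```python
# Toy model: leakage power relative to a reference temperature, assuming
# it doubles every `doubling_c` degrees. The 25 °C doubling interval is
# illustrative only, not a measured figure for any real chip.

def relative_leakage(temp_c, ref_temp_c=65.0, doubling_c=25.0):
    return 2.0 ** ((temp_c - ref_temp_c) / doubling_c)

print(relative_leakage(43.0))  # 65 °C -> 43 °C under water: ~0.54x the leakage
```

Total board power is leakage plus switching power, so the real-world saving would only be a fraction of that ratio.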
 
We should have our 4090 overclock article up very soon, so that should help. In general, boosting voltage is not helpful as it ramps up power use, and for the past few generations power has been the ultimate limiting factor.
 
Forgot to mention.

This thing is so huge. It's officially a 3-slot design, but the heatsink sticks out another slot's width, so it actually consumes 4 slots.

I had to remove my 10gig NIC to make it fit, which means I'm using on board networking for the first time in YEARS.

I also had to unplug the front panel USB3 port, as the GPU interfered with the connector. (you know, those large rigid USB3 internal header plugs.)

Hopefully with the waterblock installed, it will be much more reasonable in size.
 
Forgot to mention.

This thing is so huge. It's officially a 3-slot design, but the heatsink sticks out another slot's width, so it actually consumes 4 slots.

I had to remove my 10gig NIC to make it fit, which means I'm using on board networking for the first time in YEARS.

I also had to unplug the front panel USB3 port, as the GPU interfered with the connector. (you know, those large rigid USB3 internal header plugs.)

Hopefully with the waterblock installed, it will be much more reasonable in size.
Yeah, no matter how many people mention how huge these newer cards are, it's something you have to see to believe. They do have decent cooling for air cards but that's because of how huge that cooler is.

I will say that EVGA seemed to have a better handle on it as their air-cooled 3090 Ti was a little smaller than the Suprim X 3090 (this is almost the same size as many 4090s) while outperforming it as it should. I am so missing them now but the Suprim X Liquid 4090 is a nice 2-slot card and that 240mm AIO does better than I expected.
 
Yeah, no matter how many people mention how huge these newer cards are, it's something you have to see to believe. They do have decent cooling for air cards but that's because of how huge that cooler is.
Fully agree. Pictures online and people mentioning how large they are don't really do them justice. They are shockingly large when you actually get your hands on them.
 
Yeah, no matter how many people mention how huge these newer cards are, it's something you have to see to believe. They do have decent cooling for air cards but that's because of how huge that cooler is.

I will say that EVGA seemed to have a better handle on it as their air-cooled 3090 Ti was a little smaller than the Suprim X 3090 (this is almost the same size as many 4090s) while outperforming it as it should. I am so missing them now but the Suprim X Liquid 4090 is a nice 2-slot card and that 240mm AIO does better than I expected.
This is the card I want if I get to spend that kind of money. But it might just be a 4080 this year, or a 42-inch C2.
 
I'm currently sporting a 3080 12GB card, and it's handling my 1440 needs. I'm skipping the 4000 series completely unless they release a card that beats my 3080 for under $600... @Zarathustra : Enjoy the beast of a card... Should last you a very long while!

Time will tell.

I demand minimum (1%/0.1%/whatever) framerates above 60 fps at 4K max settings in order to be happy. I don't get much time for this hobby anymore, so if I am going to enjoy a game, I want to do it the way the developer intended, without the compromises and cutbacks lower settings result in.

I'm also not a fan of DLSS/FSR. I want native resolution and native frames at all times. I'll use it in a pinch, but when I have to start turning DLSS on in order to get minimum framerate above 60fps at 4k with all settings turned all the way up, that's when it is time to shop for another GPU :p

With a little luck I'll get 2-3 years out of it.
 
Time will tell.

I demand minimum (1%/0.1%/whatever) framerates above 60 fps at 4K max settings in order to be happy. I don't get much time for this hobby anymore, so if I am going to enjoy a game, I want to do it the way the developer intended, without the compromises and cutbacks lower settings result in.

I'm also not a fan of DLSS/FSR. I want native resolution and native frames at all times. I'll use it in a pinch, but when I have to start turning DLSS on in order to get minimum framerate above 60fps at 4k with all settings turned all the way up, that's when it is time to shop for another GPU :p

With a little luck I'll get 2-3 years out of it.
So are you installing that 6900 XT in a secondary system? When I got my 6800 XT I sold my 2080 to someone for 400 bucks.
 
I'm also not a fan of DLSS/FSR. I want native resolution and native frames at all times. I'll use it in a pinch, but when I have to start turning DLSS on in order to get minimum framerate above 60fps at 4k with all settings turned all the way up, that's when it is time to shop for another GPU :p
I've been using it whenever possible since I got my 2080 Ti and have mostly been happy, but now it feels like taking the training wheels off for the 1st time. In each game where I've used it, I've been benching again to see the uber framerates (it's just so fun to see 4K at upwards of 120 FPS), and then I do another run without it, and I've been amazed when a game can run with full RT effects and still hold a 60ish rate. So far I've seen that happen with SOTR and SM Remastered and maybe 1 or 2 other games. Things do look better and it's nice to see latency reduced even further. I still give the DLSS quality setting credit in its current state, but it sure is nice to begin walking away from it, at least for some games.
 
This thing is so huge. it's officially a 3 slot design, but the heatsink sticks out another slot width wise, so it actually consumes 4 slots.
My Gigabyte 4090 is a two-slot card that is as wide as a three-slot card. I've been very impressed by the cooling on this card, and on all 4090s from what I've read.
 