Z's 4090 Thread

I still chuckle a little at the buyer's remorse I had in 2011 when I bought it. I couldn't believe I had spent such a crazy amount of money on a motherboard/CPU, but 12 years later it is still useful to me, so I think in the end I got my money's worth :p
I went through the same thing, wondering if I had bought something I truly wouldn't use to its potential. I suppose in some ways I never did, but in terms of gaming and media management, I did without a doubt. You truly had a nice setup with the custom loop. I always wanted to do one but never got around to developing the skills and investing in it, though I did go crazy with fans and achieved a lot there anyway. At first I didn't even have SSDs because I couldn't afford them; they came about a year or so later. Funny enough, that's where I began using 1200W PSUs as well, and I'm actually going to pull the one still in it to put in the 3700X rig. Probably tonight, because I've got a Noctua 140mm fan that should've been delivered that I'm swapping with the 120mm one that came with the NZXT H7 Flow (a seriously crap fan with major whine).
 
Alright.

Flashed the BIOS and went over all the settings to make sure RAM timings etc. were right and Resizable BAR was enabled.

1675210188000.png

Link

Still somewhat disappointing results (but the temps were great! :p )

I turned up the fans a little bit to make the best of it. GPU temp averaged 37C and peaked at 41.9C.

GPU clock bounced between 2805 MHz as a low and 2820 MHz as a high.

The GPU utilization numbers were pretty bad at times (look at the red line):

1675210402270.png

Again, I'm thinking this is probably a Threadripper vs 3DMark hate thing.
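If you want to confirm whether those utilization dips are a CPU/scheduling thing rather than a 3DMark quirk, you can log the numbers yourself straight from `nvidia-smi` instead of eyeballing the graphs. Rough sketch only; the polling interval and the 90% "dip" threshold in the comment are arbitrary choices of mine:

```python
import subprocess
import time

def parse_sample(line):
    """Parse one nvidia-smi CSV line like '98, 75, 312.4' into floats."""
    gpu, mem, watts = (float(x) for x in line.strip().split(", "))
    return gpu, mem, watts

def sample_gpu(n_samples=10, interval_s=1.0):
    """Poll nvidia-smi and return (gpu %, mem %, watts) tuples."""
    samples = []
    for _ in range(n_samples):
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,utilization.memory,power.draw",
             "--format=csv,noheader,nounits"],
            text=True)
        samples.append(parse_sample(out))
        time.sleep(interval_s)
    return samples

# e.g. flag samples where the GPU sat below 90% busy during a benchmark run:
# dips = [s for s in sample_gpu() if s[0] < 90]
```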

Feels weird to complain about a 26k Timespy score with graphics score over 30k though :p

I am going to try some other benchmarks.
 
That is on par and still pretty good.

All the canned benches I use came with the games. I think the Forspoken demo has one, and I seem to remember a Final Fantasy one recently, but I haven't tried it in a while. Otherwise, Metro Exodus, CB 2077, SOTR, and Horizon Zero Dawn all have built-in canned benches.

Oops. I think I accidentally overwrote the default preset in the Metro Exodus Enhanced Benchmark when I was playing around with it. (I didn't realize it would do that)

Do you have the settings for the default "Extreme" preset handy, so I can compare apples to apples? I googled the **** out of it, but no one seems to have posted the default settings anywhere.

I think I restored them to default based on memory, but I am not sure:

1675216069178.png

Is this correct?
 
Meanwhile I set everything to max (I think) as follows:

1675216401134.png


These were my best results:

1675216517500.png

  • Total Frames: 7577; Total Time: 105.1417 sec
  • Average Framerate (99th percentile): 72.25
  • Max. Framerate (99th percentile): 110.46 (Frame: 363)
  • Min. Framerate (99th percentile): 48.06 (Frame: 931)
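Stats like these are easy to recompute yourself if you capture frame times (e.g. with PresentMon or CapFrameX). The sketch below is just one plausible reading of the "(99th percentile)" figures, i.e. throwing away the worst 1% of frames before computing avg/min/max; how Metro's benchmark actually computes them is an assumption on my part:

```python
def framerate_stats(frame_times_ms, pct=99):
    """Avg/min/max FPS after dropping the slowest (100 - pct)% of frames.

    One plausible interpretation of "(99th percentile)" benchmark stats;
    the exact method a given benchmark uses may differ.
    """
    fps = sorted(1000.0 / t for t in frame_times_ms)  # ascending: slowest first
    keep = max(1, int(len(fps) * pct / 100))
    trimmed = fps[len(fps) - keep:]                   # discard slowest outliers
    return sum(trimmed) / len(trimmed), trimmed[0], trimmed[-1]

# e.g. one 100 ms hitch among 99 steady 10 ms frames gets trimmed out:
# framerate_stats([100.0] + [10.0] * 99) -> (100.0, 100.0, 100.0)
```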

It's crazy that these settings are pretty much playable now. It doesn't feel like this game came out THAT long ago, and when it did, they very much weren't.
 
I also ran some Cyberpunk 2077 benchmarks at various settings.

I ran everything at 4K. First I stepped through all the presets. Then I discovered that two settings had options above the RT Ultra level called Psycho, so I did a maxed-out run using the Ultra preset with those two settings at Psycho, and called it Psycho.

Then I realized that I also wanted to see how everything did without DLSS, so I ran those numbers too.

Here are my results, sorted by highest average framerate:

Preset                            Avg       Min      Max
Low                               142.47    95.5     179.09
Medium                            141.85    90.13    176.3
High                              130.6     88.29    165.51
Ray Tracing: Low (DLSS Auto)      113.03    84.57    158.48
Ray Tracing: Ultra (DLSS Auto)    97.84     58.58    117.46
Ray Tracing: Psycho (DLSS Auto)   91.77     56.68    118.03
Ray Tracing: Medium (DLSS Auto)   90.45     75.02    117.5
Ultra                             82.79     64.7     127.7
Ray Tracing: Low (DLSS Off)       71.22     57.33    92.95
Ray Tracing: Medium (DLSS Off)    47.41     39.41    57.31
Ray Tracing: Ultra (DLSS Off)     42.17     30.64    54.07
Ray Tracing: Psycho (DLSS Off)    38.5      29.72    50.91

Some weird **** happens with the DLSS results. For instance, RT Ultra and RT Psycho with DLSS both perform better than RT Medium with DLSS. I presume this is because the presets have DLSS set to "Auto"; I'm guessing it dynamically changes quality settings based on an FPS target? Who knows. But it is weird.
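That "Auto" behavior would explain it: the internal render resolution silently changes between runs, so two RT tiers aren't necessarily rendering the same pixel counts. A quick sketch using the commonly cited per-axis scale factors for the DLSS 2 quality modes (the factors are approximate, and the Auto selection logic isn't public, so treat that part as an assumption):

```python
# Commonly cited per-axis render scales for DLSS 2 quality modes.
# "Auto" picks one of these based on output resolution/load; the exact
# selection logic is NVIDIA's and is not modeled here.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output size and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# At 4K output, Performance mode renders internally at 1920x1080,
# while Quality mode renders at 2560x1440.
```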

I didn't want to bias myself by looking at review numbers before testing my own, so now I am going to go off and see what kind of numbers testers got, and how mine compare.


EDIT:

Hmm. Compared to this site's review, I did about 2 fps worse at RT Ultra without DLSS, but about 22 fps better with DLSS.

Different drivers, different versions of the game, as well as the fact that this site doesn't rely on the canned benchmark, but rather an open-world drive-through.

I may have to do the latter as a test and see how that compares.
 
I also ran some Cyberpunk 2077 benchmarks at various settings. ...
I wouldn't worry too much about the discrepancies. You're pretty much where you ought to be. Of course you can dial things in, but at least everything seems to be running right now. Congrats!
 
Meanwhile I set everything to max (I think) as follows: ...

It's crazy that these settings are pretty much playable now.
This is pretty much what I set it at, though I'll then use DLSS Quality for kicks with the 4090; it's still needed for the 3090 Ti. Yeah, it's awesome to see the 4090 crush this. If you ever try SOTR, the trick for testing w/o DLSS is to set SMAA to max; otherwise hair and other things are pretty jaggy without DLSS.
 
I also made some progress tonight. I swapped out my 1000W PSU for the 1200W that was in the 4930K rig and installed the new 140mm Noctua fan in the 3700X rig. I also thought of a solution for moving the 3090 Ti radiator that I'll probably do on the weekend (things got a bit too cramped for my taste in the H7 Flow, but I think I have a good idea to fix it), and then I'll start a new thread w/ some pics and scores. I'm really happy with how this project has turned out.
 
I also ran some Cyberpunk 2077 benchmarks at various settings. ...

Different drivers, different versions of the game, as well as the fact that this site doesn't rely on the canned benchmark, but rather an open-world drive-through. I may have to do the latter as a test and see how that compares.


Well, for what it is worth, I did a leisurely drive around Night City in Johnny's Porsche. I started with the first-person in-car view, then remembered that most people probably drive in third person, so I did that for a while.

I didn't have FRAPS or anything like that installed to collect statistics, so I just watched the Rivatuner charts. Less busy scenes went as high as 120fps, but the busiest parts of the city dropped to the mid-60s. If I had to guess, I'd say I averaged about 90fps.

Notably, while I occasionally hit the power limit in Time Spy, I never hit it in actual gameplay in Cyberpunk. In the drive-through I peaked at about 340W. Mostly the limiting factor appears to be the "voltage limit".

Looks like it's behaving like it is supposed to. I still need to clean things up here and take some of the final pics, but then I can start reading up on overclocking :p
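If you'd rather have numbers than eyeball the Rivatuner charts, you can dump `power.draw` samples (e.g. `nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits -l 1 > power.log`) and summarize the trace afterwards. Minimal sketch; the 450W default limit is just a placeholder, not your card's actual limit:

```python
def power_summary(watts, limit_w=450.0):
    """Summarize a power-draw trace: peak, average, and the percentage
    of samples at or above the given limit (placeholder default)."""
    peak = max(watts)
    avg = sum(watts) / len(watts)
    pct_over = 100.0 * sum(1 for w in watts if w >= limit_w) / len(watts)
    return peak, avg, pct_over

# e.g. power_summary([300, 340, 200, 160], limit_w=340)
# -> peak 340, average 250.0, 25% of samples at/over the limit
```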
 
Notably, while I occasionally hit the power limit in Time Spy, I never hit it in actual gameplay in Cyberpunk. In the drive-through I peaked at about 340W. Mostly the limiting factor appears to be the "voltage limit".
That's kind of what I was saying before re: power limits. Timespy is a bit unrealistic compared to actual gameplay for power draw. I'm not saying I haven't seen some big spikes, but usually it hangs around 340-430W and will only rarely, and quite briefly, go over 500W.

On a side note, I tested the 3090 Ti (I'm using the 4x 8-pin adapter that came with my 4090, since I have an actual 12VHPWR connector on the PSU in that rig) and saw it hit 475W for a second or two; leading up to that it hung around 450-468W.
 
I found an odd way in which the 4090 behaves very differently from either my RX 6900 XT or my Pascal Titan X before it.

I was getting great frame rates in my tests, but the graphics were still a stuttery mess across several titles. I tried a million things in troubleshooting and eventually found that G-Sync (Compatible) wasn't working properly.

I noticed this because the monitor's OSD reports the refresh rate it is currently operating at, and when FreeSync/G-Sync/VRR/whatever is working properly, that number bounces around with the current frame rate. For some reason it was stuck at the screen's max refresh rate, 120Hz, while the game was rendering in the 75-110 range. That explains the jagged motion, but why?

Over the years I have become used to having a Rivatuner statistics window (the one with all the charts, part of MSI Afterburner) open on one of my side screens so I can monitor system performance while a game is running. By random accident I discovered that when MSI Afterburner was open and the Rivatuner stats were on my right side screen, VRR stopped working and the screen was pinned at 120Hz. When I closed Afterburner, and the Rivatuner window thus went away, VRR worked perfectly.

Really odd. I never had this issue with previous GPUs. VRR worked fine in the past regardless of what was being displayed on my non-game monitors.

Has anyone else experienced anything like this?
 
I found an odd way in which the 4090 behaves very differently from either my RX 6900 XT or my Pascal Titan X before it. ...

Has anyone else experienced anything like this?
I haven't experienced exactly this, since I don't run the same setup, but I have had issues with other overlays between Precision X1, AB, and Tweak. It's kind of all over the map as to what might trigger something, but AB/Riva is still my favorite in terms of the fewest disruptions. Steam has even caused issues on occasion (I don't use its FPS counter, but even leaving its overlay enabled can cause problems sometimes).
 
Zarathustra, I bought the MSI Suprim Liquid X 4090 after giving up on trying to buy the Sapphire 7900 XTX.
I also decided to EKWB the card, and I was bitching about the price of the block (expensive in Canadian dollars) and that I had to cut the thermal pads myself. For that price it should come pre-cut, as there were some odd contours to deal with.
Anyhow, I wanted to mention that when I booted into Windows and looked at MSI Afterburner in stock form, the power limit was I think 93 or 97%, and resetting the card to stock brought it back to 100%.

There is a BIOS update for the card from MSI that fixes this and maybe other issues as well, i.e. I can now go to 110% power limit instead of 107%.

I am currently on my older computer (6700K i7), as I figured I'd finally upgrade after a very long few years, and I took a chance on the new AMD 7000 series, and well... not impressed so far. I am waiting on memory now.
I bought a 7600 while waiting for the 3D V-Cache chip to show up, and the G.Skill 6000 MHz 30-38-38-98 kit (timings might be wrong) won't run past 4800 MHz.
In a friend's board it will run at 6000 MHz, but when he tries to run 3DMark it crashes.
Anyhow, enough side-tracking here.
 
It's interesting how the heat output of these things changes the picture.

My philosophy used to be that the ideal water loop flow is one in which all points in the loop measure roughly the same temperature.

If it's hotter after your water block, that's when you know you could benefit from more flow.

Currently I have ~1.1 gpm across my GPU and CPU. It goes in at about 33C and comes out the other side at 34.5C.

This was not the case in the past. Now granted, this is across both a high-power GPU and a pretty hot CPU, but still!
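Incidentally, that 1.5C delta at 1.1 gpm is enough to sanity-check the loop against the heat it should be carrying, via Q = m_dot * c_p * dT. Quick worked example (water density and specific heat assumed, numbers rounded):

```python
GPM_TO_KG_PER_S = 3.78541 / 60  # 1 US gal/min of water ~ 0.0631 kg/s
CP_WATER = 4186.0               # specific heat of water, J/(kg*K)

def loop_heat_watts(flow_gpm, delta_t_c):
    """Heat carried by the coolant: Q = m_dot * c_p * dT."""
    m_dot = flow_gpm * GPM_TO_KG_PER_S  # mass flow rate, kg/s
    return m_dot * CP_WATER * delta_t_c

# 1.1 gpm with a 1.5 C rise works out to roughly 436 W, which is in
# the right ballpark for a 4090 plus a hot CPU under load.
print(round(loop_heat_watts(1.1, 1.5)))  # ~436
```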
 
So here are the settings I'm pretty much using all the time now when gaming. The core ends up clocking ~2805-2820 MHz, temps hang in the 50-55C range, and, so far after 20-30 hours of testing, it has been totally stable. I've tinkered with lowering fan speed to 70% and it mostly does the same, but temps rise to ~55-60C and clocks rarely go above 2805 MHz at that point. The memory boost is just for the heck of it, and it's really debatable whether that extra 500 MHz is doing anything, but hey, it's nice to see 22 Gbps effective happening with it. I also experimented with leaving it at stock and just keeping the fans at 70%, and saw nearly the same performance. In terms of FPS gains, maybe just a few over stock, but the power draw doesn't end up being much more either, as I see it hanging around 319-419W most of the time with some occasional spikes. In very rare and brief situations I'll see it go over 500W, but it's by no means the norm. I've mostly been testing with actual gameplay of Witcher 3 Next Gen, Metro Exodus Enhanced, CB 2077, and Dead Space Remake, plus a little RE2/RE3 Remake and RDR2, so quite a variety of game engines.

Meanwhile, since this GDDR6X memory at stock (10500 MHz) is the same speed as that used in the 3090 Ti, and this Ada GPU is so much more powerful even with memory at stock, it makes no sense to OC the memory on the 3090 Ti (sorry Zarathustra for side-tracking a bit here). My recommendation for the few people who have a 3090 Ti is to focus on overclocking the GPU only and not bother with memory, since you'll never get that Ampere GPU even close to an overclocked Ada, which can do its thing at the same 10500 MHz memory speed.
1676639428186.png
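To put the memory speeds in perspective, peak bandwidth is just effective data rate times bus width. A quick sketch (assuming the 384-bit bus both cards use; 10500 MHz GDDR6X is 21 Gbps effective, i.e. double data rate):

```python
def gddr_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective data rate per pin
    times bus width, divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# Stock 21 Gbps on a 384-bit bus:
print(gddr_bandwidth_gbs(21, 384))  # 1008.0 GB/s
# The +500 MHz OC (22 Gbps effective) buys about 48 GB/s on top:
print(gddr_bandwidth_gbs(22, 384) - gddr_bandwidth_gbs(21, 384))  # 48.0
```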
 
So just a heads up that voltage adjustment can absolutely have a positive effect on the 4090 and 3090 Ti cards. I've been experimenting and am just under 3 GHz, stable, for gaming on the 4090 (I was able to go over 3 GHz for some benching, but it wasn't stable for gaming), and matching what Brent achieved in his 3090 Ti review at 2160-2175 MHz. Both cards are using around ~450 watts, and thanks to their hybrid liquid cooling solutions they're holding 60-64C at stock fan settings. There are moments when they use less and more power, but mostly I saw them hanging around 430-450W. This time testing was done while playing The Witcher 3 and Hogwarts, and then the canned benches for CB2077/Metro Exodus.

As always, use caution when adjusting voltage, but at least for me it was the only way to get to the higher clocks w/o crashing. It was probably the silicon lottery that prevented me from just upping the clocks and power limit to get there. Air-cooled cards can probably do about the same at the expense of cranking the fans a bit higher; given the huge heatsinks these cards often have, I could see it happening. I've got a 3090 Ti identical to the one @Brent_Justice used in his review in the 4930K rig, and I'll do some more testing with it next weekend.
 
So just a heads up that voltage adjustment can absolutely have a positive effect on the 4090 and 3090 Ti cards. ...

What are you using to up the voltage? My slider is greyed out in Afterburner.
 