MSI GeForce RTX 4090 SUPRIM LIQUID X 24G Video Card Review

David_Schroth

Introduction

On our test bench today is the MSI GeForce RTX 4090 SUPRIM LIQUID X 24G video card. It sits at the top of MSI’s 4090 product stack and has the same boost clocks as its air-cooled sister card. The key features of the MSI GeForce RTX 4090 SUPRIM LIQUID X 24G are a high factory overclock (105 MHz boost clock increase), a 480 W power limit, and a 240 mm AIO cooling solution. The RTX 40 Series GPUs have been a bit easier to find at retail compared to the 30 Series, though the RTX 4090-based cards tend to not […]

See full article...
 
That is the 4090 that is the MOST tempting to get.
 
Thanks, @David_Schroth for the great review. Any plans to test DLSS 3 aka frame generation? I've recently been testing CB2077 and Hogwarts with it and then turning DLSS AA off.

I cannot even begin to say how much I love mine. Quiet, powerful, and it just gets the job done. I've been dreaming of AIO graphics cards for years and am so happy to finally be lucky enough to have one. I'm also truly shocked at how well this setup works with a mere 240mm solution. I wouldn't have thought it possible if I hadn't seen it with my own eyes.

Following your advice, I did some voltage tweaking and was able to get to a completely stable 2910-2955 MHz core, 23.2 GHz mem. I think I lost out a bit on the silicon lottery; even though I can push it a bit higher, it's just not stable 24/7 at that point. I've even been using the stock fan curves while gaming with the OC settings (though I max them for benching and testing) and rarely see it go above 64°C inside the Carbide. With the fans maxed it hangs in the low 50s after putting it through the rounds for a couple of hours. I did a Time Spy run last night with my current OC settings and got 28,651. I'm sure with that 24 GHz memory and higher core clock you might even break 30,000. In terms of power usage I'm seeing the exact same behavior.

It's also hilarious to see the boost clock jump to over 2,800 MHz simply by increasing the power or voltage.
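For anyone who wants to watch those clock, temperature, and power numbers move while dialing in an overclock, here's a minimal monitoring sketch using the pynvml bindings for NVIDIA's NVML library. The one-second polling interval and console formatting are my own assumptions; GPU-Z or MSI Afterburner's monitor will show the same counters.

```python
# Minimal GPU monitoring loop using the NVML bindings (pip install nvidia-ml-py).
# Polls core/memory clocks, temperature, and board power once per second so you
# can see how the boost clock reacts to power and voltage limit changes.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000  # mW -> W

try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)  # MHz
        mem = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)        # MHz
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000                     # mW -> W
        print(f"core {core:>4} MHz | mem {mem:>5} MHz | {temp:>2} C | "
              f"{power:5.1f} W / {limit_w:.0f} W limit")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```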
 
Any plans to test DLSS 3 aka frame generation? I've recently been testing CB2077 and Hogwarts with it and then turning DLSS AA off.
I think Brent commented on it in one of the 40 series articles that he's done. I haven't had a chance to tinker with it myself - I've got a couple other reviews to knock out in the pipeline at the moment in other departments...
 
Is the copper water block plated? Copper block + aluminum radiator is not a good combination.
 
Is the copper water block plated? Copper block + aluminum radiator is not a good combination.
I checked the website, and yeah. I'm totally aware of what you probably mean: galvanic corrosion and the issues associated with it. We'll see; maybe there's something magic in the fluid to prevent or minimize it over the long term. I expect mine to get a lot of hours and I'll be using it for years to come, so I'll speak up if I see anything. I've also got a 3090 Ti Hybrid that I'm pretty sure uses the same setup and is about six months older than this one. I know there are nightmare scenarios, but after reading some newer posts around the internet it's not always a worst-case scenario either.
 
If I could afford one, this is the one I would get.

I'm currently on an RTX 3070 Ti, but I got a raise recently, so maybe I could get an RTX 4070 Ti.
At this point, I'm getting nervous about NV's generational improvements. As much as I truly love having this card, I'm extremely annoyed that we're starting to see a kind of every-other-gen hardware addition to their cards in order to use a new feature (i.e. 20-series for RT and Tensor cores, 40-series for DLSS 3 frame gen), and now there was that story yesterday about RT Overdrive in CB 2077. Odds are this series will support it, but I wouldn't put it past them to add yet something else for whatever comes next. I'm all for more hardware being added to cards, but it's beginning to feel less innovative and more staged/scheduled for max profits.

It's a foil hat theory for sure, but it does seem a bit different than the 20-30% perf increase per gen we were seeing from the 700 series to the 20 series. I'm all for more hardware being added to cards. We've known for decades that a single GPU was becoming overwhelmed with everything being thrown at it, and mGPU/SLI/Crossfire had its issues, so whether by chiplets or dedicated cores, it was obvious something needed to be done to offload some of that work. But I'm not into a model of piecemealing one thing at a time every other gen. It's well known that in big industry you have multiple pipelines of R&D and release schedules planned out, but I have doubts about how necessary it is to roll out a new hardware add-on in this fashion. I hope I'm wrong.
 
$1750 o_O

That is used Honda Civic territory
haha. That used to be true, but in my neck of the woods, that's now the down payment. I could definitely derail this thread, but used cars out here are not happening for under $2K unless they have over 200K on them. Although I hear rumors the used car market is supposedly dropping again, I'll believe it when I see it. Meanwhile, we're schlepping along in an '09 Forester (142K) and '02 Grand Caravan (165K), and I'm looking to get a replacement for the van this summer. It's one of the reasons I opted to revive my 3700X build instead of doing the 7000X3D build I was considering for a while.
 
At this point, I'm getting nervous about NV's generational improvements. As much as I truly love having this card, I'm extremely annoyed that we're starting to see a kind of every-other-gen hardware addition to their cards in order to use a new feature (i.e. 20-series for RT and Tensor cores, 40-series for DLSS 3 frame gen), and now there was that story yesterday about RT Overdrive in CB 2077. Odds are this series will support it, but I wouldn't put it past them to add yet something else for whatever comes next. I'm all for more hardware being added to cards, but it's beginning to feel less innovative and more staged/scheduled for max profits.

It's a foil hat theory for sure, but it does seem a bit different than the 20-30% perf increase per gen we were seeing from the 700 series to the 20 series. I'm all for more hardware being added to cards. We've known for decades that a single GPU was becoming overwhelmed with everything being thrown at it, and mGPU/SLI/Crossfire had its issues, so whether by chiplets or dedicated cores, it was obvious something needed to be done to offload some of that work. But I'm not into a model of piecemealing one thing at a time every other gen. It's well known that in big industry you have multiple pipelines of R&D and release schedules planned out, but I have doubts about how necessary it is to roll out a new hardware add-on in this fashion. I hope I'm wrong.
It's not a tinfoil hat theory at all, because there's an obvious pattern and nVidia does not like supporting or updating anything for cards more than a generation old. In many cases there's little or no reason for that support to be basically dropped. AMD has been managing to do very similar if not the same things, with support not only for older AMD cards but for most current and older nVidia cards as well as Intel's new cards. FSR has proven that. nVidia is obviously using specialized, nVidia-dedicated hardware to do some of these things, but that doesn't mean it must be done that way, with support cut off for all but the absolute latest cards. nVidia stinks of planned and unneeded obsolescence, with the goal being to force new hardware purchases and force people into a specific locked-down ecosystem.
 
It's not a tinfoil hat theory at all, because there's an obvious pattern and nVidia does not like supporting or updating anything for cards more than a generation old. In many cases there's little or no reason for that support to be basically dropped. AMD has been managing to do very similar if not the same things, with support not only for older AMD cards but for most current and older nVidia cards as well as Intel's new cards. FSR has proven that. nVidia is obviously using specialized, nVidia-dedicated hardware to do some of these things, but that doesn't mean it must be done that way, with support cut off for all but the absolute latest cards. nVidia stinks of planned and unneeded obsolescence, with the goal being to force new hardware purchases and force people into a specific locked-down ecosystem.
To think a company that bought 3dfx just to get the tech and then dissolved the company in such an excessive manner would treat its non-business customers any other way.

They have used their influence and position in the market to abuse their customers long enough that the bruises haven't even faded before they go for new ones. It's one reason AMD is successful in the video card market.
 
To think a company that bought 3dfx just to get the tech and then dissolved the company in such an excessive manner would treat its non-business customers any other way.

They have used their influence and position in the market to abuse their customers long enough that the bruises haven't even faded before they go for new ones. It's one reason AMD is successful in the video card market.
Make that patents, not technology. I had the chance to ask Tom Petersen, when he was still with nvidia, what technologies from 3dfx made it into nvidia back in the day. He said that by that time nvidia was already way beyond 3dfx technology-wise, so very little if anything made it into nvidia products other than SLI, and even that's only in the name.
 
To think a company that bought 3dfx just to get the tech and then dissolved the company in such an excessive manner would treat its non-business customers any other way.

They have used their influence and position in the market to abuse their customers long enough that the bruises haven't even faded before they go for new ones. It's one reason AMD is successful in the video card market.

I think that was mostly to squash a competitor. And for 3dfx's brass to cash out before the company went totally bankrupt.

That was a sad time, and I knew people that worked there. That was a mgmt c*ckup for the record books... right up there with Commodore.
 
I think that was mostly to squash a competitor. And for 3dfx's brass to cash out before the company went totally bankrupt.

That was a sad time, and I knew people that worked there. That was a mgmt c*ckup for the record books... right up there with Commodore.
Fair, they were a local company for me too. Wouldn't have minded working for them but they didn't need sysadmins at the time.

Though I did run a pair of what I believe were Voodoo2 cards.
 
I think that was mostly to squash a competitor. And for 3dfx's brass to cash out before the company went totally bankrupt.

That was a sad time, and I knew people that worked there. That was a mgmt c*ckup for the record books... right up there with Commodore.
The competitor was already squashed; nvidia just gave 3dfx a "Fatality."
 
I think Brent commented on it in one of the 40 series articles that he's done. I haven't had a chance to tinker with it myself - I've got a couple other reviews to knock out in the pipeline at the moment in other departments...
I just finished up review data on an MSI 4090 Suprim X non-liquid.
I ran some Frame Generation using the Cyberpunk 2077 benchmark... it was crazy fast. I think Frame Generation was nearly double the DLSS-alone FPS, and that was already pretty fast. I was using Ultra ray tracing. It looked amazing too.

I have to say, these cards are so good now that the only way you're going to sell the halo models is to keep adding the tweaks. It takes so long for a game developer to make something crazy enough to challenge the existing hardware that there's not really a reason to upgrade. Look at what has come out this year: the Dead Space remake, no ray tracing. Atomic Heart, Unreal Engine 4, not 5, and no ray tracing, although the game was touted for a couple of years as a killer app for RTX.
CDPR has been the only developer to challenge the RTX cards, and now with a 4090 you can run over 100 frames with all the bells and whistles.
 
I will grant you DLSS 3 increases framerate, but I will also state that is all it does. Remember, the extra frames are not generated by the render pipeline or game engine; they are interpolated frames. In addition, latency and frametime are added. For the user not to notice visual bugs or oddities, you basically need to be operating at hundreds of Hz to begin with. Therefore, it is not good for lower base framerates, but rather for high base framerates, which defeats the purpose to begin with. The input latency is increased and will never be as low as pure DLSS 2 or no DLSS. That latency is not reflected in the displayed framerate, so your eyes are seeing one thing while your hands and brain are feeling another, and it is a disjointed gaming experience at best. All of this can be avoided by simply not using frame generation; with upscaling alone, frametime and latency will instantly be better, vastly improving the gameplay experience in a more meaningful way.
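To put rough numbers on that argument, here's a back-of-the-envelope sketch. It uses a deliberately simplified model of frame generation, with one interpolated frame per rendered frame, an assumed ~10% GPU overhead, and input only being sampled on rendered frames; the overhead figure and the two-frametime latency floor are illustrative assumptions, not measurements of how DLSS 3 actually schedules frames.

```python
# Back-of-the-envelope model of frame generation: one interpolated frame per
# rendered frame, ~10% GPU overhead for the interpolation work, and input only
# sampled on rendered frames (all assumed values for illustration).

def frame_gen_estimate(base_fps: float, fg_overhead: float = 0.10):
    rendered_fps = base_fps * (1.0 - fg_overhead)   # frame gen steals a little render time
    displayed_fps = rendered_fps * 2.0              # every rendered frame gets a generated twin
    # The pipeline has to hold back a rendered frame to interpolate between two
    # of them, so input-to-photon latency is at least ~2 rendered frametimes.
    latency_floor_ms = 2.0 * (1000.0 / rendered_fps)
    return displayed_fps, latency_floor_ms

for fps in (40, 60, 120):
    shown, lat = frame_gen_estimate(fps)
    print(f"base {fps:>3} fps -> ~{shown:3.0f} fps displayed, latency floor ~{lat:4.1f} ms")
```

With those assumptions, a 40 fps base turns into roughly 72 fps on screen but still carries a ~55 ms latency floor, while a 120 fps base lands closer to ~18 ms, which is why the smoothness only holds up when the base framerate is already high.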
 