MSI GeForce RTX 3080 Ti Shipping Labels Confirm 12 GB of GDDR6X RAM

Tsing (The FPS Review Staff member)
Image: NVIDIA



Contrary to earlier reports suggesting that NVIDIA’s GeForce RTX 3080 Ti graphics cards have only just entered mass manufacturing, a user on Facebook has shared images of shipments from MSI confirming that plenty of units of the highly anticipated Titanium variant have already been produced. What’s even better is that the labels on the images confirm the memory configuration of the GeForce RTX 3080 Ti. Despite previous speculation about 20 GB or 16 GB of RAM, NVIDIA’s new Ampere flagship GPU will feature just 12 GB of GDDR6X memory—2 GB more than the standard GeForce RTX 3080, but quite a bit less than green team’s BFGPU, the GeForce RTX 3090 (24 GB).



Image: MSI GeForce RTX 3080 Ti Ventus 3X 12G OC shipping label




NVIDIA GeForce RTX 3080 Ti is the upcoming enthusiast...



 
That would be a decent card, if it's ever seen by the masses. Shame it'll have the MSI upcharge.
 
That would be a decent card, if it's ever seen by the masses. Shame it'll have the MSI upcharge.
Shoot, I thought they were the cheap brand. Got my 1080Ti with Corsair AIO pre-installed at Nvidia's base 1080Ti MSRP, now going strong in my brother's desktop. Can't believe I got that back then either.

Anyway, I'm interested in these. I hate that this isn't going to be 20GB+ (i.e., double), but at this point, I just want a GPU to play current games with the settings turned up on my 38" monitor.
 
There's barely any performance difference between the 3080 and the 3090 at the resolutions the vast majority of people use, namely 1080p and 1440p, so maybe the people who'd pay extra for this want it purely as a cheaper alternative to the 3090 for 4K gaming...?
 
Clearly the people who are still stuck on 1080p are not in the market for high-end GPUs. They should spend their money on a display first.
 
There's barely any performance difference between the 3080 and the 3090 at the resolutions the vast majority of people use, namely 1080p and 1440p, so maybe the people who'd pay extra for this want it purely as a cheaper alternative to the 3090 for 4K gaming...?
Clearly the people who are still stuck on 1080p are not in the market for high-end GPUs. They should spend their money on a display first.
Could just be they want more VRAM for other projects?

I'm fine with 10GB or 12GB just for gaming. It's the video editing that makes me feel like that's 'constraining'!
 
For 4K gaming, this will make a bit more sense since 12 GB is fine for the vast majority of titles. 10 GB was a little sketchy, as I've seen a number of GPU-intensive games hover around there or higher. I'd say it's also likely the MSRP will be very close to the 3080's, so the only real challenge will be for gamers to get one. However, at anything less than 4K, this card won't really benefit most people except for those really trying to crank out more FPS, and even then the 3080 ought to be really close there too.
 
Could just be they want more VRAM for other projects?

I'm fine with 10GB or 12GB just for gaming. It's the video editing that makes me feel like that's 'constraining'!
If you want to do "other" things you should buy the 3090; that's what NVIDIA is telling us, at least.
 
This is much more appealing than the RTX 3080 was for those with an RTX 2080 Ti. I know I was put off by the 3080.
Now, prices, on the other hand...
 
If you want to do "other" things you should buy the 3090; that's what NVIDIA is telling us, at least.
To a degree at least. Plenty can be done with less, but for me, the biggest concern is that I keep my GPUs for three release cycles. I know whatever I buy today won't be top of the line in three or four years; I just want to make sure that it's still useful!
 
To a degree at least. Plenty can be done with less, but for me, the biggest concern is that I keep my GPUs for three release cycles. I know whatever I buy today won't be top of the line in three or four years; I just want to make sure that it's still useful!
I know maybe two people who use 3-4 year old gear for video editing at an at least semi-serious level, and they're on Macs.
Everyone else involved in any kind of production replaces/upgrades every 2 years or so.
A friend used to teach post video at NYU; they keep their stuff longer at EDUs.
 
If I were editing, uh, "professionally", i.e., for paid work, I'd be upgrading more often too, GPUs and supporting infrastructure. Like, I'd have a Threadripper with maxed RAM and at least one PCIe x16 slot filled with four striped NVMe drives just for scratch speed.

On my less-professional desktop I'm mostly just concerned with buying something that doesn't have a 'soft' EOL built in. A powerful GPU with too little VRAM to handle, say, 8K footage reliably would be on that list.

Now, I don't know that to be true; the above example is a purely theoretical supposition on my part. I'd be doing the research if the occasion of a potential purchase were to occur, and today I don't even know if that'll happen this year :D
 
I'd be doing the research if the occasion of a potential purchase were to occur, and today I don't even know if that'll happen this year :D

I've written off 2021 and the RTX 3K series personally. I can hang until better days and RTX 4K.
 
I've written off 2021 and the RTX 3K series personally. I can hang until better days and RTX 4K.
I guess I'm pretty much there with you, bro. I'd snap a 3080 Ti up at MSRP today, if that's 3080 MSRP + US$100 like it should be, but if availability doesn't improve or actual retail prices stay at MSRP * 1.5, then I'll hang with you, kicking and screaming, but hang nonetheless ;)
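Just to put rough numbers on that (the $699 below is the GeForce RTX 3080 Founders Edition launch MSRP; the +US$100 and the 1.5x street markup are only the scenarios from my post above, nothing announced), a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope pricing for the scenario above. Only the $699 RTX 3080
# launch MSRP is a real figure; the +$100 and the 1.5x markup are hypothetical.
RTX_3080_MSRP = 699                        # USD, Founders Edition launch price
hoped_3080_ti_msrp = RTX_3080_MSRP + 100   # the "+US$100" I'd be happy with
street_at_1_5x = hoped_3080_ti_msrp * 1.5  # the "MSRP * 1.5" scalper scenario

print(f"Hoped-for 3080 Ti MSRP: ${hoped_3080_ti_msrp}")
print(f"Street price at 1.5x:   ${street_at_1_5x:.0f}")
```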
 
For 4K gaming, this will make a bit more sense since 12 GB is fine for the vast majority of titles. 10 GB was a little sketchy, as I've seen a number of GPU-intensive games hover around there or higher. I'd say it's also likely the MSRP will be very close to the 3080's, so the only real challenge will be for gamers to get one. However, at anything less than 4K, this card won't really benefit most people except for those really trying to crank out more FPS, and even then the 3080 ought to be really close there too.

I'd argue that 16GB would make the most sense for 4K gaming. It would allow more headroom for gamers who mod games or enable features that consume more VRAM. I've been able to get over 13GB of VRAM allocated to Cyberpunk 2077, as an example. I don't know that 12GB would perform differently at the same settings, as allocation doesn't equate to actual usage. That being said, if I can get games to consume close to 12GB of VRAM, then I'd rather have a bit extra if the card is going to have a long service life.

We've had 11 and 12GB cards for a number of years now, going all the way back to the Maxwell Titan X. I think 12GB or less is anemic for an ultra-high-end gaming card.
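If anyone wants to sanity-check the allocation-versus-usage point themselves, here's a minimal monitoring sketch using the pynvml bindings (the nvidia-ml-py package); it's just my own rough idea, and the figures it prints are allocations reported by the driver, not what the game actually touches each frame:

```python
# Rough VRAM monitor: prints device-wide and per-process allocated GPU memory
# once a second. Assumes pynvml (pip install nvidia-ml-py) and an NVIDIA driver.
# Reported values are *allocations*, not memory the game actively uses per frame.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    for _ in range(60):  # watch for about a minute
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"device: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")
        for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
            # usedGpuMemory can be None on some platforms (e.g. Windows WDDM)
            if proc.usedGpuMemory is not None:
                print(f"  pid {proc.pid}: {proc.usedGpuMemory / 2**30:.1f} GiB allocated")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```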
 
I'd argue that 16GB would make the most sense for 4K gaming. It would allow more headroom for gamers who mod games or enable features that consume more VRAM. I've been able to get over 13GB of VRAM allocated to Cyberpunk 2077, as an example. I don't know that 12GB would perform differently at the same settings, as allocation doesn't equate to actual usage. That being said, if I can get games to consume close to 12GB of VRAM, then I'd rather have a bit extra if the card is going to have a long service life.

We've had 11 and 12GB cards for a number of years now, going all the way back to the Maxwell Titan X. I think 12GB or less is anemic for an ultra-high-end gaming card.
Oh, I totally agree. I honestly thought that 16GB was going to be what they did, but... oh well. At least that extra 2GB should make a bit of a difference from 10GB. It's a pretty wacky thing all around. At least with this there's a chance they can keep the cost close to the same, MSRP anyway.
 
I'd argue that 16GB would make the most sense for 4K gaming. It would allow more headroom for gamers who mod games or enable features that consume more VRAM. I've been able to get over 13GB of VRAM allocated to Cyberpunk 2077, as an example. I don't know that 12GB would perform differently at the same settings, as allocation doesn't equate to actual usage. That being said, if I can get games to consume close to 12GB of VRAM, then I'd rather have a bit extra if the card is going to have a long service life.

We've had 11 and 12GB cards for a number of years now, going all the way back to the Maxwell Titan X. I think 12GB or less is anemic for an ultra-high-end gaming card.
I'm using VRAM for professional tasks, and you can never really fully utilize the physical RAM; you start having problems after going over 65-70%. Probably due to fragmentation, but that's just my guess. So I was hoping for more as well. 11 GB -> 12 GB is not a worthwhile upgrade for me.
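For what it's worth, a quick way to keep an eye on that margin is to watch the used/total ratio and flag when it crosses whatever fraction you trust. The 0.65 below is just my anecdotal comfort zone from above, nothing measured; a minimal sketch, again assuming the pynvml bindings:

```python
# Warn when device-wide VRAM usage crosses a chosen fraction of physical capacity.
# The 0.65 threshold mirrors the anecdotal 65-70% comfort zone mentioned above.
import pynvml

HEADROOM_THRESHOLD = 0.65  # anecdotal rule of thumb, adjust to taste

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    ratio = mem.used / mem.total
    print(f"VRAM: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB ({ratio:.0%})")
    if ratio > HEADROOM_THRESHOLD:
        print("Above the comfort zone; expect paging or fragmentation headaches.")
finally:
    pynvml.nvmlShutdown()
```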
 