NVIDIA Reportedly Launching GeForce RTX 3080 Ti with 12 GB of GDDR6X Memory in April

Tsing (The FPS Review staff member)
Image: NVIDIA



NVIDIA will reportedly launch its oft-rumored GeForce RTX 3080 Ti graphics card in April. The news comes from VideoCardz and popular hardware leaker kopite7kimi, who claims that the Titanium variant of the flagship gaming GPU has undergone some significant downgrades since the initial leaks in December. Apparently, the GeForce RTX 3080 Ti will feature only 10,240 CUDA cores (80 Streaming Multiprocessors) as opposed to 10,496. Memory has also been reduced from 20 GB to 12 GB of VRAM, which suggests that NVIDIA made a concerted effort to distance the SKU even further from its BFGPU, the GeForce RTX 3090 (10,496 CUDA cores, 24 GB GDDR6X).



“The most recent update to the NVIDIA product roadmap lists GeForce...



 
12, 20, whatever; not like it will actually see market availability. Or an MSRP worth a ****.
 
Well, that's disappointing. It means I'm skipping this gen even if there were actual availability.
 
I've never seen a card go through so many rounds of coming out, then being cancelled, then coming out again, then being delayed, and now here we are, a few months away again.
 
This, at least, makes a little more sense. It never seemed totally plausible that they'd release a Ti variant so close to their premier card. Sure, we've seen some unusual things in the past, when SLI was still a thing and you could buy two Tis and accomplish a lot, but we all know that time is gone. Releasing a 3080 Ti that achieved nearly everything the 3090 does never really made sense. I'd only suggest that it should be closer to 14 or 16 GB instead of 12, but that's another matter, and a perception based on demanding games at 4K.

At this point, though, none of it really matters, since no one can really buy anything right now. It takes luck, or giving in to a scalper plus the added cost of the new tariffs. Factor in the various supply issues and it only gets worse.
 
I have mostly written off the 3000 series cards personally.
I play my fav games fine with what I've got, and bragging rights don't mean **** to me, so yeah... /shrug

I'm impulsive tho, so if the right stuff drops at the right time, then, who knows?

Also I have been more focused on some personal art projects as of late, so gaming has become very casual for me.
No complaints.
 
Why do people keep forgetting that memory size is TIED to bus width? At 384-bit, the max memory size is 12GB.
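The arithmetic behind that claim can be sketched in a few lines (a rough illustration, assuming GDDR's 32-bit channels and the common 8Gb, i.e. 1 GB, chip density; the function name is made up):

```python
# Back-of-the-envelope VRAM capacity from bus width.
# Assumes one GDDR chip group per 32-bit channel, 1 GB (8Gb) per chip.
def vram_gb(bus_width_bits, chip_gb=1, chips_per_channel=1):
    channels = bus_width_bits // 32      # GDDR uses 32-bit channels
    return channels * chip_gb * chips_per_channel

print(vram_gb(384))                       # 12 -> the rumored 3080 Ti config
print(vram_gb(384, chips_per_channel=2))  # 24 -> a clamshell layout like the 3090
```

The "max is 12GB" framing only holds for one 8Gb chip per channel; doubling chips per channel (or chip density) doubles capacity on the same bus.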
 
I have mostly written off the 3000 series cards personally.
I play my fav games fine with what I've got, and bragging rights don't mean **** to me, so yeah... /shrug

I'm impulsive tho, so if the right stuff drops at the right time, then, who knows?

Also I have been more focused on some personal art projects as of late, so gaming has become very casual for me.
No complaints.
Are you... me?

I feel uncomfortable about all this.
 
Why do people keep forgetting that memory size is TIED to bus width? At 384-bit, the max memory size is 12GB.
Guess that's why the 3090 and RTX Titan have 24 GB.

That being said, it seems the only way to get to 16 is to backtrack to 256-bit, which, of course, doesn't make sense. The only other alternative is to double up to 20, per previous rumors, at 356.
 
Guess that's why the 3090 and RTX Titan have 24 GB.

That being said, it seems the only way to get to 16 is to backtrack to 256-bit, which, of course, doesn't make sense. The only other alternative is to double up to 20, per previous rumors, at 356.
20GB would still be a 320-bit bus. All GDDR is 32 bits per channel.
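Running that arithmetic backwards shows why 20 GB implies 320-bit rather than 356: divide the capacity target by the GB hung off each 32-bit channel (a sketch; the helper name is invented):

```python
# Implied bus width for a capacity target, given GB per 32-bit channel.
def bus_width_bits(total_gb, gb_per_channel=1):
    channels = total_gb // gb_per_channel
    return channels * 32

print(bus_width_bits(12))                    # 384 -> 12 GB, one 1 GB chip per channel
print(bus_width_bits(20, gb_per_channel=2))  # 320 -> the earlier 20 GB rumor, doubled chips
```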
 
From what I understand, once you go past 384-bit it becomes much more complicated and expensive, more so at high bandwidth speeds. That's why we don't see 512-bit bus cards any more.
 
From what I understand, once you go past 384-bit it becomes much more complicated and expensive, more so at high bandwidth speeds. That's why we don't see 512-bit bus cards any more.
I think the only reason 512-bit was a thing was that AMD wanted 4GB on their video card when GDDR only went up to 2Gb per chip. We have 16Gb GDDR now; it's just in very short supply and more expensive than just doubling up the 8Gb chips on a video card at the moment.
 
From what I understand, once you go past 384-bit it becomes much more complicated and expensive, more so at high bandwidth speeds. That's why we don't see 512-bit bus cards any more.
Matrox... Parhelia. Remembered the name while I was thinking of how I was going to say that I was too lazy to look it up, so the spelling may still be off...

Matrox traded hard on the '512-bit' thing. 2002-ish. It was supposed to be their 'comeback' after 3Dfx showed the world what was possible with a higher-end, gaming-focused... 3D accelerator. They weren't 'GPUs' then.

It failed spectacularly, and Matrox never recovered. Whatever market they had has been eaten by Quadros, though since I've seen them around, I assume they exist only in the product segments that even AMD doesn't bother to compete in. Many-monitor outputs and the like, I think.
 

I remember the Parhelia 512, but it actually had a 256-bit memory bus.
I really thought its Fragment AA was going to catch on, but it faded away almost instantly.

I think the last card with a 512-bit interface was the Radeon 390, but don't quote me on that one.
 
I think the last card with a 512-bit interface was the Radeon 390, but don't quote me on that one.
Yes, it used the same memory controller as the 290, only with sixteen 4Gb memory chips instead of 2Gb to reach 8GB.
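As a sanity check on those numbers (sixteen chips at 32 bits apiece, chip densities in gigabits):

```python
# Hawaii/Grenada-style layout: sixteen GDDR5 chips, 32 bits each.
chips = 16
bus_bits = chips * 32          # 512-bit total bus
r9_290_gb = chips * 2 // 8     # sixteen 2Gb chips -> 4 GB
r9_390_gb = chips * 4 // 8     # sixteen 4Gb chips -> 8 GB
print(bus_bits, r9_290_gb, r9_390_gb)    # 512 4 8
```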
 
I remember the Parhelia 512, but it actually had a 256-bit memory bus.
I really thought its Fragment AA was going to catch on, but it faded away almost instantly.
Was that a DDR marketing thing then?

I'm hesitant to try searching just due to all the misinformation that's certain to get in the way.

And I think Matrox was just riding the exposure wave from saying something like '512-bit' out loud. They had to know their product wasn't going to deliver anything close to what the community was hyping it up to be at the time, and while it was probably great at the things Matrox did great back then, they really did seem to crash hard and fast afterward.
 
Was that a DDR marketing thing then?
The '512' does not refer to the memory bus width but to the chip architecture, like the Riva 128 was 128-bit, the GeForce 256 was 256-bit, etc. That was in the ancient days of BS (Before Shaders) :D ;)
 