Sources: NVIDIA GeForce RTX 3090 Replaces RTX 3080 Ti with 24 GB of GDDR6X RAM

Tsing
The FPS Review

Image: JDSP



According to VideoCardz’s AIB sources, NVIDIA won’t be launching a GeForce RTX 3080 Ti (or at least not yet). The GeForce RTX 3090 is definitely taking its place, and it’ll flaunt an incredible 24 GB of GDDR6X memory. The GeForce RTX 3080, on the other hand, will have only 10 GB.



“We only have the capacities confirmed, but we can assume that RTX 3090 features a 384-bit bus and RTX 3080 gets 320-bit,” the author added. “The memory speeds for both cards are expected to be at least 19 Gbps.”
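If those figures hold, the implied memory bandwidth is easy to sanity-check. A rough sketch (the bus widths are the quoted author's assumption, not confirmed specs):

```python
# Back-of-the-envelope bandwidth check for the rumored memory configs.
# Bandwidth (GB/s) = bus width (bits) x per-pin speed (Gbps) / 8 bits per byte.
def memory_bandwidth_gbs(bus_width_bits: int, pin_speed_gbps: float) -> float:
    return bus_width_bits * pin_speed_gbps / 8

print(memory_bandwidth_gbs(384, 19))  # rumored RTX 3090: 912.0 GB/s
print(memory_bandwidth_gbs(320, 19))  # rumored RTX 3080: 760.0 GB/s
```

At "at least 19 Gbps," that would put the RTX 3090 just shy of the 1 TB/s mark.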



This report suggests that those rumors of NVIDIA ditching the Ti (Titanium) label are true. Presumably, the SUPER branding will be used from here on out for future, more powerful GPU variants.



VideoCardz has also heard that NVIDIA will be...


 
I guess that could also confirm that they're finally using 16Gb memory chips, at least on the 3090, since GDDR6 is 32 bits per channel.
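The capacity math behind that comment can be sketched like this: GDDR6/GDDR6X presents a 32-bit channel per chip, so the bus width fixes the chip count, and the chip density (8 Gb vs. 16 Gb) then fixes total VRAM (the 320-bit/8 Gb pairing for the 3080 is an assumption consistent with the rumored 10 GB):

```python
# Sketch: VRAM capacity from bus width and per-chip density.
def vram_capacity_gb(bus_width_bits: int, chip_density_gbit: int) -> int:
    chips = bus_width_bits // 32           # one 32-bit channel per chip
    total_gbit = chips * chip_density_gbit
    return total_gbit // 8                 # 8 Gbit per GB

print(vram_capacity_gb(384, 16))  # 12 chips x 16 Gb = 24 GB (rumored RTX 3090)
print(vram_capacity_gb(320, 8))   # 10 chips x 8 Gb  = 10 GB (rumored RTX 3080)
```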
 
Are they still planning a Titan as well, or is this an 80 Ti and Titan love child?
 
Open those wallets... Should be a beast though.
 
Hmm..

970 - 148W
1070 - 150W
2070 - 175W
and now
3070 - 220W

Granted, the 770 was a 230W card, so maybe I'm just reading too much into it.. but it really makes me think they're pushing hard to get rasterization improvements. I predict we'll see claims of huge numbers, like a 200% performance increase, which will pertain only to ray tracing and/or DLSS. Actual rasterized performance... I'm betting on 20%.

But maybe I'm wrong. There is a die shrink in there now, but that's a huge power bump to go along with a die shrink... and usually you don't push power envelopes past what you need to be competitive, as it just drives up cooling requirements and costs.
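Treating the wattages in the list above as TDPs, the generation-to-generation jumps work out like this (a quick sketch; the 220 W figure for the 3070 is itself only a rumor):

```python
# Rumored/actual TDPs of successive x70-class cards (3070 figure is rumor).
tdps = [("970", 148), ("1070", 150), ("2070", 175), ("3070", 220)]

# Percentage TDP increase from each card to its successor.
for (prev, p_w), (cur, c_w) in zip(tdps, tdps[1:]):
    print(f"{prev} -> {cur}: {(c_w - p_w) / p_w * 100:+.1f}%")
# 970 -> 1070: +1.4%
# 1070 -> 2070: +16.7%
# 2070 -> 3070: +25.7%
```

That last jump would be by far the largest of the three, which is what the comment is reacting to.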
 
Smarter to wait for the super and/or a hopeful response from AMD?
 
> Hmm..
>
> 970 - 148W
> 1070 - 150W
> 2070 - 175W
> and now
> 3070 - 220W
>
> Granted, the 770 was a 230W card, so maybe I'm just reading too much into it.. but it really makes me think they're pushing hard to get rasterization improvements. I predict we'll see claims of huge numbers, like a 200% performance increase, which will pertain only to ray tracing and/or DLSS. Actual rasterized performance... I'm betting on 20%.
>
> But maybe I'm wrong. There is a die shrink in there now, but that's a huge power bump to go along with a die shrink... and usually you don't push power envelopes past what you need to be competitive, as it just drives up cooling requirements and costs.

The die shrink also allows you to get more performance by increasing your transistor budget, for lack of a better term. That's what AMD did: at 7 nm there was a significant reduction in power consumption, but in practice power consumption ended up similar to Zen+. The design spent the savings on more transistors, more L3 cache, and more performance.
 
> The die shrink also allows you to get more performance by increasing your transistor budget, for lack of a better term. That's what AMD did: at 7 nm there was a significant reduction in power consumption, but in practice power consumption ended up similar to Zen+. The design spent the savings on more transistors, more L3 cache, and more performance.
Was going to say. We don't know what Ampere's cores look like yet, so even though the 3080 has the same number of cores as the 2080 Ti, the transistor count and architecture are going to look different. We also don't yet know how many RT cores each GPU has.
 