Report: NVIDIA to Relaunch GeForce RTX 2060 with More Memory (12 GB) by January 2022

Tsing
The FPS Review Staff member
[Image: NVIDIA]



NVIDIA launched the GeForce RTX 2060 on January 15, 2019, as part of its Turing generation of GPUs that introduced ray tracing to the gaming mainstream. New rumors suggest that the card will be returning with more memory in response to current market conditions.



According to sources with VideoCardz, NVIDIA will be re-releasing the GeForce RTX 2060 with 12 GB of GDDR6 memory. This is twice the amount of memory that the original GeForce RTX 2060 Founders Edition featured (6 GB) and 4 GB more than what can be found in the GeForce RTX 2060 SUPER, a faster version of the card that was introduced on July 9, 2019.



The new GeForce RTX 2060 with 12 GB of GDDR6 memory will reportedly be released by January 2022.



[…] NVIDIA is now planning an updated RTX 2060 GPU based on Turing architecture. A model known as PG161 would feature 12GB GDDR6...



 
This makes zero sense unless they just have parts lying around to assemble cards with, and they know people will buy them.
 
I'd have said this would be great for certain productivity tasks with the extra memory, but it's pointless as these will end up with miners as well.
 
It's a 12 nm part - there may be some capacity open at 12 nm at TSMC for them to make more. Basically, if Nvidia can get fab space at different nodes, they may be able to make 1050 Tis at 16 nm, 2060s at 12 nm, 30X0 at Samsung, and 40X0 at 5 nm at TSMC. They will sell everything they make, and they won't need to do any real development work to start making them again, so why not?
 
Probably a better option than some of the $300+ 1050 Tis I've been seeing. It just depends on how much these things end up selling for.
 
Why waste memory on it? Just so there's more of a memory shortage? That may be it.
 
Similar rumours have been floating around for a while now, all have been fake so far.
 
Why waste memory on it? Just so there's more of a memory shortage? That may be it.
It's possible the industry has moved to a larger capacity per module, making it more economical to do 12 GB cards.
 