Is 8GB or 10/12GB vram not enough?

Ranulfo · Sort-of-Regular · Joined: Nov 11, 2019 · Messages: 185 · Points: 43
I thought I'd post a new version of this thread from AT over here since the original poster hasn't been around on AT for a while.

People's thoughts for 2025 and beyond? How long will 10/12GB handle 1080p with the advertised bells and whistles that sell cards today?

A work-in-progress collection of the original links to videos making the case that 8GB of VRAM has been too little since 2021 (links originally posted by BFG10K on AT):

8GB examples:

Dying Light 2, Battlefield 2042, Avatar Frontiers of Pandora, Call of Duty Modern Warfare 3, Starfield, Forza Horizon 5, Horizon Forbidden West, Alan Wake 2, The Last of Us, Tekken 8, FC 24, Detroit Become Human, Death Stranding, Alone in The Dark, Overwatch 2
3060 12GB has higher 1% lows than the 4060 8GB
Cyberpunk 2077, Diablo 4, Hogwarts Legacy, Horizon Zero Dawn, Microsoft Flight Sim 2020, Witcher 3, Watch Dogs Legion skip to 4K + DLSS for each game
Ratchet & Clank, Plague Tale Requiem, Last Of Us, Jedi Survivor, Far Cry 6, Hogwarts Legacy, Forza Horizon 5, Cyberpunk 2077
Far Cry 6, Godfall, Hogwarts Legacy, Forza Horizon 5, Forspoken, Doom Eternal, Watchdogs Legion
Last Of Us, Resident Evil 4, Callisto Protocol, Plague Tale Requiem, Halo Infinite, Forspoken
Last Of Us, Hogwarts Legacy, Resident Evil 4, Forspoken, Plague Tale Requiem, Callisto Protocol https://youtu.be/Rh7kFgHe21k?t=288
Hogwarts Legacy, Last Of Us, Spider Man Remastered, Doom Eternal, Forza Horizon 5 https://youtu.be/R943CbDTq_s?t=1473
Horizon Forbidden West 4060 has issues @ 1080p + DLSS unless you switch off frame generation and reduce settings to high https://youtu.be/3AyKvI23VGw?t=233
Horizon Forbidden West 3070 requires reducing textures 2 notches https://youtu.be/xTwEGy6HHKo?t=351
Deathloop https://youtu.be/lu_XHyO-6zY?t=918
Ghost Recon Breakpoint https://youtu.be/iUGppQVpXzU?t=529
Resident Evil 2 https://youtu.be/Aa-gsgRjMII?t=796
Talos Principle 2 https://youtu.be/K6FATQAuEwI?t=337
Spider Man Remastered https://youtu.be/vRVR_plVSlc?t=1060
Witcher 3 https://youtu.be/vRVR_plVSlc?t=1240



10/12GB examples:




Reasons why still shipping 8GB since 2014 isn't NV's fault:

  • It's the player's fault.
  • It's the reviewer's fault.
  • It's the developer's fault.
  • It's AMD's fault.
  • It's the game's fault.
  • It's the driver's fault.
  • It's a system configuration issue.
  • Wrong settings were tested.
  • Wrong area was tested.
  • Wrong games were tested.
  • 4K is irrelevant.
  • Texture quality is irrelevant as long as it matches a console's.
  • Detail levels are irrelevant as long as they match a console's.
  • There's no reason a game should use more than 8GB, because a random forum user said so.
  • It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  • It's an anomaly.
  • It's a console port.
  • It's a conspiracy against NV.
  • 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  • It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  • Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
 
My observations, based on the analysis by HUB, Daniel Owen, et al.:

With the transformer model, it seems that 1080p max settings can be tested and remain viable at 1080p Quality and 1440p Balanced upscaling (see the sketch at the end of this post for the internal resolutions those modes imply).

However, even then the VRAM limit still applies in the following scenarios:
  1. PS5 ports
    1. Spider-Man 2
    2. Ratchet & Clank
    3. Horizon Zero Dawn
  2. Open-world games that use Nanite-like tech
    1. UE5 — Oblivion Remastered
    2. RE Engine — Monster Hunter Wilds
    3. CryEngine — Kingdom Come: Deliverance 2
    4. Anvil engine — Assassin's Creed Shadows
    5. Snowdrop engine — Star Wars Outlaws
  3. Older games that are 'unoptimized' and will never be optimized in the future
    1. Forza Horizon 5
  4. Games with an adjustable texture cache pool size
    1. id Tech — Indiana Jones
  5. Games with insufficient texture quality
    1. Black Myth: Wukong — it seems the game uses lower-quality textures and sharpens the resulting blur to compensate
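
As a reference for the upscaling point above, here is a minimal, purely illustrative sketch (Python) of the render-resolution arithmetic. The per-axis scale factors are the commonly cited DLSS values (Quality ≈ 0.667, Balanced ≈ 0.58); treat them as my assumption rather than something from the videos. The point is only to show roughly what internal resolutions those modes imply.

```python
# Illustrative only: commonly cited DLSS per-axis scale factors (assumed values).
SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print("1080p Quality  ->", internal_res(1920, 1080, "Quality"))   # ~(1281, 720)
print("1440p Balanced ->", internal_res(2560, 1440, "Balanced"))  # ~(1485, 835)
```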
 
I bought a 1060 6GB a decade ago; it's ridiculous that we're still at 8GB cards at this point, especially with the memory pressure from ray tracing.
 
I bought a 1060 6GB a decade ago; it's ridiculous that we're still at 8GB cards at this point, especially with the memory pressure from ray tracing.
Probably just for the next 6 months.

Nvidia will release the Super series sometime next year. Maybe a $350 5060 Super 12GB by June next year?

AMD's 9060 XT 8GB is not selling. As soon as stock runs out they might sell it as a 9060 non-XT 8GB for $250 to $280.

In 2 years AMD is supposedly going all-LPDDR6 on the low end, so you'll have a true 3060 12GB replacement for $250 or so, as LPDDR6 is cheap and doesn't have the same capacity limitations.
 
I bought a 1060 6GB a decade ago; it's ridiculous that we're still at 8GB cards at this point, especially with the memory pressure from ray tracing.
With TSMC wafer prices going up all the time, they have to cut costs somewhere to keep their huge margins, I suppose.
 
With TSMC wafer prices going up all the time, they have to cut costs somewhere to keep their huge margins, I suppose.
Imagine this line-up (in 2027):

  • AT0
    • 10090 XT+ — multiple models starting at $1500-plus, with huge VRAM like the Radeon VII or a Titan
  • AT1
    • 10080 XT — scrapped (Lisa Su took her toys & went home)
  • AT2 (GDDR7)
    • 10070 XTX 24GB? = $700 (~5080)
    • 10070 XT 18GB? = $600 (~5070 Ti)
    • 10070 GRE 15GB? = $500-$550 (~5070 Super)
  • AT3 (LPDDR6)
    • 10060 XT 24GB = $450-$500 (~5070)
    • 10060 16GB = $400 (~5060 Ti 16GB)
  • AT4 (LPDDR6/LPDDR5X)
    • 10050 XT 32GB = $350 (~9060 XT 16GB)
    • 10050 XT 24GB = $300 (~9060)
    • 10040 XT 16GB = $250 (~3060 12GB in raster)
 
I mean, that would be nice, but if it's 2027 they will need to compete with whatever is next from Nvidia. And it's already known that Nvidia only has to give the tiniest of deer turds about QC, so their time to launch is accelerated.

The only thing that might change that is if Nvidia has 'learned' their lesson amid their oodles of overvalued AI BS and actually makes a solution where power is managed per pin, with a built-in safety shutoff, a REQUIREMENT for their AIBs.

I just don't see that happening.
 
You guys (should!) all know that this comes down to a compromise between the bill of materials and the cost of buses wider than 128-bit for VRAM. That's why 8GB has dominated as the 'floor' for so long, and also why, with 3GB GDDR ICs coming, we can start seeing 12GB cards on 128-bit buses.

Or 18GB cards on 192-bit buses that were previously limited to 12GB.

Or 24GB cards on 256-bit buses that were previously limited to 16GB.
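
To spell that arithmetic out, here's a minimal illustrative sketch (Python). It assumes the standard GDDR arrangement of one IC per 32-bit channel; the clamshell option corresponds to the "double the memory" variants mentioned below.

```python
def vram_capacity_gb(bus_width_bits: int, gb_per_chip: int, clamshell: bool = False) -> int:
    """Capacity = (number of 32-bit channels) x (GB per GDDR IC)."""
    chips = bus_width_bits // 32      # one GDDR IC per 32-bit channel
    if clamshell:
        chips *= 2                    # clamshell mode: two ICs share each channel
    return chips * gb_per_chip

for bus in (128, 192, 256):
    print(f"{bus}-bit bus: {vram_capacity_gb(bus, 2)} GB with 2GB ICs, "
          f"{vram_capacity_gb(bus, 3)} GB with 3GB ICs")
# 128-bit bus: 8 GB with 2GB ICs, 12 GB with 3GB ICs
# 192-bit bus: 12 GB with 2GB ICs, 18 GB with 3GB ICs
# 256-bit bus: 16 GB with 2GB ICs, 24 GB with 3GB ICs
```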

The alternative, of course, has been to double the memory; however, such cards were typically more expensive than the next tier up, while being slower in every situation except those where the smaller card was limited by VRAM capacity. 12GB and 16GB *60-class cards were typically not good values outside of the creator space. The latest 5060 Ti, I believe, is perhaps the first overall exception.

Which means they didn't sell well, which means the ROI for AMD and Nvidia on these odd GPUs just isn't there. ROI is how products get greenlit and made available for customers to buy, not enthusiast whining! But you know that, right?



Further, and more importantly: these 8GB cards do sell. They sell to folks who aren't playing AAA titles and 'just need a GPU', and to gaming cafes, at the least. And for these use cases they excel, because they're fast enough and they're the cheapest cards available.
 
You guys (should!) all know that this comes down to a compromise between the bill of materials and the cost of buses wider than 128-bit for VRAM. That's why 8GB has dominated as the 'floor' for so long, and also why, with 3GB GDDR ICs coming, we can start seeing 12GB cards on 128-bit buses.

Or 18GB cards on 192-bit buses that were previously limited to 12GB.

Or 24GB cards on 256-bit buses that were previously limited to 16GB.

The alternative, of course, has been to double the memory; however, such cards were typically more expensive than the next tier up, while being slower in every situation except those where the smaller card was limited by VRAM capacity. 12GB and 16GB *60-class cards were typically not good values outside of the creator space. The latest 5060 Ti, I believe, is perhaps the first overall exception.

Which means they didn't sell well, which means the ROI for AMD and Nvidia on these odd GPUs just isn't there. ROI is how products get greenlit and made available for customers to buy, not enthusiast whining! But you know that, right?

Further, and more importantly: these 8GB cards do sell. They sell to folks who aren't playing AAA titles and 'just need a GPU', and to gaming cafes, at the least. And for these use cases they excel, because they're fast enough and they're the cheapest cards available.
I mean really... if you're not gaming or doing AI work... or some heavy, HEAVY CAD work... you really don't need that much GPU. Though my laptop's business RTX card can barely handle an animated background for my Teams calls using Broadcast with eye focus. ;) (only 6GB of VRAM)
 
I mean really... if you're not gaming or doing AI work... or some heavy, HEAVY CAD work... you really don't need that much GPU. Though my laptop's business RTX card can barely handle an animated background for my Teams calls using Broadcast with eye focus. ;) (only 6GB of VRAM)

Sadly, given current software, with its bloated code and overly pretty graphics, basic graphics-capable chips (especially the ones built into CPUs) are no longer a good deal. My 8th-gen Intel i5 laptop is showing signs of GPU problems far earlier than the 4th-gen i5 laptop did.
 
8GB isn't enough for Ultra settings at 1080p in modern games. It's no longer just about resolution; game settings really affect VRAM usage now, even at lower resolutions like 1080p. Even at Medium or High settings, depending on the game, 8GB isn't enough.

The fact is, 8GB of VRAM is holding game developers back from progressing game graphics. The minimum spec has to increase. Consoles have been on 16GB (yes, I know it's shared) for 5 years now. That is your indication right there. If game devs are targeting 16GB consoles, and PCs 5 years later still have 8GB of VRAM on their GPUs, what do you think is going to happen? Exactly what we've been experiencing in gameplay today.

8GB should be for $199 cards and below
12GB should be for $200-$300
16GB should be for $300+

Consider also how long people own a GPU: if you are having issues with 8GB, or even 12GB, now, imagine what that will be like in 2, 3, or 4 years with that same GPU.
 
Even 16GB doesn't feel very future-proof. If next-gen consoles come with 32GB of unified RAM, then things might get rough for 16GB pretty quickly. 8GB is useless these days.
 
24GB+ is what I would target if you want a video card able to keep up with or outperform the next gen of consoles while keeping it for 5+ years.
I think that is the salient point: what performance is being targeted.

Prefacing my remarks: I am a single-player and co-op gamer only. No competitive MP of any kind.

I have been messing around with a 4-year-old Asus Dual 3060 12GB from my son's system, as I swapped him to an RX 7800 XT. DLSS TM works so well, even when forced to start from 1080p in the most demanding games, that it is extending the card's useful life. I am sitting 8 ft from a 55" 4K gaming TV, AKA the console experience. I can also use RT reflections and turn up textures, thanks to the 12GB. That makes for better IQ than being forced to 1080p medium because the frame buffer is too small.
 