Godfall Requires 12 GB of VRAM at 4K with Ultra HD Textures, Which Could Pose a Problem for NVIDIA’s GeForce RTX 30 Series

Tsing

[Godfall key art. Image: Counterplay Games]



The minimum and recommended PC specifications that we shared for Godfall yesterday hinted that the fantasy looter-slasher would require some pretty beefy hardware, especially at higher graphics settings.



This has been confirmed in a new video starring Counterplay Games CEO Keith Lee, who made some very interesting comments about the kind of hardware gamers will need to experience Godfall at its maximum fidelity.



Lee revealed that Godfall requires 12 GB of VRAM to run at the 4K resolution setting with Ultra HD textures enabled. That’s bad news for NVIDIA’s...

Continue reading...


 
Will wait to see actual benchmarks before I fall into the "sky is falling" camp.

I mean, after all, if you had wanted to play with 4K MAX ULTRA SUPER settings, you wouldn't have skimped with "just" a 3080 and you would have shelled out the cash for the 3090, right?

Also, I'm sure DLSS will save this.
 
Except NVIDIA has been saying that the 3080 is the 4K card and the 3090 is an 8K card.
 
I kind of hope it does 100% require 12 GB of VRAM at max settings, just to see how NVIDIA responds to people after promising that 10 GB would be fine.
That said, I am more on the side that it allocates 12 GB of VRAM and requires much less.
 
Whether poorly optimized or truly needing it, I've been seeing a few games skim around, or over, the 10 GB mark for years now in 4K, so this isn't much of a surprise. It is a shame that NV made the 3080 with 10 GB, because it will see limits with other games, I'm sure. The rumored 20 GB variant is probably going to seem a lot more appealing as other games go over 10 GB as well. Meanwhile, I'd say the current 3080 is probably a fantastic 1440p card and will do fine with less demanding titles in 4K.
 
I play at 4K with texture-heavy games, so VRAM size is an issue for me. I've maxed out the 11 GB in my 1080 Ti, and I'm not going to pay good money to go backwards. So long as benchmarks look decent, I will likely be picking up an AMD card. I think the price of the 3090 is ridiculous.
 
Yeah, I've leaned more and more toward a 6800 XT. It will make it easier to go 4K when the time comes.
 
The question with VRAM-heavy games is whether they need it or just fill it because it's there. Now, this one might actually need it, but some games don't really. We'll see what happens; I might go for an AMD card for the memory if one can be found, though I may need to get a FreeSync monitor too then.
 
If a game is trying to handle more textures than fit into VRAM, then the system has to keep swapping things out, taking the time to load new textures as needed from main RAM or the hard disk. This can cause slowdowns, stalls, and stuttering.
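To put a rough picture on that, here's a toy sketch (Python, purely illustrative: the 10 GB budget and 128 MB texture size are made-up numbers, and real engines stream far more cleverly than a plain LRU cache):

```python
# Toy model of texture thrashing: an LRU cache standing in for VRAM.
# When the per-frame working set exceeds the budget, every frame forces
# evictions and re-uploads, which is where the stalls and stutter come from.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()   # texture id -> size in MB
        self.used_mb = 0
        self.uploads = 0                # uploads after warm-up = potential stalls

    def touch(self, tex_id, size_mb):
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)   # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_mb = self.resident.popitem(last=False)
            self.used_mb -= evicted_mb
        self.resident[tex_id] = size_mb
        self.used_mb += size_mb
        self.uploads += 1

# A ~11.5 GB working set (90 textures x 128 MB) against a 10 GB budget:
cache = TextureCache(budget_mb=10 * 1024)
for frame in range(3):
    for tex_id in range(90):
        cache.touch(tex_id, 128)
print(cache.uploads)  # climbs every frame: the set never fits, so it thrashes
```

With a 12 GB budget the same loop uploads each texture once and then stays quiet.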
 
Much like SQL Server, for example, many games will take as much memory as they can get; that doesn't mean they actually need it.

That said, 10 GB does fall short, even for 4K.
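For what it's worth, you can watch the number the driver reports yourself. A minimal sketch, assuming an NVIDIA card and the pynvml package; keep in mind NVML only shows what's been allocated, which is exactly the point: a big number here doesn't prove the game needs it.

```python
# Poll driver-reported VRAM usage while a game runs (Ctrl-C to stop).
# Allocation is all this shows; a flat maximum here is not a hard requirement.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {info.used / 2**30:.1f} / {info.total / 2**30:.1f} GiB")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```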
 
Will be hilarious when this unoptimized clusterfuck of a game takes up all the VRAM regardless of amount.
 
Hardly. Just because most games don't use that much doesn't mean that utilizing 12 GB of VRAM is a problem. Godfall may very well employ features or textures that utilize extra VRAM, or be designed in some way to do so. Modders have been using extra VRAM since the beginning of texture modding. Every texture I created for ME3 mods was 4x the size of its original. Do that enough, and you can eat up a lot of VRAM. I'm not saying that's what's going on here, but it's ridiculous to assume that "being unoptimized" is the only reason a game could possibly use 12 GB of VRAM at 4K.
 
Well, at this stage, assumptions are all I've got.
 
That's my point. Without having more information, assuming that using 12GB of VRAM at 4K comes down to a lack of optimization is premature. There is simply no reason to think that at this time.
 
As is assuming it's because of the 4K textures.
 
The developer stated that using 4K textures uses 12 GB... It's not as much of an assumption as assuming that it's just unoptimized.
A 4K texture takes about 128 MB+ depending on features (RGB alpha, displacement, surface normals, etc.). So, if you have 100 textures, that's 12+ GB alone... not including the frame buffer, internal render buffers, or any geometry and other required data. Which means in reality it's probably closer to 50 4K textures, and that doesn't seem that far out there for a scene. Maybe they won't all be on screen simultaneously and there will only be small, infrequent dips in performance... Who knows until we actually have testing done. Either way, whether it's texture size or poor optimization, the only thing that matters to a gamer is how it runs on their hardware.
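For anyone who wants to sanity-check that arithmetic, here's the back-of-the-envelope version (Python; the dimensions and uncompressed RGBA format are my assumptions, not Godfall's actual asset budget):

```python
def mip_chain_mb(width, height, bytes_per_pixel):
    """Texture plus its full mip chain, in MB (the mips add roughly 1/3)."""
    base = width * height * bytes_per_pixel
    return base * 4 / 3 / 2**20

# One uncompressed 4096x4096 RGBA8 map:
per_map = mip_chain_mb(4096, 4096, 4)        # ~85 MB
# A material usually carries several maps (albedo, normal, displacement...),
# so 128 MB+ per "texture" is plausible with a second map or wider formats;
# block compression (BCn) would cut these numbers substantially.
print(f"one map: {per_map:.0f} MB, two maps: {2 * per_map:.0f} MB")
print(f"100 textures at 128 MB: {100 * 128 / 1024:.1f} GB")   # ~12.5 GB
```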
 
Either way, whether it's texture size or poor optimization, the only thing that matters to a gamer is how it runs on their hardware.


And that's the main issue here. The game runs poorly even without RTX/ultra settings. We'll have to see how it performs on the 6000 series to find out if it's really bad programming or the RTX series just can't handle it.
 
I never said or assumed it was due to 4K textures. I simply used modding games as an example of how higher-resolution textures consume more VRAM. My point was that there are other causes besides "the game isn't optimized."
I get it, it's just that there are so many examples of bad optimization, especially on PC ports, that I just had to go with the odds... :D :D :rolleyes: :rolleyes:
 
Well, it's a definite possibility. But I am not going there without having played the game or seeing it run on modern hardware in the hands of the public.
 