Battlefield 2042 Open Beta Requires 16 GB of VRAM at 8K

Peter_Brosdahl

Image: DICE



Early benchmarks have shown Battlefield 2042’s open beta requiring 16 GB of VRAM at 8K. A GeForce RTX 3090 and multiple Radeon cards were used in the tests. Up to 13 GB of VRAM was used at 8K with the Medium preset, where the GeForce RTX 3090 manages an average of around 30 FPS. At Ultra, VRAM usage climbed by another 2 GB, pushing the AMD cards to their 16 GB limit; at that point, none of the cards could go beyond 28 FPS.
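For rough context, here is a back-of-the-envelope sketch of why screen-sized buffers balloon at 8K. The buffer formats and sizes below are illustrative assumptions, not anything measured in the beta:

```python
# Rough estimate of raw render-target memory at 4K vs. 8K.
# Formats are illustrative assumptions; a real engine allocates many more
# buffers (G-buffer, shadow maps, post-processing targets, textures, etc.).

def target_mib(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    color = target_mib(w, h, 8)  # e.g. an RGBA16F color buffer
    depth = target_mib(w, h, 4)  # 32-bit depth buffer
    print(f"{name}: ~{color:.0f} MiB color + ~{depth:.0f} MiB depth per target")

# 8K has four times the pixels of 4K, so every screen-sized buffer quadruples,
# which is a big part of why usage jumps into the mid-teens of GB at 8K.
```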



The final game will have DLSS, which could substantially improve the framerate on the GeForce RTX 3090, possibly bringing it closer to 60 FPS at the Medium preset. The GeForce RTX 3080 Ti was not used here, but similar results could be expected given how it compares to the GeForce RTX 3090. There has been no word on whether FSR will be added to the game, which would also improve performance at 8K...



 
Yea I'm still happy with 1440p for my computer and looking at 4k for the living room tv. 8k gaming is useless to me.

Plus when the HECK will people understand that DLSS means you are NOT GAMING AT 4K or 8K. You might as well advertise and review the performance at the ACTUAL RESOLUTION THAT YOU ARE GAMING AT.

I know that's awful yelly and I apologize, it's just better upscaling.

woooooooooooohooooooooooo.... <-sarcastically impressed


I mean am I wrong here?
 
8K here we come! Said no one ever.

Yeah, 8K rendering isn't completely useless, but its uses sure are pretty limited.

Estimates of the human eye's resolution range from approximately 0.39 to 0.59 arc minutes per line pair, depending on the study. At 8K you are going to out-resolve the human eye unless you are really close, but if you are that close, the whole screen won't fit in your field of view, so you'll be looking at just a small section of the screen.
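To put rough numbers on that, here's a quick calculation. The 65-inch screen size and the viewing distances are example values I picked, not anything from the thread; the 0.39-0.59 arcmin figures are the ones cited above.

```python
import math

# How many arc minutes does one 8K line pair (2 pixels) subtend, and how much
# horizontal field of view does the screen cover, at a given viewing distance?

def screen_width_in(diagonal_in, aspect=(16, 9)):
    ar = aspect[0] / aspect[1]
    return diagonal_in * ar / math.hypot(ar, 1)

def line_pair_arcmin(diagonal_in, horizontal_px, distance_in):
    pair_w = 2 * screen_width_in(diagonal_in) / horizontal_px  # 2 pixels wide
    return math.degrees(2 * math.atan(pair_w / (2 * distance_in))) * 60

def horizontal_fov_deg(diagonal_in, distance_in):
    return math.degrees(2 * math.atan(screen_width_in(diagonal_in) / (2 * distance_in)))

for dist in (36, 72, 120):  # viewing distances in inches (example values)
    lp = line_pair_arcmin(65, 7680, dist)
    fov = horizontal_fov_deg(65, dist)
    print(f"65-inch 8K at {dist} in: {lp:.2f} arcmin per line pair, {fov:.0f} deg of view")

# By the time a line pair drops toward the 0.39-0.59 arcmin acuity range,
# the screen only covers a small slice of your field of view.
```

That trade-off is exactly what the two cases below describe.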

So with 8k:

Have a screen size and distance combination that fills your field of view = no improvement in perceived quality over 4k.

Have a screen size and distance combination where large portions of the screen fall OUTSIDE your field of view = may see an improvement, but you'll be losing lots of screen real-estate, so what is the point?

I'm sure 8k will hit at some point, and people will convince themselves that they just HAVE to have it even though there is no perceptible benefit, because that's how these things seem to go. But in real terms, it's a waste of money, electricity, you name it.

I see two potential benefits to 8k:

1.) Large screens where people only use a small portion of the screen at a time, like in a conference room where you may be working on a small section of it, not looking at the rest.

2.) As a form of anti-aliasing. We know DSR looks good; there is no better anti-aliasing. It's pretty **** computationally expensive though, and there are probably better ways to do it (rough sketch of the idea below).
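For anyone who hasn't played with it, the idea behind that kind of supersampling is just "render more pixels than the display has, then filter them down." Here's a toy sketch using a plain box filter; DSR itself uses a smarter (Gaussian-like) downscale, so this is only the general idea:

```python
import numpy as np

# Toy supersampling-as-anti-aliasing: render at 2x the output resolution,
# then average each 2x2 block of samples down to one output pixel.

def downsample_2x(image):
    """Box-filter a (2H, 2W, C) image down to (H, W, C)."""
    h, w = image.shape[0] // 2, image.shape[1] // 2
    return image[:2 * h, :2 * w].reshape(h, 2, w, 2, -1).mean(axis=(1, 3))

# Stand-in for a frame rendered at double resolution (random noise here;
# in the real DSR case this would be the full 4x-the-pixels render).
hi_res = np.random.rand(8, 8, 3).astype(np.float32)
lo_res = downsample_2x(hi_res)  # edges get smoothed by the averaging
print(hi_res.shape, "->", lo_res.shape)  # (8, 8, 3) -> (4, 4, 3)
```

The expensive part is the first step: you shade four times the pixels only to average most of that information away.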
 
8K here we come! Said no one ever.
It's actually the first resolution jump in over a decade that I have no interest in. 1080p - sure. 1440p - I was all about it. 4K - I only just managed to get all my equipment synced up for it and have no desire for anything else right now. In ten years, maybe, but by then I might be retired and no longer care about upgrades at all.
 
I think I'm with you here. I liked the jump from 1080 to 4k, although for gaming purposes it was meh. I don't see any reason to jump to 8k really though.
 