DOOM: The Dark Ages Early Benchmarks Indicate a Great-Looking Game That Performs Well at 1080p/1440p but Needs a Powerful GPU for 4K

Peter_Brosdahl

DOOM: The Dark Ages arrives next week, and it's looking like most folks will be able to experience high frame rates using its Ultra Nightmare preset when gaming at 1080p or 1440p.

See full article...
 
We are going backwards. 1080p was the peasant setting even 10+ years ago.

The current arrangement is too comfortable for AAA developers and video card manufacturers. Devs can use computationally expensive features that barely improve graphics, if not outright make them worse, while NVIDIA sells its "next-gen" hardware with no real innovation or leap, so you can play a game that looks no better than games from 5 years ago.
 
I know, but sadly, that is the state of things. On the plus side, at least textures and effects are making strides, so modern 1080p doesn't look as awful as it did back then. Just for kicks, I've been doing a lot of testing with newer games at 1080p, with every IQ setting maxed (and then some, if good mods are available to push further), and I admit I'm impressed by how good it can look natively. For a while now, I've wondered why so many people still game at that resolution beyond chasing stupidly high FPS for competitive stuff, but it still has some merits. However, I ultimately still aim for native 4K when possible, especially if I can use DLAA.
 
DOOM: The Dark Ages arrives next week, and it's looking like most folks will be able to experience high frame rates using its Ultra Nightmare preset when gaming at 1080p or 1440p.

I hadn't seen the reviews you posted yet, but I saw this one:
At 4:56 to 7:02 he shows side-by-side comparisons of all the graphics presets, and, as he points out, there's almost no f*cking difference. He says it's not the YT compression, either; the game really doesn't have many differences between its settings at all. Is that a bug? Is that how the game was designed? I wonder what's up.

4080 performing better than 5080 at 1080p, and exactly the same at 1440p and 4K? Sounds like a GPU driver issue or game needs to be patched.

Radeon 9070 XT beating the RTX 5080 and 4080 Super, and matching the Radeon 7900 XTX at both 1080p and 1440p. At 4K, the 7900 XTX, RTX 5080, 4080 Super, and 4080 were all performing about the same, with the Radeons slightly ahead. None of them were above 60fps at 4K (this is without upscaling). The only two cards that could do above 60fps at 4K were the 4090 and 5090.

With the 9070 XT using FSR and the RTX 5080 using DLSS, the Radeon still comes out ahead.
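
For context on what those upscaled results actually render, here's the rough arithmetic on internal resolutions. The per-axis scale factors are the commonly published ones for DLSS/FSR quality modes; treat them as approximate and as my own assumption as far as this specific game goes:

```python
# Approximate internal render resolutions for common upscaler modes at 4K.
# Per-axis scale factors are the commonly cited DLSS/FSR values
# (Quality ~0.667, Balanced ~0.58, Performance 0.50) - assumptions here,
# not numbers pulled from the game itself.

NATIVE = (3840, 2160)
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

native_pixels = NATIVE[0] * NATIVE[1]
print(f"Native 4K: {NATIVE[0]}x{NATIVE[1]} ({native_pixels / 1e6:.1f} MP)")

for mode, scale in MODES.items():
    w, h = int(NATIVE[0] * scale), int(NATIVE[1] * scale)
    share = (w * h) / native_pixels
    print(f"{mode:<12}: renders {w}x{h} (~{share:.0%} of the native pixel load)")
```

So 4K Balanced shades roughly a third of the pixels of native 4K, which is why cards stuck below 60fps natively can clear it comfortably once upscaling is on.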

At 14:25 he looks at the 5060 Ti at 4K (DLSS Balanced) and 1440p (DLSS Quality), 8GB vs 16GB. Amazing how the same exact card gets crippled so terribly in the 8GB variant. Lowering the texture pool size setting from the default 2GB to just 1.5GB helps out 8GB video cards a lot. Also DLSS Frame Generation didn't really work with the texture pool size set to the default 2GB. It was kinda buggy on the 8GB 5060 Ti, but worked fine on the 16GB one.
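
Rough back-of-the-envelope math on why that texture pool tweak matters so much on an 8GB card. Every per-category size below is my own illustrative assumption, not a number from id Software; the point is how little headroom the default 2GB pool plus frame generation leaves:

```python
# Illustrative VRAM budget for an 8 GB card at 1440p. All sizes are
# assumptions for the sake of the arithmetic, not id Software's budgets.

TOTAL_VRAM_GB = 8.0

def headroom_gb(texture_pool_gb, frame_generation=False):
    budget = {
        "os_and_background_apps": 1.0,   # desktop, browser, overlays (assumed)
        "render_targets_1440p": 2.2,     # G-buffer, shadows, post FX (assumed)
        "geometry_and_rt_bvh": 1.8,      # meshes plus required RT structures (assumed)
        "upscaler_and_misc": 0.5,        # DLSS/FSR buffers, shaders, staging (assumed)
        "texture_pool": texture_pool_gb, # the in-game setting discussed above
    }
    if frame_generation:
        budget["frame_generation"] = 0.7  # extra buffers for generated frames (assumed)
    return TOTAL_VRAM_GB - sum(budget.values())

for pool in (2.0, 1.5):
    for fg in (False, True):
        print(f"texture pool {pool} GB, frame gen {'on' if fg else 'off'}: "
              f"headroom ≈ {headroom_gb(pool, fg):+.1f} GB")
```

With these assumed numbers, the default pool plus frame generation slightly overshoots 8GB, which would force spilling into system RAM and produce exactly the kind of glitchy behavior described, while the 1.5GB pool keeps everything resident.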

He says that because so many gamers still have 8GB cards, these cards are holding back PC gaming. He said developers have to choose between supporting the VRAM configuration that most people have, or leaving those cards behind to push visuals and features to modern standards (which then f*cks over owners of 8GB cards). Sounds like id Software tried to accommodate them as best they could.

I think it's also important to point out how according to the Steam Hardware Survey, most gamers on Steam do NOT have a graphics card with RT hardware, so I wonder if it's really right for devs to require it. First Indiana Jones and the Great Circle required RT, and now this game. To be fair, you shouldn't have a PC that is weaker than the current-gen consoles if you are a gamer. But that's not a reality for most people.

In other news, a friend randomly gifted me the game, so I guess I'll get to try it out for myself.
 
I think it's also important to point out how according to the Steam Hardware Survey, most gamers on Steam do NOT have a graphics card with RT hardware, so I wonder if it's really right for devs to require it. First Indiana Jones and the Great Circle required RT, and now this game. To be fair, you shouldn't have a PC that is weaker than the current-gen consoles if you are a gamer. But that's not a reality for most people.
RT cards have been out since 2018.

Most people with non-RT GPUs are running esports or indie games, since current AAA games run like crap on decade-old hardware unless you're fine with 1080p low (probably with FSR) to hit 60fps.

Next-gen console ports will pretty much all require RT-capable GPUs, and those consoles are coming in a couple of years (not counting 2D/indie games).

I would be surprised if GTA6 doesn't require an RT capable GPU.

I don't think people remember the 2000s when you almost had to constantly update your GPU to play the latest and greatest or be fine with 20-30fps. Current GPUs are so much better than that.
 
4080 performing better than 5080 at 1080p, and exactly the same at 1440p and 4K? Sounds like a GPU driver issue or game needs to be patched.
That's expected when no fakery is in place. In raw performance the 5xxx series is a sidestep at best.
 
I guess I'll find out in a few days how well the 5090 does at 4K in this game. Glad to see the positive reviews, and hopefully it's as good as the reviewers say it is.
 
I don't think people remember the 2000s when you almost had to constantly update your GPU to play the latest and greatest or be fine with 20-30fps. Current GPUs are so much better than that.

Yes, but they're better because the hardware limited the software more. The majority of gamers have been stuck with 8GB for 8-10 years now. If I'm stuck spending $400 on a top-of-the-line 1080p or entry-level 1440p GPU, I want it to actually be capable and last a while. That isn't really the case now with the 5060 Ti; it isn't much of an RT card at 1080p, let alone 1440p. Hardware companies going back to near-mandatory 2-year upgrade cycles is not going to sit well in this environment, not after my 8GB card has been quite good for my needs for the past 5 years. That will just limit the games and graphics unless these companies want to sell fewer copies.

The great irony to me is that the hardware companies and the game software companies are doing a great job of killing the gaming industry themselves, heading toward another market crash driven by idiotic hardware and bad value for money in games.
 
I don't think people remember the 2000s when you almost had to constantly update your GPU to play the latest and greatest or be fine with 20-30fps. Current GPUs are so much better than that.
Yes, but in the 2000s, you could upgrade your PC every 8-10 months just by saving up your lunch money. Now it would cost me 3 to 4 months* salary just to buy a 5090. Things are very much not the same. People are getting priced out of the industry; there will be no PC enthusiasts soon if this keeps up.

*Just to drive home how dire the situation is: I don't mean 3-4 months' worth of saving, but literally all the money I get from my job, which pays above average for my region.
 
People are getting priced out of the industry; there will be no PC enthusiasts soon if this keeps up.
I agree. I have spent too much on my setups lately, and though I keep saying each build will be my last, I am sticking to my guns now. I wanted to build the fastest PC I could so that it lasts me until I officially hang up the whole gaming hobby altogether. Retirement is getting closer, too.
 