DLSS 3 Accelerates Performance In S.T.A.L.K.E.R. 2: Heart of Chornobyl by 2.4X on GeForce RTX 40 Series GPUs at 4K with All Settings Maxed

Which begs the question, what the hell is resulting in this title being so insanely CPU heavy?

Playing at 4K, I'm used to always being GPU limited, never CPU limited. This has given my CPUs long lifespans.

I had read that UE5 was better at multithreading, and maybe that is why we don't see the traditional pattern of one or two cores close to maxed out with the rest at low loads. Additionally, DX12 spreads the draw calls across CPU cores, so that might further push things in this direction.

But even so, the lowest loaded cores are at like 35%. That can't be all DX12 draw calls, though the broad distribution across cores really does point in that direction....

Maybe they designed the game to use way too many simultaneous draw calls?

I can't remember the source now, but I remember reading (probably a news story from the HardOCP front page years ago) that many developers who develop for consoles - where they can use nearly unlimited simultaneous draw calls thanks to APIs that are much closer to the hardware - struggle when porting to PC, as draw calls on the PC platform have much greater CPU overhead. The PC needs an abstraction layer to deal with the fact that the GPU is not homogeneous the way it is when developing for a console. This is apparently especially bad with Direct3D, less so with the likes of Vulkan.

While I can't find the original story now, Google landed me on this post on the Tripwire forums (developer of the Red Orchestra series) which discusses the issue.

GSC Game World has always been a PC-first developer though, so I can't imagine that is their problem, but they are new(ish) to Unreal Engine. Maybe they just programmed the title to issue way too many simultaneous draw calls, and are having a terrible time scaling that back after the fact?

Or then again, maybe they are just mining ethereum on everyone's CPU while playing the game 😅

I'd like to believe that CPU efficiency can be patched down the road, but I have read that if draw calls are the issue, this can require restructuring the entire game to reduce the need for simultaneous draw calls.

Maybe porting it to use Vulkan instead of DX12 would help, as I understand this is much less of an issue with Vulkan than with DX12.

But what do I know... I'm just grasping at straws here based on articles I have read over the years. I am way outside of my area of expertise.


But forum posts like this seem to reinforce that keeping draw calls under control can be difficult for a team that is new to Unreal Engine... As Kyle used to say, maybe I am on to something, or maybe I am on something, but I suspect inflated draw calls due to the developer not being familiar with Unreal Engine are part of the issue here.
 
I know that ever since the first Horizon game I've been seeing more games finally making use of extra cores/threads and it's the main reason I'm leaning towards a 9900/9950 X3D. Seems too soon to need more than 8/16 but this is happening more often and not just with sim-type or strategy games where I'd expect it to. Clock speeds help but only so far.
 
Sorry, maybe I am being slow today, but I can't place "DF", would you mind de-abbreviating for me?
Ah, my apologies. Digital Foundry.

The game is so close to being playable like this, that I am tempted to just do it, in spite of this not being the ideal experience.
I think I saw that they are getting their first patch out next week. No idea what fixes it contains. But you know the deal, the more you wait, the better the game turns out, and the more stable and polished it is. Cuz games don't ship finished. But I certainly wouldn't argue against immediately jumping into a game you've waited many years for. Me, I'm extremely used to not playing games at launch, even games I know I wanna play. I take my sweet time to get around to every game, and by then there's been updates and fixes, discount prices from sales, and sometimes I'm on better hardware too.

This has been one of my very top game series, and I have been waiting for this one for a long time. The call of the zone is strong. My experience will be WAY better with a motherboard/CPU upgrade though, so I am probably going to wait.
I've done that several times for games. For a game you've been anticipating, you want your first-time experience through the game to be the best (as often-times the first experience is the most memorable one), so you don't wanna do it on your current hardware if it's holding the game back somehow. Like when CP2077 came out, disregarding all the issues the game had in its early life, just on performance and tech alone I was not about to run that game on the Haswell-E and 1080 Ti PC I was on at the time. I mean I did try it, but of course it wasn't an ideal experience, far from it. I still haven't gotten around to it yet, and look at all the sh1t they've added to the game, and ways they improved the game, since launch.

Which begs the question, what the hell is resulting in this title being so insanely CPU heavy?
Very good question, I'd like to know too. The guy who did the tech review for DF said "I think the CPU-related performance issues are due to streaming data during world traversal", as well as things like NPCs.

He went on to say "Running towards the first town on a high-end Ryzen 7 7800X3D - the second-fastest gaming CPU - there are frame-time spikes in the 20ms and 30ms range - and these were almost twice as bad in the pre-release version of the game. ...And this is for an area of the game without enemies or other NPCs, which add their own CPU burdens.

When you arrive at Zallisya and there are NPCs, CPU performance becomes erratic even on the 7800X3D, with frame-time spikes from 30ms to 50ms. Frame-rates drop momentarily into the 40s, stabilising to around a 60fps average - unbelievably slow for a CPU of this calibre. On the Ryzen 3600, the frame-rate is just over 30fps, with initial frame-time spikes of 50-80ms that smooth out to around 50ms - still quite poor! This isn't a one-off encounter either, as any area with 20 NPCs or more incurs a similar performance penalty. I once hid in a cellar during a storm, and when all of the NPCs in the area did the same, frame-rates on the 7800X3D dropped to 40fps. Suffice it to say, this game is extremely CPU-limited."

That was quoted from the article. In the video, 13:55 to 16:35 is where he talks about and shows examples of the CPU performance issues. It's worth watching part of what comes after that too, about the CPU performance on consoles, just cuz of the hilarity, with the game completely freezing for several seconds at times! He showed a 6-second freeze at one point!

Anyways, I suspect data streaming issues and NPCs aren't the whole picture with the CPU performance in this game.

Or then again, maybe they are just mining ethereum on everyone's CPU while playing the game 😅
HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA
Hey man, after all the sh1t they've been through to get this game out the door, they deserve a little extra!

But forum posts like this seem to reinforce that keeping draw calls under control can be difficult for a team that is new to Unreal Engine... As Kyle used to say, maybe I am on to something, or maybe I am on something, but I suspect inflated draw calls due to the developer not being familiar with Unreal Engine are part of the issue here.
Very interesting. I hadn't heard about this issue with UE5, but then again I've not heard much about UE5 in general, and I'm not even sure I've ever played any games that use it.

I know that ever since the first Horizon game I've been seeing more games finally making use of extra cores/threads
You should see idTech 6 and especially idTech 7's CPU usage.
 
I know that ever since the first Horizon game I've been seeing more games finally making use of extra cores/threads and it's the main reason I'm leaning towards a 9900/9950 X3D. Seems too soon to need more than 8/16 but this is happening more often and not just with sim-type or strategy games where I'd expect it to. Clock speeds help but only so far.

I'm with you, but at the same time, using CPU cores and actually benefiting from their work are two different things.

Nothing in this game stands out to me as being so significantly different from anything that came before it to justify this kind of CPU utilization, which leads me to believe we are dealing with bugs and a lack of optimization...

I want to not just see load spread out over more cores. I also want to see some benefit from it :p

And maybe there is some new form of AI in the NPCs that is resulting in this. The limited gunplay I got into thus far in the game (prologue only - I'm going to avoid spoilers beyond that) was a little better than I remember previous single-player games being, in part because the NPCs seemed to be more dynamic in their use of the environment and cover. So maybe this explains it?

And I know, I know. "Unoptimized" gets thrown around by gamers all the time as a scapegoat for their ****ty hardware whenever their decade-old GPU can't hack it. The term has been misused for so long as to have almost lost its meaning. In this case, though, I suspect a lack of draw call optimization may actually be the real cause of the performance issues.


He went on to say "Running towards the first town on a high-end Ryzen 7 7800X3D - the second-fastest gaming CPU - there are frame-time spikes in the 20ms and 30ms range - and these were almost twice as bad in the pre-release version of the game. ...And this is for an area of the game without enemies or other NPCs, which add their own CPU burdens.
You should see idTech 6 and especially idTech 7's CPU usage.

That's interesting.

I wasn't monitoring frame times when playing (just the FPS chart), but I didn't qualitatively notice serious frame time spikes when getting there (that's where I stopped playing after my testing).

Maybe that is because my framerate was already so bad that it didn't stand out.

Or maybe something else explains this.

I had the game installed on my new (to me) Optane DC p5800x

I wonder if it is reading from disk live during these transitions, causing the CPU to enter a wait state while data is pulled into RAM, and whether the amazing low-queue-depth random performance of the P5800X helps mitigate that.
 
Just a heads up for folks thinking of getting this, it is available on GOG.

Well, ****

I bought it on Steam out of habit.


Maybe I'll just return it and buy it on GoG...

So I did it.

Citing the issues I have had thus far and the underwhelming performance (which are real, and not just made up to get a Steam refund), I requested a refund on Steam.

I'll probably re-buy it on GoG when I'm ready to play it.

Best of both worlds I figure.

I'll show up in their revenue stats as a refund request due to game issues, adding my little incentive to the pile for them to continue patching the title, while at the same time they will be getting my money from GoG.

I actually had to install GoG Galaxy. Turns out it has been so long since I used GoG that I didn't even have their client installed, and didn't even realize it. I really need to get into the habit of trying to buy on GoG. They really are the "good guys" of digital game distribution. I hope they survive.

I wonder when the last time I did a clean Windows install was. It has probably been a while.
 
I actually had to install GoG Galaxy.
That thing has been updating a lot over the last year or so. I swear about once a month at least. Still love GOG though.

I wonder when the last time I did a clean Windows install was. It has probably been a while.
Me too. I just get tired of reinstalling launchers, drivers, etc. Not to mention revalidating games and so on.

Edit: On the flipside I did update the BIOS and Chipset drivers on both desktops this weekend. Can't say I gained FPS but both seem a bit more responsive.
 
Nothing in this game stands out to me as being so significantly different from anything that came before it to justify this kind of CPU utilization, which leads me to believe we are dealing with bugs and a lack of optimization...
This is my thought process with this game as well. Nothing is close to finished/optimized in PC gaming when it is released anymore. Any halfway decent game I have played that is close to optimized has played very well on my 7800X3D and now the 9800X3D. The 9800X3D is much more responsive in daily tasks than the 7800X3D was, and that was really my sole reason for upgrading.
 
I'm with you, but at the same time, using CPU cores and actually benefiting from their work are two different things.
...
I want to not just see load spread out over more cores. I also want to see some benefit from it :p
I'm with you 100% there. The problem is, this will almost never happen in the modern game industry, so long as consoles are still the lead development platform for the overwhelming majority of studios. And consoles have been on 8 cores for two generations now (and technically they only have access to 7 cores since 1 core is reserved for OS stuff and background tasks and sh1t). Consoles are the lowest common denominator, and they often hold back game development. There are only a scant few devs left who code for PC first and then port down to consoles. And even then, they know they're gonna put their games on consoles eventually, so I'm sure that factors into design decisions.
 
I'm with you 100% there. The problem is, this will almost never happen in the modern game industry, so long as consoles are still the lead development platform for the overwhelming majority of studios. And consoles have been on 8 cores for two generations now (and technically they only have access to 7 cores since 1 core is reserved for OS stuff and background tasks and sh1t). Consoles are the lowest common denominator, and they often hold back game development. There are only a scant few devs left who code for PC first and then port down to consoles. And even then, they know they're gonna put their games on consoles eventually, so I'm sure that factors into design decisions.

I think there is one slight benefit to using more cores on the PC than on consoles that is relatively easy to achieve (because DX12 does it for you), and that is the aforementioned "draw call" issue.

Consoles have a much "closer to hardware" API, as a given model of console has the same hardware in every unit, so they don't need a big lumbering API like Direct 3D.

This results in consoles easily being able to use very large numbers of draw calls without much of a drawback, as there is so little overhead for each draw call.

On the PC, however, each one of those draw calls has to go through the cumbersome Direct3D API, which - together with your GPU driver - translates the call into something the GPU understands. Each individual draw call doesn't use a ton of CPU power, but it definitely adds up.

Hardware T&L, as implemented back in the GeForce 256 days with DirectX 7 (if memory serves), was supposed to solve the issue of CPU load from draw calls, but it never completely eliminated that load, and when you use lots of draw calls in modern complex games, it still adds up.

Enter DirectX 12. It lets the engine spread draw call recording over all of your CPU cores (and virtual cores with SMT). This can have some issues (cache thrashing, colliding with the main game threads, inadvertently using "E" cores and slowing things down, inefficiently using SMT cores, etc.), but in general, unless you are running DX12 on a system with a small number of very performant cores, DX12 performs better than DX11, where the draw calls were not as threaded.
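To make the draw call threading a bit more concrete, here's a rough sketch of how a D3D12 renderer typically records draw calls on several worker threads and then submits everything in one go. To be clear, this is just an illustrative toy example based on my understanding of the API, not code from this game or from Unreal Engine; names like RecordChunk are made up, and a real engine would reuse its command allocators rather than recreate them every frame.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// One allocator + command list per worker thread. Each worker records a slice of
// the scene's draw calls in parallel; no locking is needed while recording.
struct WorkerCmd {
    ComPtr<ID3D12CommandAllocator>    alloc;
    ComPtr<ID3D12GraphicsCommandList> list;
};

void RecordChunk(ID3D12Device* device, WorkerCmd& out)
{
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&out.alloc));
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              out.alloc.Get(), nullptr, IID_PPV_ARGS(&out.list));
    // ... set pipeline state, bind resources, and issue this chunk's draw calls here ...
    out.list->Close();
}

void RecordAndSubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue, unsigned workers)
{
    std::vector<WorkerCmd>   cmds(workers);
    std::vector<std::thread> threads;
    for (unsigned i = 0; i < workers; ++i)
        threads.emplace_back(RecordChunk, device, std::ref(cmds[i]));
    for (auto& t : threads)
        t.join();

    // Single submission of everything that was recorded in parallel.
    std::vector<ID3D12CommandList*> raw;
    for (auto& c : cmds)
        raw.push_back(c.list.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    // (A real renderer would keep the allocators alive until the GPU finishes,
    // then reset and reuse them the next frame.)
}
```

The point being: under DX11 most of that recording funnels through one thread, while under DX12 the engine can fan it out, which is part of why you see load spread across so many cores even when nothing obviously demands it.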

I think that on systems with very large numbers of cores, DX12 could probably do a better job of running those draw calls on cores that are near idle, so they don't collide with the main game threads and thrash those threads' caches, but other than that it is a good implementation.

If the scheduler and game were more intelligent, they could identify the handful of best-quality cores that can "boost" the most, send the CPU-limiting main game threads to those cores, dedicate those cores to only those threads, and allow DX12 to send draw calls to the remaining underutilized cores.
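For what it's worth, the basic mechanism for that kind of placement already exists in the Windows API. Here's a minimal, purely hypothetical sketch of pinning a hot game thread to a chosen core; the core index is made up, and a real implementation would first query the CPU topology to figure out which cores are the "good" ones (and whether it should avoid E-cores or the CCD without the extra cache).

```cpp
#include <windows.h>

// Pin a thread to a single core and mark that core as its preferred home.
// This is the crude, old-school approach; CPU sets (SetThreadSelectedCpuSets)
// are the more modern way to express the same intent.
void PinThreadToCore(HANDLE thread, DWORD core)
{
    SetThreadAffinityMask(thread, static_cast<DWORD_PTR>(1) << core);
    SetThreadIdealProcessor(thread, core);
}

int main()
{
    // Hypothetical example: keep the main game thread on core 2 and leave the
    // remaining cores free for the draw call / worker threads.
    PinThreadToCore(GetCurrentThread(), 2);
    // ... game loop would run here ...
    return 0;
}
```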

Heck, for all I know they already try this between the MS Game Bar and other scheduler optimizations, but my confidence in the Microsoft scheduler team hasn't exactly been boosted by their recent dumbass failures (like the Zen cache "bug") so I don't know for sure.

The Linux scheduler is brilliant at this stuff, to the point where, in some limited cases, you actually get better Windows CPU performance running Windows under KVM on Linux, as the combination of KVM and the Linux kernel scheduler is better able to place the right threads on the right cores than Windows itself is.

This could give you a good benefit from having additional cores beyond the 8 on consoles, with relatively little work on the side of the game developer.

That, and it can help lower the cost of porting a game that was developed with little concern for the PC to the PC platform.

I know many titles are co-developed upfront, but if you don't do this and develop a game for consoles without any consideration for PC, it is easy to just not think about the number of draw calls you are creating, making it really difficult to port the game so that it runs well on the PC. You have to go back through every scene and re-optimize the draw calls issued to draw every object. That can be a huge amount of work that requires expensive, technically qualified people.

Even with DX12 spreading out the draw calls to all of your cores, the porting process from the console is likely to involve the need to do some draw call reduction (if this was not considered in initial development) but DX12 and many cores can help reduce that burden.

Now I know the whole debate about "lazy developers", etc. etc., but the truth is this: while the PC platform has the raw power to absolutely demolish console hardware, the higher-level APIs mean you have to write the title with those APIs in mind, or it will run poorly. Writing a game without any consideration for draw call count is certainly much easier than painstakingly optimizing every scene and object to reduce and combine draw calls, though. Developing games is already a rather expensive endeavor, with many titles being canceled because they are unlikely to sell enough to pay for their development.
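As a small aside, here's a toy illustration of what "reducing and combining draw calls" usually means in practice: drawing many copies of the same prop with a single instanced call instead of one call per object. Again, this is a hypothetical D3D12 sketch of the general technique (the function names are mine), not anything from this game; the same idea applies in Vulkan, D3D11, and the console APIs.

```cpp
#include <d3d12.h>

// Naive: one draw call per rock. With thousands of small props, the per-call
// API/driver overhead on the PC adds up fast.
void DrawRocksNaive(ID3D12GraphicsCommandList* cl, UINT rockCount, UINT indexCount)
{
    for (UINT i = 0; i < rockCount; ++i)
    {
        // ... bind this rock's transform/material constants here ...
        cl->DrawIndexedInstanced(indexCount, 1, 0, 0, 0);
    }
}

// Combined: one instanced draw call covers every rock. Per-instance transforms live
// in a buffer that the vertex shader indexes via SV_InstanceID.
void DrawRocksInstanced(ID3D12GraphicsCommandList* cl, UINT rockCount, UINT indexCount)
{
    cl->DrawIndexedInstanced(indexCount, rockCount, 0, 0, 0);
}
```

Retrofitting that kind of batching onto a finished game is exactly the "go back through every scene" work described above, which is why it's so expensive to do after the fact.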

If it helps reduce the necessary workload, thus making games cheaper to produce and/or port, that means we get more titles, and the more the merrier.

I am still disappointed that the planned Deus Ex titles following Mankind Divided were canceled. The cost pressures are real, and can lead us to miss out.
 
By the way, is it normal for this game to compile shaders every time you load it?

Or maybe that was just because I upgraded my drivers in between?
 
By the way, is it normal for this game to compile shaders every time you load it?

Or maybe that was just because I upgraded my drivers in between?
That should happen on first launch, after driver updates, or when switching APIs (Vulkan to DirectX 12 or 11, or switching DirectX versions).
 
Consoles have a much "closer to hardware" API, as a given model of console has the same hardware in every unit, so they don't need a big lumbering API like Direct 3D.
I'm with you on pretty much all of your post, but some info:

Xbox One (the 3rd Xbox, which was for the 8th console generation) and Xbox Series X|S (the 9th-gen Xboxes) run on Windows and use DirectX. Xbox One X (the higher model of the XB1) was closely developed along the DX12 specification, and even had some specialized hardware to accelerate certain D3D12 instructions. XB1 launched with Win8 and updated to Win10 later. XBSX|S are still on Win10. The DirectStorage API used on Win10 and Win11 was introduced with these consoles.

Original Xbox was the "DirectXbox", but it didn't have the full DX8 API and had a lot of custom stuff in there too, if memory serves. Xbox 360 I think was very custom, but I recall it had components that were later added to DirectX. This is where some parts of DirectX like XACT, XAudio, and XInput come from.

Direct3D 12 and Vulkan were both inspired by AMD's Mantle, their low-level rendering API for GCN GPUs (the same GPU architecture used in the PS4 and XB1), and Mantle was donated to Khronos and used to form the initial core of Vulkan.

In general you are correct though, and in the past, before consoles started moving to high-level languages like C and C++ around 5th-gen (PS1/Saturn/N64), all console programming was done via assembly. Some companies like Team Ninja were still coding in assembly as late as the 6th-gen systems (original Xbox, which ran on Win2K kernel). Can't get any lower-level than that (well unless you are superhuman and write instructions directly in binary or machine code). But yeah even in modern times consoles enjoy lower overhead. And while there now exist multiple versions of the same console (ie base PS4 and PS4 Pro), it's usually two at most, so still not a whole lotta different hardware variations developers have to take into account (although devs have been wrestling with the limitations of the XBSS). Not to mention how PS4 and XB1, and PS5 and XBSX, are so dang similar.

Sony still maintains their own rendering API because they don't want to use Microsoft's sh1t.

Heck, for all I know they already try this between the MS Game Bar and other scheduler optimizations, but my confidence in the Microsoft scheduler team hasn't exactly been boosted by their recent dumbass failures (like the Zen cache "bug") so I don't know for sure.
I've heard both AMD and Intel express great dissatisfaction with Microsoft's thread scheduler.

The Linux scheduler is brilliant at this stuff, to the point where, in some limited cases, you actually get better Windows CPU performance running Windows under KVM on Linux, as the combination of KVM and the Linux kernel scheduler is better able to place the right threads on the right cores than Windows itself is.
I've experienced this myself, and yeah it's pretty awesome.

By the way, is it normal for this game to compile shaders every time you load it?
That should happen on first launch, after driver updates, or when switching APIs (Vulkan to DirectX 12 or 11, or switching DirectX versions).
And also when the game has been patched. But yeah, it's not supposed to be happening every dang time you launch.

Or maybe that was just because I upgraded my drivers in between?
Yeah if you updated your drivers and then ran the game, it's gonna compile the shaders again.

I am still disappointed that the planned Deus Ex titles following Mankind Divided were canceled.
Same.

That, and it can help lower the cost of porting a game that was developed with little concern for the PC to the PC platform.
All of this started with the 8th-gen consoles, since they moved to x86-64/AMD64. Now with consoles having PC hardware (and some using PC OSes and PC APIs), even if a dev didn't care about porting their game to PC, technically the game is a "PC game" anyways, and that definitely helped with the crappy ports to PC sucking just a little bit less. And yeah, overall lower cost and easier porting process. The AMD Jaguar SoC used in 8th-gen consoles was some sh1t, but I'm glad the 9th-gen consoles used more serious hardware with Zen 2 and RDNA2.

If it helps reduce the necessary workload, thus making games cheaper to produce and/or port, that means we get more titles, and the more the merrier.
Which is good for me, cuz while I've had consoles since 2nd-gen, PC is ALWAYS my preferred place to play games, and I'm hoping that starting with 9th-gen I won't end up with any more consoles (although to be fair, every single 8th-gen console I have was a gift, I did not pay for them myself). I don't believe in exclusives for any platform. Well that's not true, I believe in PC-exclusives. I remember a time when consoles didn't have the hardware or the input devices for properly playing a lot of PC games, so they just had to do without.
 
So I did it.

Citing the issues I have had thus far and the underwhelming performance (which are real, and not just made up to get a Steam refund), I requested a refund on Steam.

I'll probably re-buy it on GoG when I'm ready to play it.

Best of both worlds I figure.

I'll show up in their revenue stats as a refund request due to game issues, adding my little incentive to the pile for them to continue patching the title, while at the same time they will be getting my money from GoG.

I actually had to install GoG Galaxy. Turns out it has been so long since I used GoG that I didn't even have their client installed, and didn't even realize it. I really need to get into the habit of trying to buy on GoG. They really are the "good guys" of digital game distribution. I hope they survive.

I wonder when the last time I did a clean Windows install was. It has probably been a while.

Heh,

I tried buying Stalker again on GoG, but that is just refusing to work.

I just get a spinning "please wait" notification after I authenticate with paypal.

I suspect something is being blocked by my pihole.

At first I thought it was paypal domains as I spotted the following in my block history:

b.stats.paypal.com
t.paypal.com

But this doesn't really make sense, because paypal works everywhere else. This only happens on GoG.

So I tried whitelisting the paypal domains, and still nothing. Then I blocked them again, as that means they probably are responsible for some sort of tracking and/or ads and not just a false positive.

I also noticed this domain showing up in my blocks every time I tried to make a purchase:

client-analytics.braintreegateway.com

Now I suspect this is the culprit.

I really wanted to support GoG, but I value not being tracked more, so I think I am just going to give up, and buy it again on Steam when I am ready to play it.
 
Bummer, I don't remember using PayPal with GoG so I haven't encountered this.
I had to drop PayPal entirely when, after not having used it for a few years, a random Chinese purchase showed up for a few hundred dollars - no idea what it was, it was all in Chinese characters. I disputed the charge with PayPal... and they denied me, saying it was "in line with my recent purchase history".

I hadn't purchased anything there in years, so - good thing I didn't have it linked to my bank account, only my credit card. So I disputed it via my credit card. Funny thing: after you dispute PayPal charges with your credit card and succeed, they cancel your account.
 