Alan Wake 2 Was Designed As a “30 FPS Experience,” Remedy Reveals

Tsing

Alan Wake 2 may be able to run at higher frame rates for those with the proper hardware, but the sequel is meant to be experienced at a lower FPS. That's according to Thomas Puha, Communications Director at Remedy Entertainment, who X'd this morning that Alan Wake II will feature a Performance Mode on PlayStation 5 and Xbox Series X despite the game having been "built from the beginning as a 30 FPS experience focusing on visuals and ambiance." The PC version of Alan Wake 2 will be available exclusively from the Epic Games Store when it launches on October 27, 2023, possibly with a Steam release coming a year or so later.

See full article...
 
Alan Wake 2 was designed for people who don't care about games.

There should be a standardized approach to system spec generation.

"Minimum Specs" should be the minimum hardware required to achieve 60fps 0.1% minimum framerates at low settings at 1920x1080, because that is the minimum playable.

"Medium specs" should be the minimum hardware required to achieve 60fps 0.1% minimum framerates at medium settings at 2560x1440.

"High specs" should be the minimum hardware required to achieve 60fps 0.1% minimum framerates at high settings at 2560x1440.

"Ultra specs" should be the minimum hardware required to achieve 120fps 0.1% minimum framerates at ultra settings at 2560x1440, or 60fps 0.1% minimum framerates at ultra settings at 3840x2160

Or something like that. (this was an off the top of my head definition, I am open to some revisions)

A game should never be released if it doesn't target and achieve 60fps 0.1% minimum framerates on at least some combination of commonly available hardware and settings.
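
To make that concrete, here's a rough sketch of the proposed tiers written down as data. It's purely illustrative: the class and field names are invented, and only the framerate/resolution/settings values come from the definitions above.

```python
# Hypothetical encoding of the spec tiers proposed above. The structure and
# names are made up for illustration; the values are taken from the post.
from dataclasses import dataclass

@dataclass
class SpecTier:
    name: str            # tier label on the store page
    settings: str        # in-game quality preset
    resolution: str      # output resolution
    min_fps_0_1pct: int  # required 0.1% minimum framerate

PROPOSED_TIERS = [
    SpecTier("Minimum", "low",    "1920x1080", 60),
    SpecTier("Medium",  "medium", "2560x1440", 60),
    SpecTier("High",    "high",   "2560x1440", 60),
    SpecTier("Ultra",   "ultra",  "2560x1440", 120),  # or 60fps at 3840x2160
]
```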
 
Then I guess I don't care about games.

I don't think there is a single game in history that can guarantee 60fps for 0.1% lows. Not even 1% lows. Those specs are ludicrous. 1% and especially 0.1% lows are for maniacs to gawk over; they really shouldn't factor into recommended specs, as they are so easily affected by measurement error that there is no way to guarantee them.
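
For context, here is roughly how 1% and 0.1% low figures tend to be computed from a frame-time capture. The method (averaging the slowest slice of frames) and the sample numbers are assumptions for illustration, since tools differ, but it shows why a 0.1% low hinges on a handful of frames:

```python
# Rough sketch of deriving "1% low" / "0.1% low" FPS from a frame-time log.
# Conventions vary between capture tools; this version averages the slowest
# N% of frames and converts back to FPS. The capture data is made up.

def low_fps(frame_times_ms, percent):
    """Average FPS over the slowest `percent`% of captured frames."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * percent / 100))
    return 1000.0 / (sum(worst[:n]) / n)

# ~60 seconds at ~60fps is only ~3600 frames, so the 0.1% low is decided
# by the 3-4 slowest frames; one hitch (shader compile, background task)
# swings the number, which is why it is nearly impossible to guarantee.
frame_times = [16.7] * 3596 + [45.0, 50.0, 60.0, 80.0]
print(f"average:  {1000 * len(frame_times) / sum(frame_times):.1f} fps")
print(f"1% low:   {low_fps(frame_times, 1.0):.1f} fps")
print(f"0.1% low: {low_fps(frame_times, 0.1):.1f} fps")
```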

I think "minimum specs" should be where the game can run without glitches and problems regardless of performance.

"Minimum Recommended" should be where the game runs at least 30fps average on low settings in 1920x1080.

"Recommended" should be where the game maintains 60fps average on medium-high settings in 2560x1440.

"Enthusiast / 4K specs" should be 60fps average on Ultra settings in 4K

But of course DLSS and FSR throw a giant wrench into the equation.
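
On the upscaling point: DLSS and FSR render internally at a lower resolution and reconstruct to the output, so "runs at 1440p" stops pinning down the actual GPU load. A small sketch, using the commonly cited per-axis scale factors for the DLSS 2 / FSR 2 quality modes (individual games can and do override these):

```python
# Why upscalers muddy resolution-based spec tiers: the internal render
# resolution is a fraction of the output. Scale factors below are the
# commonly cited per-axis defaults; treat them as approximations.

SCALE = {
    "Native":            1.00,
    "Quality":           0.667,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 0.33,
}

def internal_resolution(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_resolution(2560, 1440, mode)
    print(f"{mode:17s} -> renders at about {w}x{h}")
# "Quality" at 2560x1440 renders at roughly 1708x960, so a spec tier
# defined only by output resolution no longer describes the real workload.
```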
 
All I know is that even though I'm really into Sci-Fi/horror/X-Files themes I simply got bored with Control, and the same when I retroactively went back to Alan Wake when it became free, so I don't intend to give this one a try. I like the ideas they have, and some of the visuals, but just can't get past the feeling that they are too small and simply rely on a ton of running around in a small environment to tell a larger story via cutscenes instead of more areas. Basically trapped in a box watching videos or listening to audio recordings to simulate a larger experience.
 
I just want to play games and have fun, I don't play with the FPS counter enabled /shrug

I presume this comment was not intended literally, but if you can't tell the difference between 30fps and 60+ FPS without the FPS counter on screen, I'm both a little jealous (because life would be a lot cheaper that way) and a little concerned about your health! :p

If I sat down at my computer today and a game were running at 30fps, I wouldn't last a second before exclaiming "yuck, what is wrong? This is unplayable" and starting my troubleshooting.

Anything under 60fps is instantly unplayable (at least for FPS and 3rd person titles where the camera pans with the mouse), no framerate counter needed. "God mode" strategy titles, etc. are a little more forgiving when it comes to framerate (heck I can play Civ 6 on my old laptop at 10-15fps) but in first person the input lag when moving the mouse around reaches out and punches me in the face at 30fps, and it makes the game completely impossible to enjoy.
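
For what it's worth, the arithmetic behind that mouse feel is just frame time plus however many frames the pipeline queues up. The sketch below assumes 2-3 frames of render/display latency, which is an assumption about a typical pipeline rather than a measurement of any particular game:

```python
# Back-of-the-envelope frame time and input-latency comparison.
# The 2-3 "frames in flight" figure is an assumed render/display queue
# depth, not a measured value for any specific game or driver.

def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120):
    ft = frame_time_ms(fps)
    lo, hi = 2 * ft, 3 * ft  # assumed 2-3 frames of queued latency
    print(f"{fps:3d} fps: {ft:5.1f} ms per frame, ~{lo:.0f}-{hi:.0f} ms mouse-to-photon")

# 30 fps -> 33.3 ms frames and roughly 67-100 ms of queued latency,
# versus 16.7 ms frames and roughly 33-50 ms at 60 fps. That doubling
# is the input lag being described above.
```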

I think the only reason it is tolerable on consoles is because those thumbsticks are so terribly low sensitivity that people don't notice. They have been conditioned to having ****ty controls.

A game could be the perfect game sent by gaming jesus christ himself, but if I can't get a minimum of 60fps (or if I have to use a controller with thumbsticks) I'm just not going to play it under any circumstance. 60fps is simply just the bare minimum to make a title worth playing at all.
 
I grew up playing games at 15FPS and less, because that's all the computers I had were capable of. And considering I beat some of the hardest games that way it can't be called unplayable by definition.

If you'd rather miss out on games than tolerate lower than 60 FPS even momentarily then it is entirely your loss. I don't think I ever played a game with strictly 60+ FPS outside of revisiting retro titles.

It's not that I don't want more FPS, but playing at even <30 FPS is preferable to not playing at all.
 
I grew up playing games at 15FPS and less, because that's all the computers I had were capable of. And considering I beat some of the hardest games that way it can't be called unplayable by definition.

It really depends on the game.

When I played Half Life (at whatever low framerate my Voodoo1 was capable of) it was a difficult game. Some of the jumping challenges to get across areas you were trapped in, and some of the puzzles, were a real and serious challenge. I remember saving and repeatedly retrying some areas.

For ****s and giggles I replayed it a few years ago on modern hardware, and with a decent framerate the game was so stupidly easy it wasn't even funny. I played through it start to finish in only a couple of days, whereas back in the late 90's it took me most of a summer as it was a surprisingly difficult challenge.

Where you draw the line between "my hardware makes things unnecessarily difficult" and "unplayable" is really a personal choice though.

It's not that I don't want more FPS, but playing at even <30 FPS is preferable to not playing at all.

I disagree. I would find the experience so horrible and frustrating that I would rather save the game for another time in the future when I have better hardware. In 2023 (and really any time since ~2001 or so) I would sooner not play any game at all* than have to put up with sub-60fps framerates, at least in areas of the game that matter.

(*with the exception of strategy games, or something else where framerate is less critical)

As an example, there are some areas in Starfield that are not key to coordination (no battles, etc.) but really challenge most modern CPUs and drop framerates down to the ~55-60fps range. I won't let that stop me playing it, but mostly because these are areas in which your character just casually strolls around, visits shops, etc. If I had to put up with sub-60fps framerates in action or coordination-specific scenes, I'd just save the game for later (even if it is a few years later) rather than spoil the story for myself while having a subpar experience.

Luckily I can manage 4K Ultra in this game in the 80-110fps range in the areas where there is action.
 
There was a time when lower framerates were acceptable gameplay. In the beginning, we were happy at 24 FPS; heck, you were lucky to get into the 20s, and teens were common. Then the bar moved up to 30 FPS; at [H] we targeted 30 FPS as our "playable" framerate, and people were OK with 30 FPS gameplay. Now it's jumped way up; it's no longer even just 60 FPS, now people want 100+ FPS. The bar has changed, but there was a time games were enjoyable at lower framerates. It has been interesting to see the shift over time.
 
For ****s and giggles I replayed it a few years ago on modern hardware, and with a decent framerate the game was so stupidly easy it wasn't even funny. I played through it start to finish in only a couple of days, whereas back in the late 90's it took me most of a summer as it was a surprisingly difficult challenge.
You don't think your skill level, and overall game design, have changed from then to now? It's an FPS game with puzzles. That's like saying to someone like me that I would suck at FPS games because I've played SO MANY of them years ago.

The truth is, once I take the time to find my stride in a modern FPS game I am good at it. Potentially very good... but good. I just don't enjoy them anymore.

I'd take a Half Life over that mess any day. But yeah, the old Half Life as it was originally made is a lot 'easier' to play. Part of it is hardware, sure... but a large part of it is the base skill level I have with that type of game.
 
It really depends on the game.

When I played Half Life (at whatever low framerate my Voodoo1 was capable of) it was a difficult game. Some of the jumping challenges to get across areas you were trapped in, and some of the puzzles, were a real and serious challenge. I remember saving and repeatedly retrying some areas.

For ****s and giggles I replayed it a few years ago on modern hardware, and with a decent framerate the game was so stupidly easy it wasn't even funny. I played through it start to finish in only a couple of days, whereas back in the late 90's it took me most of a summer as it was a surprisingly difficult challenge.
It's more about kids sucking at videogames. I re-played lots of games that seemed impossibly hard as a kid only to breeze through them later without much, if any, challenge. And many of these games were 8-bit 2D games where performance was a non-issue.
Where you draw the line between "my hardware makes things unnecessarily difficult" and "unplayable" is really a personal choice though.
It might be less fun than intended, but I'd still rather have those experiences than miss out on all the games that ran poorly on my hardware at the time.
I disagree. I would find the experience so horrible and frustrating that I would rather save the game for another time in the future when I have better hardware. In 2023 (and really any time since ~2001 or so) I would sooner not play any game at all* than have to put up with sub-60fps framerates, at least in areas of the game that matter.
The thing is that back in the 90s and early 2000s, I either used what I had or didn't play. Things moved so fast in the early 3D era that you blinked and games became horribly outdated overnight; there's no way I would've played Shadows of the Empire if I didn't do it then and there.

Now I have the luxury of just upgrading my hardware if the performance becomes frustrating. But I think a game averaging 30 FPS is still within the realm of perfectly playable. Going from 25 to 30 FPS improves the experience greatly, while I see going from 30 to 60 FPS as a minor improvement that in most games is totally inconsequential. Case in point: I'm now playing Starfield at roughly 30 FPS, and there wasn't a single moment where I felt performance was hurting my experience.
(*with the exception of strategy games, or something else where framerate is less critical)
That's funny, actually, because I find low performance extremely annoying in top-down and point-and-click strategy games. When you move, the entire screen pans as one, and if it is not smooth it becomes hard to click on the units you wanted to. Plus, I think there is no excuse for a strategy game to run poorly.
 
It's more about kids sucking at videogames. I re-played lots of games that seemed impossibly hard as a kid only to breeze through them later without much, if any, challenge.
I've had at least several situations like that, and it's always surprised me. Games I couldn't hope to beat as a child I can now do without much effort. It's really weird.
 
I totally agree with the points that Brent made about how expectations have changed over the years.

These days I can cope with 50-60 FPS but prefer a solid 60 when playing games in the cave on the Z9D. For the CRG9 and C2, I prefer 90-110. Anything over these #s is overkill for me.
 
I presume this comment was not intended literally, but if you can't tell the difference between 30fps and 60+ FPS without the FPS counter on screen, I'm both a little jealous (because life would be a lot cheaper that way) and a little concerned about your health! :p
I will be honest - most of the time I can't.

If I have a side by side, yeah, I think the higher refresh rate feels better and plays smoother. But honestly, until framerate dips down under 30, I don't usually notice it at all (unless it's a big hitch downward).

I'll happily play at 30fps on my Steam Deck all day long, and it's never bothered me on the consoles either.

Now, back in the old CRT days, 60Hz used to give me a god awful headache if there was also fluorescent lighting in the room, and if I didn't have a monitor that supported 75Hz I'd have to drop it down to something lower. Thank god LCD kinda did away with that.
 
I miss the days when games came out where you basically needed next-gen hardware to play them. I enjoyed Crysis being the meme and then coming back to it after upgrading. I believe I did the same thing with Unreal Tournament as well, but that was long enough ago I don't remember if you needed next-gen hardware to play it, or if I just couldn't afford the top-end hardware to actually play it when it came out :ROFLMAO:

100%, give me a game that barely runs on my 3090ti to give me an excuse to buy a 5090 when they come out, so I can come back to the game and be just WOW'd at how much better it looks on the brand-new hardware.
 
Yea I gotta say I upgraded my 6800xt to a 7900xtx. It doesn't feel night and day... this is similar to what happened when I put a 1070 on a 2600k CPU. When I upgraded the CPU/RAM the leap in framerate was literally double. I'm hoping that the next-gen CPU I inevitably end up moving this video card to really unlocks it.
 
Now, back in the old CRT days, 60Hz used to give me a god awful headache if there was also fluorescent lighting in the room, and if I didn't have a monitor that supported 75Hz I'd have to drop it down to something lower. Thank god LCD kinda did away with that.
60Hz on CRTs gave me severe headaches, and caused hella eye strain and eye fatigue. It really f*cked with my eyes something serious. I had to use a minimum of 75Hz, but I was usually around 85Hz. Yeah, thankfully LCD did away with all that.
 
Now, back in the old CRT days, 60Hz used to give me a god awful headache if there was also fluorescent lighting in the room, and if I didn't have a monitor that supported 75Hz I'd have to drop it down to something lower. Thank god LCD kinda did away with that.
Refresh rate on a CRT is very different from frame rate. The strobing will absolutely cause a headache. I could not stand anything below 85Hz, and 100Hz was my happy place. You also had to keep in mind that some monitors, while they supported high refresh rates, didn't have the bandwidth to actually display them without making the screen blurry.
 
Refresh rate on a CRT is very different from frame rate. The strobing will absolutely cause a headache. I could not stand anything below 85Hz, and 100Hz was my happy place. You also had to keep in mind that some monitors, while they supported high refresh rates, didn't have the bandwidth to actually display them without making the screen blurry.

Yep. A CRT actually redraws the entire screen every pass, and 60Hz is too slow for that to look smooth, so it looks like it is flickering, at least on a desktop with white windows.

It could look good in motion on darker screens though. Most of us switched to 60Hz and vsync for games (to avoid tearing) and switched back to 75+ Hz for the desktop.

LCDs only update the parts of the screen that change, so for them, the flicker is not a problem.
 
LCD also "updates" the whole screen regardless of the content changing.

The main difference is that an LCD has constant brightness, while a CRT does not. On a CRT, brightness is achieved by an electron beam bombarding phosphorescent pixels and causing them to glow, but they fade with time; the longer it takes for the same pixel to be zapped again, the more time it has to dim between passes. And the brighter a pixel is, the more noticeable the fade is.
 