Dead Space Gets Scary CPU Requirements: AMD Ryzen 5 5600X or 11th Gen Core i5-11600K Recommended

Tsing

The FPS Review
Staff member
EA has shared a partial list of system requirements for Motive's new Dead Space remake on Steam, and they have been described by some as "scary."

 
I cannot wait for this game! It'll definitely be a day 1 buy for me. I love what I'm seeing so far from these videos. I know it's "not representative of final release" but it still looks gorgeous. Looks like they've stayed really close to the original in terms of looks and atmosphere.
 
I wouldn’t call that scary for requirements. Scary would be a recommended system with a Ryzen 7900X, 64 GB of RAM, and 1 TB of NVMe storage. A last-gen, lower-end part isn’t really a big deal.
 
The recommended CPUs are mainstream parts and outclass what the majority of PC gamers own, so I'm not surprised the requirements have been described as "scary". What other games have similar requirements?
 
I wouldn’t call that scary for requirements. Scary would be a recommended system with a Ryzen 7900X, 64 GB of RAM, and 1 TB of NVMe storage. A last-gen, lower-end part isn’t really a big deal.
Yeah, I wouldn't call it scary either. I'm sure my 3700X and 3070 Strix will do just fine for my humble 1440p setup.
 
Specs don't seem scary at all. By the time the game launches the recommended specs will be mid range last gen AMD hardware, and 2 gen old mid range Intel hardware. This doesn't stand out at all.

It feels too soon for a remake of this game. I mean, it only just came out in 2008, which in the grand scheme of things was like yesterday.

Either way, this game won't be one I will be playing. I haven't played any in the series and won't be touching this one either. I don't do 3rd person. I almost quit Far Cry 6 when it briefly went 3rd person in camps. I simply do not do that ****.
 
Specs don't seem scary at all. By the time the game launches the recommended specs will be mid range last gen AMD hardware, and 2 gen old mid range Intel hardware. This doesn't stand out at all.

It feels too soon for a remake of this game. I mean, it only just came out in 2008, which in the grand scheme of things was like yesterday.

Either way, this game won't be one I will be playing. I haven't played any in the series and won't be touching this one either. I don't do 3rd person. I almost quit Far Cry 6 when it briefly went 3rd person in camps. I simply do not do that ****.
Mid-range last gen? Is there a model lower than the 5600? I know there are APUs, but even there I thought the per-core performance was equal to the 5600's, and they just included a GPU.

As for 3rd person, agree 100%
 
We'll see how this pans out, but I recently saw how my old 4930K (@ 4.3 GHz) still managed to be a significant source of bottlenecks in Spider-Man Remastered. Ultimately I couldn't even enable RT with it. It barely manages 60 FPS at 4K with a 3090 Ti, while the newer rig pushes 90-110 FPS with a similar, but slightly faster, card. There are a bunch of other differences (SATA III SSD, 2133 MHz DDR3, older instruction sets/chipsets), but it was a good indicator that we're getting to the point where a demanding game will need a newer platform.

I think they're doing a roundabout way of saying you need a DDR4/DDR5, PCIe 4.0/5.0 platform with a moderately fast NVMe drive, a CPU and motherboard that won't bottleneck either the memory or the drive, plus a mid-to-high-end GPU. I'd say we're not only past quad cores now, but also getting past those older platforms as well. I'm sure less demanding games will be happy on them for some time to come, though.
 
We'll see how this pans out, but I recently saw how my old 4930K (@ 4.3 GHz) still managed to be a significant source of bottlenecks in Spider-Man Remastered. Ultimately I couldn't even enable RT with it. It barely manages 60 FPS at 4K with a 3090 Ti, while the newer rig pushes 90-110 FPS with a similar, but slightly faster, card. There are a bunch of other differences (SATA III SSD, 2133 MHz DDR3, older instruction sets/chipsets), but it was a good indicator that we're getting to the point where a demanding game will need a newer platform.

I think they're doing a roundabout way of saying you need a DDR4/DDR5, PCIe 4.0/5.0 platform with a moderately fast NVMe drive, a CPU and motherboard that won't bottleneck either the memory or the drive, plus a mid-to-high-end GPU. I'd say we're not only past quad cores now, but also getting past those older platforms as well. I'm sure less demanding games will be happy on them for some time to come, though.
When I finally replaced my 2600K with a 7700K years ago, I doubled the performance of the 1070 I had in it. So yeah, I think that's to be expected after a few generations; the IPC and I/O uplift is needed.
 
Mid-range last gen? Is there a model lower than the 5600? I know there are APUs, but even there I thought the per-core performance was equal to the 5600's, and they just included a GPU.

As for 3rd person, agree 100%

Well, I mean, the arch is the same within a generation, but usually the higher-end models are binned to clock higher and have more cores.

I just went over to Wikipedia now. There is apparently a Ryzen 5 5500 which has less cache, a slightly higher base clock and a lower boost clock than the 5600.

Honestly, I have so little interest in the lower models, that I didn't even realize they had dropped the non-APU Ryzen 3's in the Zen3 gen.

Still, the APU models are CPUs, and there are two Ryzen 3s which are technically lower end than the 5600, even if it's only in as much as them being 4C/8T. And that can be quite significant. It's not 2010 anymore. All else being equal, there can be quite the penalty in games for dropping below 6C/12T, and to be safe with a small amount of margin, I'd argue it's best to stick with 8C/16T or above.

I guess I think of i7s, i9s, Ryzen 7s, and Ryzen 9s as high-end models, i5s and Ryzen 5s as mid-range models, and i3s and Ryzen 3s as low end. Anything below that (when they pop up), like Pentiums, Celerons, and Athlons, is great for little utility boxes, but not something I'd put in a desktop.
 
The system recommendations are indeed quite modest if we assume the game is targeted at self-built PC enthusiasts who hang out in hardware forums. ;)

Zen 3 and Rocket Lake are platforms that would've been purchased within a couple of years of the game's launch. The introduction of a new platform means little until it's been available long enough to achieve significant adoption. I'm completely indifferent to the requirements of a game I don't even intend to play, but I think people are out to lunch if they think the requirements are or will be trivially met by the majority of systems. I don't think that's a bad thing; there are plenty of other games to choose from.

I'm interested to see whether the newer games coming out will push more PC gamers to consoles, where one is (at least in theory) guaranteed n years of platform support without having to worry about hardware and software compatibility issues. The cost of a decent PC for gaming has skyrocketed in recent years, and not only because of GPU prices.
Mid-range last gen? Is there a model lower than the 5600? I know there are APUs, but even there I thought the per-core performance was equal to the 5600's, and they just included a GPU.
The AMD Ryzen™ 5 5600X is classified as a "mainstream" part by AMD and priced accordingly, but I don't think that distinction is terribly important unless the extra cores from the higher end models would make a big difference. I'm not at all familiar with the game and so can't comment on that. 🤷‍♂️
 
I'm interested to see whether the newer games coming out will push more PC gamers to consoles, where one is (at least in theory) guaranteed n years of platform support without having to worry about hardware and software compatibility issues. The cost of a decent PC for gaming has skyrocketed in recent years, and not only because of GPU prices.
I think the two ecosystems will be around for quite a while longer, but I often wonder about this too, and I agree about the prices. With every generation of GPU for the last 5-10 years, I've heard someone complain about what they used to pay. Before scalpers, miners, and the pandemic this was already happening, even if it wasn't as extreme. Now we have the rest of the build going through similar price jumps (albeit not as extreme in most cases), and it is an expensive hobby. I remember when it was a pleasurable challenge to put together a $1K build that could do a lot. Now $1K basically gets you in the door with console-level performance.
 
Imo if you've got a better CPU than the minimum recommended you should be fine; it's the GPU that might cause more issues.
 
Imo if you've got a better CPU than the minimum recommended you should be fine; it's the GPU that might cause more issues.

That is certainly always the case for me, but I have been playing my games at 4k since 2015.

Before that I was at 2560x1600 since 2010.

And before that at 1920x1200 since 2005.

With the exception of like two years in 2008-2010 and another two years from 2013-2015 I've pretty much been seriously GPU limited for 17 years :p

But so many kids these days feel like they aren't getting the full gaming experience unless they're maxing out their 165 Hz monitors (which is ridiculous, by the way), which often shifts the bottleneck back to the CPU again...

Luckily I don't have that problem, which is why I could use my 6C/12T Sandy Bridge-E i7-3930k from 2011 until 2019, and my current Zen2 Threadripper 3960x is likely to be with me for a while.
 
Yeah, I'm going to ride my 5900X at least through this generation... I MIGHT get a new video card again though... sigh... I'm hopeless. lol.
 
That is certainly always the case for me, but I have been playing my games at 4k since 2015.

Before that I was at 2560x1600 since 2010.

And before that at 1920x1200 since 2005.

With the exception of like two years in 2008-2010 and another two years from 2013-2015 I've pretty much been seriously GPU limited for 17 years :p

But so many kids these days feel like they aren't getting the full gaming experience unless they're maxing out their 165 Hz monitors (which is ridiculous, by the way), which often shifts the bottleneck back to the CPU again...

Luckily I don't have that problem, which is why I could use my 6C/12T Sandy Bridge-E i7-3930k from 2011 until 2019, and my current Zen2 Threadripper 3960x is likely to be with me for a while.
We have a lot of similarities with those timelines and resolutions. Not exactly the same but pretty darn close. I totally agree with the ridiculous refresh rate dragons so many are chasing these days. I've got displays in the house ranging from 120, 144, and 200 Hz, and can honestly say I'm happy at 120.

Yeah, I'm going to ride my 5900X at least through this generation... I MIGHT get a new video card again though... sigh... I'm hopeless. lol.
I've been horrible with that since the 1080 Ti. I upgraded every GPU gen since and even did the step up to get to a 3090 Ti and double dipped for the Hybrid. I'm glad it's all paid off, but I have no intention of getting a new GPU with Ada (although the rumored specs sound impressive). At this point my eyes are on the horizon for a new build so I can retire the 4930K rig. I'm honestly seeing some significant bottlenecks at 4K with max RT settings now. I know it's hard to believe, but even at 4K 60 Hz, older CPUs can now struggle with maxed-out ray tracing effects in games. I'm guessing there must be some extra workload in prepping those frames with all the extra RT calculations. I'm even limiting frames, lol, to help it out a bit.

I miss 16:10 sooooooooooooooooooooooo much. Gawd do I miss it.
I've got a Samsung 25" 1920x1200 monitor that I repurposed for work. I can't remember the model #, but we picked it up around 2006-2008 during Black Friday at Best Buy. I was looking for a smaller flatscreen for the bedroom and gaming and came across it in the monitor section. It was cheaper and better than the Samsung TVs at the time and was actually marketed as a monitor with a built-in TV tuner (it had VGA, DVI-D, and multiple HDMI ports). It was the last display my old P4 rig was ever used with. Some fun times with that thing, but now it's helping my old eyes with the hours I have to spend on website portals and databases for the day job.
 
We have a lot of similarities with those timelines and resolutions. Not exactly the same but pretty darn close. I totally agree with the ridiculous refresh rate dragons so many are chasing these days. I've got displays in the house ranging from 120, 144, and 200 Hz, and can honestly say I'm happy at 120.


I've been horrible with that since the 1080 Ti. I upgraded every GPU gen since and even did the step up to get to a 3090 Ti and double dipped for the Hybrid. I'm glad it's all paid off, but I have no intention of getting a new GPU with Ada (although the rumored specs sound impressive). At this point my eyes are on the horizon for a new build so I can retire the 4930K rig. I'm honestly seeing some significant bottlenecks at 4K with max RT settings now. I know it's hard to believe, but even at 4K 60 Hz, older CPUs can now struggle with maxed-out ray tracing effects in games. I'm guessing there must be some extra workload in prepping those frames with all the extra RT calculations. I'm even limiting frames, lol, to help it out a bit.


I've got a Samsung 25" 1920x1200 monitor that I repurposed for work. I can't remember the model #, but we picked it up around 2006-2008 during Black Friday at Best Buy. I was looking for a smaller flatscreen for the bedroom and gaming and came across it in the monitor section. It was cheaper and better than the Samsung TVs at the time and was actually marketed as a monitor with a built-in TV tuner (it had VGA, DVI-D, and multiple HDMI ports). It was the last display my old P4 rig was ever used with. Some fun times with that thing, but now it's helping my old eyes with the hours I have to spend on website portals and databases for the day job.
I’m still using my 2005 1920x1200 SyncMaster 2443bwx. 1920x1200 is the largest supported resolution on my dock for non primary displays, and I love having the extra vertical space for a monitor used for email / slack.
 
I miss 16:10 sooooooooooooooooooooooo much. Gawd do I miss it.

Yeah, 16:10 is still my favorite aspect ratio.

Unfortunately it doesn't make sense to have two different panel aspect ratios that are so close. 16:10 would be much lower volume than 16:9, and thus much more expensive. There are a few professional IPS 16:10 screens left, but not many, and they are - as expected - more expensive than their same size/resolution 16:9 equivalents.

It's sad, but it is the reality of manufacturing and economies of scale :(

The funny thing with all of this is that I keep making the same mistake.

In 2005 when my trusty old 22" Iiyama Visionmaster Pro CRT died, I bought a Dell 2405FPW, a 24" 1920x1200 screen. My GeForce 6800 GT was instantly unable to handle the resolution in games; it struggled with titles like Half-Life 2. I had previously used CRTs and was used to just dropping the resolution if needed, but that was not a great approach on LCDs, especially the early ones with their poor upscaling compared to today's.

This actually led to me almost entirely dropping out of games. I had planned on upgrading the GPU and kind of moving on, but then life happened, and from like 2005 until like 2009 I barely played any games at all, and didn't even upgrade my desktop.

I did eventually upgrade my old Athlon 64 system to an i7-920 in a Shuttle SFF all-in-one case, but because I was utterly out of gaming, I didn't even bother sticking a real GPU in it. I got a weak-sauce GeForce 9200 GT just for basic display output. Then one day when I was bored, I fired up Civilization IV, and I got hooked on games again.

The 9200 GT could barely handle Civilization IV at 1920x1200, but not much more than that, so I got a fanless Radeon HD 5750 (maybe I'd play a little bit of games?). It made Civ4 more playable and even got me through Half-Life 2.

And then I shot myself in the foot and once again upgraded to a screen with a resolution so high it was difficult to drive. I got my 30" Dell U3011, a 2560x1600 screen in 2010, and then spent the next few years trying to un-GPU limit myself.

It was quickly apparent it wasn't up to the task of 2560x1600 in anything new. I picked up a GTX 470 and, as random luck would have it, got a golden-sample overclocker that beat out GTX 480s at stock clocks. At max overclocks it could just barely get me to 60fps at 2560x1600 in many titles, but it fell flat in some newer titles I wanted to play. It was a constant struggle of tweaking for playable settings.

Within a few months I decided I needed a GTX 580 (2560x1600 was really hard to drive), but I struggled to provide it with enough power. I was using a Shuttle SFF case at the time, and there simply weren't large enough power supplies for that small form factor. I tried using a drive-bay supplemental power supply but it was really ****ty. I sold the GTX 580 because I simply couldn't handle it in my system at the time and went back to the 470, but I immediately started scheming. I decided I was going to have to ditch the small form factor and build myself a real PC again.

At this point we were halfway through 2011, and the hype train for the upcoming Bulldozer CPU from AMD was in full force.

I decided to build a Phenom II X6 1090T into a 990FX motherboard (the new chipset specifically for Bulldozer that launched about 6 months before the CPU). That way, I thought, I could just drop in a Bulldozer at launch and things would be awesome, because Bulldozer was going to be CRAZY! Eight beefy cores? Can you imagine? :p

I got a big fat three-slot Asus DirectCU II Radeon HD 6970. It improved things, but 2560x1600 was still a struggle.

I decided that the single 6970 wasn't cutting it. There really wasn't much that was faster on the market at the time, though, so I got a second three-slot DirectCU II 6970 and ran them in CrossFireX. Two GPUs, six slots. It was pretty packed in there. Seen below with the weirdo MainGear/CoolIT 180mm AIO radiator in one of those 180-degree-rotated SilverStone cases. The large fans, large radiator, and huge GPUs just throw the scale off. This thing was bigger than it looks.

[attached image]

Then Bulldozer hit and was a massive disappointment. I just wanted something that would work well. My philosophy ever being "I don't do downgrades," I decided I wanted 6 cores, just as with the Phenom II, so I ordered a hexacore Sandy Bridge-E i7-3930K, which I was able to overclock to between 4.8 and 5.0 GHz depending on the season (winter temps got me to 5 GHz, but in the summer I had to drop to 4.8). The thing was an absolute beast for 2011.

The positive here was that the X79 platform was much better able to handle the CrossFire setup. I think - at least at the time - utilizing CrossFire put more of a load on the CPU than a single GPU did, and that just worked better on the beastly X79.

But it still wasn't enough for a consistent 60fps in all titles at 2560x1600. On top of that, CrossFire was buggy in many games and didn't really deliver the scaling I had hoped for.

So, in early 2012 I bought a Radeon HD 7970 at launch. Peak framerates, and even average framerates, were a little lower than the two 6970s in CrossFire, but 1%/minimum framerates saw a huge boost, which made most titles I was into much more playable. I was almost there; I just needed to squeeze a little bit more out of the 7970. I decided I was going to ghetto-mod it and attach it to an AIO cooler to better cool it and overclock the crap out of it. My ghetto mod successfully fit a Corsair H80 to it, but in doing so I slipped with my screwdriver, cut some traces on the 7970's board, and was never able to repair it. I was bummed out and went without a GPU for several weeks until the GTX 680 launched, and I bought it at launch.

The 680 was similar to the 7970. Almost at my magical consistent 60fps level, but not quite.

FINALLY, by 2013, a product launched that got me all the way there: the original Kepler Titan. $1K was a bit much for a GPU (little did I know what was to come in a few years' time) and I felt irresponsible spending it on a GPU, but it FINALLY solved my consistent-60fps-at-2560x1600 problem. I had been fighting this 2560x1600 monitor for almost 3 years and finally wound up with a solution that worked.

It only took me 2 more years to completely forget about the new high resolution early adopter penalty and do it all over again in 2015, when I upgraded to a 4K TV (Samsung JS9000) as my primary monitor.

This time I was a little bit wiser, so I skipped the early steps. I went straight for dual 980 Tis, and since I wanted to be able to overclock the snot out of them, I got Corsair's official HG10 N980 "mount your Corsair AIO to your GPU" bracket for the 980 series, as well as two 140mm Corsair H90s, one for each GPU. (I installed them very carefully this time :p )

It was tricky to make it fit in my new Corsair 750D Airflow case, but I hollowed out some old fans and made plenums/spacers out of them to distance the radiators from the front of the case, so the tubes were long enough :p

[attached images]

It took a while to find the really long screws that would go through the case, through the outer fan, through two butchered fans acting as plenums and into the radiators, but once I did it worked quite nicely. Sadly I don't think I have any pictures of the final results. :(

I even figured out how to create my own wire harnesses to read the PWM signal off the GPU and feed it to 140mm fans in push-pull, two per GPU allowing the GPU to still control the fan speed.

[attached images]

Strangely enough, I can't find any pics of the completed build.

I struggled a little, as Corsair had some tolerance issues with these brackets, but I eventually got everything working.

And just like before, while Nvidia's SLI worked better than CrossFire, it was only ALMOST enough for 4K :p

Once again, the Titan came to the rescue. In 2016 the Pascal Titan X launched and I jumped on it right away, this time deciding it was time for a custom water loop.

[attached image]

I was once again (just barely) making my 60fps minimum requirement work, this time at 4K. But it didn't last long. As new titles came out, I was quickly struggling with framerates again and being GPU limited. Eventually the CPU started aging out on me as well. I think it was Far Cry 3 where the CPU actually kept me from hitting 60fps in some scenes with lots of **** going on.

The i7-3930K got upgraded to a Threadripper in late 2019 (which was a long saga of problems thanks to ****ing Gigabyte and their defective rev 1.0 Threadripper motherboards that fried CPUs). This required a fresh water loop in a new case.


[attached image]


And then late last year, the Pascal Titan X gave way to an XFX Speedster 6900 XT with a preinstalled EK water block, which just barely fit with the existing loop layout :p

[attached image]

Now I am back in my comfort zone again. With the overclock on this puppy, I am consistently getting above 60fps even in newer, more demanding titles. But this probably won't last long.

I think I see a 4090 in my future (unless they are stupid-priced)


One thing is for sure: through all of these trials and tribulations, I have finally learned my lesson. Never again will I early-adopt a new, higher resolution. Not that it really matters anyway; I don't see any need to ever exceed 4K.

I don't ever want to struggle with just barely getting the minimum frame rate again.


****, that wound up being a larger post than I had planned. Back to work :p
 
I’m still using my 2005 1920x1200 SyncMaster 2443bwx. 1920x1200 is the largest supported resolution on my dock for non primary displays, and I love having the extra vertical space for a monitor used for email / slack.
I'm at work right now, sitting in front of it, and it is a SyncMaster. :)

SyncMaster T260HD
 