Crytek: No Current GPU Can Run Crysis Remastered Above 4K/30 FPS at Max Settings

Peter_Brosdahl

Image: Crytek



The long wait is over. Crysis Remastered has officially been released today, and now, everyone will have the opportunity to see what improvements have been made. After a tumultuous road, more than a few people are curious to see how modern hardware will fare with the various upgrades made to the PC version. Yesterday, there was a Crysis Remastered pre-launch stream. In it, folks got to see some real gameplay in action. We also got to hear a little bit more about the new “Can It Run Crysis?” mode.









Well, the stream brought us back to 2008 again. It seems the developer's rig was challenged at both 1080p and 4K, ultimately unable to provide a consistent 60 FPS average at either resolution using the new max settings. The stream was broadcast in 1080p, which affects the visual fidelity of the presentation, but here's a screen capture from the demo...



 
DirectX 12 can certainly help, but it is good to see that this version, much like the original, was created with future hardware in mind. That is why the original Crysis still looks good 13 years later.

Saw a post somewhere saying that a 2080 Ti was seen running "Can it run Crysis" detail mode at 38 FPS in 4K. So a 3080 should be able to do it at 50 FPS.
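As a back-of-envelope check on that guess (the 38 FPS figure is from the post above; the roughly 30% 3080-over-2080 Ti speedup is an assumed ratio, not a measurement):

```python
# Rough FPS scaling sketch. The 38 FPS 2080 Ti result is the figure
# quoted above; the 1.3x speedup for the 3080 is an assumption.
baseline_fps = 38        # reported 2080 Ti, 4K, "Can It Run Crysis?" mode
assumed_speedup = 1.3    # hypothetical 3080-over-2080-Ti ratio
estimate = baseline_fps * assumed_speedup
print(round(estimate))   # -> 49, in line with the ~50 FPS guess
```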
 
Also, different areas of the game naturally perform differently, so transparency is needed on where people are testing, or whether they are using the built-in benchmark, which includes two test scenarios.
 
Is it really an achievement to put out software that can churn process cycles?

The question is: is it really because Crysis is going to run so beautifully that it needs those cycles, or did the devs just throw some random inefficient code in there to slow things down because people expect Crysis to bring a computer to its knees?

They should have just released a 100% ray traced edition and said “there you go, suck on that for a while”
 
Is it really an achievement to put out software that can churn process cycles?

The question is: is it really because Crysis is going to run so beautifully that it needs those cycles, or did the devs just throw some random inefficient code in there to slow things down because people expect Crysis to bring a computer to its knees?

They should have just released a 100% ray traced edition and said “there you go, suck on that for a while”
There's nothing inefficient about purposefully inserting code to slow a program down. Why would anyone do such a thing? Sounds like a ridiculous question to pose.

Before the Crysis 2 tessellation controversy comes up, Crytek quickly fixed that in a patch. Their explanation as to why it shipped that way was understandable, as I could see a simple value or dev flag being missed or overlooked when finalizing the release build.
 
inefficient code in there to slow things down
Not so much that, but they did say that the "Can It Run Crysis?" mode sets some view distances to unlimited. Basically, the more pixels you have, the more it's going to try to render. After tinkering a lot with Kingdom Come: Deliverance, which also uses CryEngine and has a lot of view distance settings, I can verify that they can be exceptionally taxing when maxed out.
 
There are rendering techniques I hope they implement to optimize things but I'm glad they are releasing a mode that will push hardware to the breaking point just because.
DLSS should also be killer when it's in.
 
There are rendering techniques I hope they implement to optimize things but I'm glad they are releasing a mode that will push hardware to the breaking point just because.
DLSS should also be killer when it's in.
That's kind of how I feel about RDR2 on PC. That game will be crushing cards for at least two more gens in 4K. I'm looking forward to having a 3090 at some point, but I've got no illusions about it not being the one that'll fully support these games, or Cyberpunk 2077, or about a half dozen others already out. It's one of the best things about gaming on a PC: you can see your games get better over time with new hardware even when they don't need a remastered edition.
 
I'm going to grab this right now and probably start using it in my reviews. I'm interested to see how it performs on a variety of CPUs given the way the original game behaved. As I recall, it worked well on dual-core CPUs, but quad-core CPUs saw virtually no benefit. It's been a very long time since I've done testing with it, but that's what I recall from back in the day.

EDIT: Purchased and downloading now.
 
Been playing with the game since last night. The benchmark mode has two different run-throughs, and you can change the game settings and the ray tracing setting independently. It runs each demo four times, as fast as possible. I think it will be pretty good; I might add it to GPU reviews. The game is only DX11, though. It's multi-thread aware, but I'm not sure it uses more than 4 cores or 8 threads; my 3900X was pretty much at 20% utilization at most. But more testing is warranted.
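For what it's worth, that utilization figure squares with the 4-core/8-thread guess. A quick sketch of the arithmetic (the 12-core/24-thread count is the 3900X's published spec; the 20% figure is the observation above):

```python
# A Ryzen 9 3900X exposes 12 cores / 24 logical threads. 20% overall
# CPU utilization therefore corresponds to roughly 4-5 fully busy
# threads -- consistent with a game that doesn't scale past ~8 threads.
logical_threads = 24
observed_utilization = 0.20
busy_threads = round(logical_threads * observed_utilization, 2)
print(busy_threads)  # -> 4.8, i.e. about five threads' worth of work
```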
 
It's kind of a moot point when you deliberately make it run slow but have nothing to show for it.
 
There's nothing inefficient about purposefully inserting code to slow a program down. Why would anyone do such a thing? Sounds like a ridiculous question to pose.
Because Crytek is on the brink of bankruptcy and has been for a while. Anything to try and bring back their glory days with member berries.
The code being inefficient only means that it adds features that give little practical or visual benefit but demand a lot of computational power. I assume this is what they are doing.
 
Because Crytek is on the brink of bankruptcy and has been for a while. Anything to try and bring back their glory days with member berries.
The code being inefficient only means that it adds features that give little practical or visual benefit but demand a lot of computational power. I assume this is what they are doing.
That is what the Epic sellout was for (guaranteed money). These days, publicized "bad" performance numbers seem to turn away more potential buyers than they attract, so if anything, they would be actively driving away customers if they purposefully sabotaged their game.
 
Been playing with the game since last night. The benchmark mode has two different run-throughs, and you can change the game settings and the ray tracing setting independently. It runs each demo four times, as fast as possible. I think it will be pretty good; I might add it to GPU reviews. The game is only DX11, though. It's multi-thread aware, but I'm not sure it uses more than 4 cores or 8 threads; my 3900X was pretty much at 20% utilization at most. But more testing is warranted.
Thanks Brent!
 
We'll just have to wait and see. We've already got word today about that 3080 test, and it pulled the same FPS at 4K that whatever card was used in this stream did at 1080p. To me, that sounds like a 3090 could get really close to 60 FPS, and perhaps, with some updates, actually get there.
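For context on why matching a 1080p result at 4K is a meaningful jump, 4K pushes exactly four times as many pixels per frame. A trivial sketch:

```python
# Pixel counts for the two resolutions discussed in the thread.
pixels_4k = 3840 * 2160      # UHD "4K": 8,294,400 pixels
pixels_1080p = 1920 * 1080   # Full HD:  2,073,600 pixels
print(pixels_4k // pixels_1080p)  # -> 4: each 4K frame is 4x the work
```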

I say let the numbers speak for themselves, because the bottom line is there are a lot of people who love to hate, and they've got a lot of energy to do it with, too. I'm not expecting miracles with this game, and at the $20 I paid, I'm happy with whatever it offers. The truth will be whatever it ends up being, but I'm not about to jump on any bandwagon until I see for myself. I'm sure we've all got stories about a vendor, or a review, or some YT twit that trashed something we were interested in, only to find out that it worked fine for us. I will admit, though, I wasn't overly impressed with the video, but that's my personal opinion and shouldn't reflect on the game at all. Considering it was a half hour long, it seems like they could've gone into more depth.
 
That is what the Epic sellout was for (guaranteed money). These days, publicized "bad" performance numbers seem to turn away more potential buyers than they attract, so if anything, they would be actively driving away customers if they purposefully sabotaged their game.
It's not sabotage; they are doing it to stay relevant, to be in people's minds. Epic money is one thing, but it doesn't make you relevant.
Any developer could add features to their game that would make it run like a dog on current hardware.

What I mean is that it must also look better than everything else to justify running worse than every other game. Crysis did when it came out; I'm not so sure about this one.
 
Honestly, what are they doing to the quality settings to tax cards so much? 16x shadows, 200% draw distance, 64x tessellation? If you put everything to medium, does it look the same and perform better?
 