RTX 2080 Ti Struggling with Cyberpunk 2077’s Ray Tracing at 1080p, Even with DLSS 2.0?

Tsing

The FPS Review
Image: CD PROJEKT RED



CD PROJEKT RED hasn’t released official PC requirements for Cyberpunk 2077 yet, but they may turn out to be a real doozy, particularly for anyone who wants to enjoy the sci-fi RPG in its full splendor (i.e., with ray-tracing effects fully enabled).



Germany’s PC Games Hardware (PCGH), which was fortunate enough to play a recent build at one of yesterday’s press events, claims that the game doesn’t run at a consistent 60 FPS and is “jerky,” especially during driving sequences. Now, this is perfectly reasonable for a ray-traced game that’s still being optimized and bug-tested, but the hardware involved was NVIDIA’s flagship GeForce GPU, the RTX 2080 Ti – and not only that, the build was supposedly running at only 1080p...

Continue reading...


 
Would be nice to have a game that pushes hardware a bit more than usual.
 
What's interesting here is I watched another review that said it was smooth and that the frame rate somehow went higher during driving. So, conflicting answers. Though both stated 1080p with RT and DLSS 2.0.
 
No worries! The 3080 Ti will take care of it for a mere $1,599!
 
This is why there should be SLI.
It's about time someone released a game that kicks the current GPUs in the ***.
 
This is going to be NVIDIA's die-by-the-fire title this year and potentially next year. They will be optimizing the ever-loving crap out of this game with driver releases and everything else.
 
I was thinking this also. They really need to make this title work.
 
The problem is there are users like me who said... OK, I want to game at 1440p, not 4K+, so for me the 2080, not the 2080 Ti, made more sense (pre-SUPER launch). Now I'm looking at a game coming out that may be too much for my 2080. I seriously doubt the rest of my system is going to be any bottleneck for this game.

Though to be honest... I would have been more impressed if the demo had been run on a PS5, or even a PS4. That would have shown some real optimization, but I don't think NVIDIA wants that. And from all of the PR it's very clear this is going to be an NVIDIA-first title... though Microsoft and Sony might have something to say about that.
 
This is why there should be SLI.
It's about time someone released a game that kicks the current GPUs in the ***.
No worries! NVIDIA will take care of it in SLI for a mere $3,098 after a $100 mail-in rebate!!! The new 3080 Ti, with unicorn RTX, AGAIN the new future of gaming.
No **** it better run well on 2060-and-up cards; anything else is bananas... After all, RTX on the 2000 series was for the games that are coming.
 
CDPR is usually pretty good at making games that scale across different GPUs and platforms, but yeah, if you want the bells and whistles it usually takes top-end cards at release and for years after. I know it's a bit of a fallback to toss resolutions around, but what gets done at 1080p has changed a lot since it started becoming mainstream nearly 10 years ago. Back then you could get by with a 256 MB or 512 MB card for most things. These days the textures, shadows, lighting, and other calculations rival the CGI of some movies from back then, and they're being done on the fly instead of on render farms taking days. If we try to gauge by that history, then we have at least another 5 years to go before GPUs really catch up for 4K.

I do agree, though. It's a shame SLI is dead. Sure, it was a pain for game devs to work out, but at least it gave us consumers a path beyond what one GPU could do. I'll never forget how great 2 x 970s were at 1080p or 1440p, or when someone had 2 x 980 Tis outperform a Titan for less money, and so on. All the benching and tweaking I've done with RDR2 is a good example of how we already have games that will need at least another 2-3 generations before a single card can truly crush them at any resolution at maximum settings. Cyberpunk 2077 will be no exception. The Crysis remaster, probably the same. The trend of games demanding more than current tech can provide is far from new, either. That meme about Crysis started for a reason and stuck around for the same one. Many, many other games followed and did the same. The saddest part of PC gaming is that you can buy a game at release, but it could be another 3-5 years before you have the equipment to see it at its best.
 
Well, I mean, you CAN SLI your cards... a couple of Quadro RTX 6000s with an SLI'd combined 48 GB of video memory should be able to handle Cyberpunk 2077 at 4K with RT at a solid 30 FPS, right? And that's only 12 grand in video cards.
 
Unfortunately, that's not how it works: SLI doesn't pool VRAM, since each card holds its own copy of the frame data, so the memory isn't additive. I think you know that, but my sarcasm detector doesn't work.

The fact is, unless the developers implement explicit multi-GPU, SLI is effectively off the table. I'm also hoping the game has Vulkan support, as it tends to outperform DX12 by a fairly significant margin. The images with and without ray tracing indicate a substantial drop in graphics quality if you disable RTX. I'm glad the game will push hardware to a degree we haven't seen since the original Crysis. The unfortunate side effect is that we might not really get to see this game perform and look as good as it can for a graphics generation or two.
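
For anyone curious what "explicit multi-GPU" actually means in practice, here's a rough sketch in C of the Vulkan 1.1 side of it (purely illustrative on my part, nothing to do with CDPR's engine): the application has to create an instance, enumerate physical-device groups itself, and then decide how to split work across any linked GPUs it finds, instead of the driver quietly duplicating the work the way implicit SLI did.

/* Illustrative only: enumerate Vulkan physical-device groups, which is the
 * entry point for explicit multi-GPU. Assumes a Vulkan 1.1 loader/driver. */
#include <stdio.h>
#include <stdlib.h>
#include <vulkan/vulkan.h>

int main(void) {
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .apiVersion = VK_API_VERSION_1_1,        /* device groups need 1.1+ */
    };
    VkInstanceCreateInfo ici = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "couldn't create a Vulkan instance\n");
        return 1;
    }

    /* First call gets the number of groups; linked GPUs (e.g. an SLI bridge
     * or NVLink pair) show up as one group with physicalDeviceCount > 1. */
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, NULL);

    VkPhysicalDeviceGroupProperties *groups = calloc(groupCount, sizeof *groups);
    for (uint32_t i = 0; i < groupCount; ++i)
        groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups);

    for (uint32_t i = 0; i < groupCount; ++i)
        printf("group %u: %u GPU(s)%s\n", i, groups[i].physicalDeviceCount,
               groups[i].physicalDeviceCount > 1 ? " -- explicit multi-GPU possible" : "");

    free(groups);
    vkDestroyInstance(instance, NULL);
    return 0;
}

Everything past that point (splitting the frame, issuing work with per-device masks, compositing the result) is on the engine, which is exactly why so few studios bother anymore.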
 
Yeah, I know, I was mostly being snarky, @Dan_D. I'm chock full of snark as of late.
 