NVIDIA GeForce RTX 3090 Can’t Run Cyberpunk 2077 at 60 FPS in 4K Ultra without DLSS

Tsing (The FPS Review staff member)
[Screenshot: Cyberpunk 2077 purple club scene. Image: CD PROJEKT RED]
Is Cyberpunk 2077 the modern Crysis? Tom’s Hardware has shared some early benchmarks for CD PROJEKT RED’s latest masterpiece, and it appears to be extremely demanding at the 4K Ultra setting—so much so that NVIDIA’s flagship GeForce RTX 3090 can’t hit 60 FPS, even with ray tracing turned off. If there’s any title that truly requires DLSS, this appears to be it, as green team’s BFGPU is completely crushed at the native maximum preset when RTX is enabled.



You can check out the benchmarks below, which are mildly confusing based on...

Continue reading...
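For anyone who thinks in frame times rather than frame rates, here's a quick sketch of the math behind the headline. The sub-60 averages in the loop are hypothetical placeholders for illustration, not figures from the Tom's Hardware benchmarks:

```python
# Frame-time view of the "can't hit 60 FPS" claim. The FPS values fed into the
# loop are hypothetical placeholders for illustration, not the article's data.

def frame_time_ms(fps: float) -> float:
    """Average milliseconds the GPU can spend on one frame at a given FPS."""
    return 1000.0 / fps

TARGET_FPS = 60
budget = frame_time_ms(TARGET_FPS)
print(f"60 FPS budget: {budget:.2f} ms per frame")

for fps in (45, 50, 55):  # placeholder sub-60 averages
    ft = frame_time_ms(fps)
    print(f"{fps} FPS average -> {ft:.1f} ms per frame ({ft - budget:.1f} ms over budget)")
```

DLSS claws that budget back by rendering internally below 4K and upscaling, which is why the article treats it as effectively mandatory here.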


 
Nothing like a little clickbait regarding the most anticipated release of the year to start out the week. Shame, because there is still some genuinely good reporting done at Tom's occasionally. Say what you will about Tech Jesus, but I think he is one of the few honest tech reviewers left out there aside from The FPS Review.
 
Hahaha.....caught publishing data that doesn't mean sh!t.
But really now, it's supposed to be demanding; that's what used to be the norm.
How many times did you replay a video game from even 5 years ago on new hardware, just to see how it was designed to look?
At any rate, it's all vaporware at present anyway.
Review sites ought to publish data on hardware that folks actually HAVE.
 
They need to review what is released so people, when they can get one, know what they are getting and can form an informed opinion. Beyond that, the flavor videos and super-overclocking articles are less helpful.
 
Let's see what happens after the game actually releases and the 49 GB day-1 patch is applied... and new game-ready video drivers, of course.
 
Hahaha.....caught publishing data that doesn't mean sh!t.
But really now, it's supposed to be demanding; that's what used to be the norm.
How many times did you replay a video game from even 5 years ago on new hardware, just to see how it was designed to look?
At any rate, it's all vaporware at present anyway.
Review sites ought to publish data on hardware that folks actually HAVE.
It's tomshardware, what did you expect? :LOL: :LOL: :p :p ;) ;)
 
How many times did you replay a video game from even 5 years ago on new hardware, just to see how it was designed to look?
Too true! People need to grow up and get a clue. All of the Witcher games absolutely crushed the GPUs of their time. It wasn't until I got a 2080 Ti that I could even play W3 at 4K/60+ FPS with max settings and mods. W2 with ubersampling (essentially 8K) still brings it down, but that also has to do with it being DX9.0c.

The only thing I ever doubted, and still doubt for most games, is the recommended/required specs. One thing a reviewer on Tom's did say is that perhaps CDPR's recommendations were for those looking at a 30-40 FPS experience, and I hate to say it, but I agree. Publishers do seem to consistently understate, and even withhold, what they mean by PC specs. The specs seemed woefully underpowered for this game, and I doubt patching will improve FPS by more than maybe 20-30%. Not to mention, remember when CDPR took a bad rap for the difference in textures between the original W3 footage and how it looked at release? People can't have it both ways, wanting the best-looking game possible and then expecting current hardware to play it at its best. Sure, optimizations will need to be made, but there's more to the story than that.

As far as calling the 3090 an 8K card, well, most of us already knew that would only apply to games that demand very little. All the hype with Death Stranding should've been a good clue, when they talked about the impressive 4K numbers seen with RTX 2070s and 2060s. Those who want more than the Fortnite experience have years to go before a true 8K card is here. Honestly, it wasn't until the latest gen of cards from AMD and NVIDIA that we really started seeing great 1440p experiences, so 4K still involves compromises.
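To put the 8K talk in perspective, here's the raw pixel arithmetic (standard resolutions, nothing measured from the game):

```python
# Pixel counts at common resolutions; plain arithmetic, nothing measured in-game.

resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = resolutions["1440p"][0] * resolutions["1440p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 1440p load)")

# 4K is 2.25x the pixels of 1440p, and 8K is 4x the pixels of 4K (9x 1440p),
# which is why a card that barely manages 4K is nowhere near a true 8K card.
```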
 
But really now, it's supposed to be demanding; that's what used to be the norm.
How many times did you replay a video game from even 5 years ago on new hardware, just to see how it was designed to look?
Agree! Going waaay back, I remember needing upgrades to really handle, say, the various Quakes and Unreals. Crysis was the last game I remember that really pushed hardware - give me titles so demanding that they inspire me to try multi-GPU, as that makes playing with hardware so much more interesting.
 
Agree! Going waaay back, I remember needing upgrades to really handle, say, the various Quakes and Unreals. Crysis was the last game I remember that really pushed hardware - give me titles so demanding that they inspire me to try multi-GPU, as that makes playing with hardware so much more interesting.
Before The Witcher 3, Metro 2033 was the last one I remember.
 
Well, years back, CPU and GPU generations brought huge improvements. A year-and-a-half-old computer was old; a three-year-old computer was obsolete. So you could publish a game that pushed hardware hard and know that the hardware would catch up soon, and that the jump would be big enough that most of your players were going to upgrade shortly.

Now... you just don't have that same leap in performance generation over generation. Users are holding onto hardware longer, and why would a publisher push the minimum performance barrier if many (most?) gamers aren't going to upgrade any time soon to get there?
 
Well, years back, CPU and GPU generations brought huge improvements. A year-and-a-half-old computer was old; a three-year-old computer was obsolete. So you could publish a game that pushed hardware hard and know that the hardware would catch up soon, and that the jump would be big enough that most of your players were going to upgrade shortly.

Now... you just don't have that same leap in performance generation over generation. Users are holding onto hardware longer, and why would a publisher push the minimum performance barrier if many (most?) gamers aren't going to upgrade any time soon to get there?
What motivation do they have to upgrade if the publishers don't push the engine? I rode my 680 and 1080 Ti longer than any other video cards I've ever owned, because the games just didn't really try to push the envelope. I wouldn't be trying to upgrade my 1080 Ti except that I want to increase my Folding@Home output - the 1080 Ti plays pretty much everything at 2560x1600 without issue. Of course, I'm eyeing a ROG Swift PG32UQX as my next monitor, and if I snag one of those, then I'll have several generations of buying the best available video card ahead of me.
 
Well, years back, CPU and GPU generations brought huge improvements. A year-and-a-half-old computer was old; a three-year-old computer was obsolete. So you could publish a game that pushed hardware hard and know that the hardware would catch up soon, and that the jump would be big enough that most of your players were going to upgrade shortly.

Now... you just don't have that same leap in performance generation over generation. Users are holding onto hardware longer, and why would a publisher push the minimum performance barrier if many (most?) gamers aren't going to upgrade any time soon to get there?
The generational leaps back then really were not much different from today's. Historically, generational leaps between video cards have averaged 30-40% for the 15 years I have data for. It has actually been trending up over the last few generations, believe it or not, primarily thanks to the jumps to Pascal, Ampere, and Navi.

CPU performance is a different story, but there are too many data points for me to want to compile in my free time on that front.
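For a rough sense of how that 30-40% figure compounds over time (the percentage is the poster's average; the compounding is just arithmetic):

```python
# Compounding a fixed per-generation uplift; the 30-40% range is the figure
# quoted above, the compounding itself is just arithmetic.

def cumulative_uplift(per_gen: float, generations: int) -> float:
    """Total speedup after a fixed fractional gain repeated each generation."""
    return (1.0 + per_gen) ** generations

for per_gen in (0.30, 0.40):
    for gens in (2, 3, 4):
        print(f"{per_gen:.0%} per gen over {gens} generations -> "
              f"{cumulative_uplift(per_gen, gens):.2f}x total")

# Even at 30% per generation, three generations compound to roughly 2.2x, which
# is how a game that crushes today's flagship ends up comfortable a few years on.
```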
 
Just waiting on 256-bit computing. That should be a massive bump.
 
Nothing new here; people want the best graphics but then also want it to run on their integrated potato. Well, you can't have it both ways. I actually prefer a future-proof game to one that runs at 60 FPS on a mid-range card; that means there is untapped potential. Ever since GTA IV I've been fighting this battle.
 
Nothing scientific to report since I didn't have any OSD running and only had about 10-20 minutes to play, but it seemed to play just fine on my 3700X/3090 rig. I checked settings and everything defaulted to Ultra; it even has 32:9 support out of the box. I left DLSS on auto. Honestly, it seemed pretty smooth. I'm sure the guys will do an in-depth review, but that was my first impression. I haven't even updated to the game-ready driver yet, but I probably got the patch. I'll definitely be playing more this weekend and testing on both my displays as well. DLSS here seems to be the more advanced version, as it seemed to be working at 5120x1440, while so far RT in any game I've tested does not. You can usually tell by it being greyed out, along with the huge performance hits. Can't wait to see some of those funny glitch bugs everyone's talking about, though. :)
 
Nothing scientific to report since I didn't have any OSD running and only had about 10-20 minutes to play, but it seemed to play just fine on my 3700X/3090 rig. I checked settings and everything defaulted to Ultra; it even has 32:9 support out of the box. I left DLSS on auto. Honestly, it seemed pretty smooth. I'm sure the guys will do an in-depth review, but that was my first impression. I haven't even updated to the game-ready driver yet, but I probably got the patch. I'll definitely be playing more this weekend and testing on both my displays as well. DLSS here seems to be the more advanced version, as it seemed to be working at 5120x1440, while so far RT in any game I've tested does not. You can usually tell by it being greyed out, along with the huge performance hits. Can't wait to see some of those funny glitch bugs everyone's talking about, though. :)
I started to notice some visual downgrades when DLSS turned super aggressive going outside with it set to Auto, so I need to do some tweaking of the settings to be able to run it on Balanced. Ray tracing looks very well done here, but it starts to look weird when DLSS goes into Ultra Performance mode. Some post-processed effects and objects also start to look jaggy.

And by the way, it seems like the game is defaulting to the Ultra preset for almost everyone no matter their hardware configuration. I don't think the game is detecting your hardware to automatically set recommended settings.
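For reference, here's a sketch of the approximate internal render resolutions per DLSS mode, using the per-axis scale factors commonly published for DLSS 2.x; they are approximations assumed here, not values pulled from Cyberpunk 2077 itself:

```python
# Approximate internal render resolutions for each DLSS mode. The per-axis
# scale factors are the commonly published DLSS 2.x values, used here as an
# assumption; they are not numbers pulled from Cyberpunk 2077 itself.

DLSS_SCALE = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Internal render target before the DLSS upscale back to output resolution."""
    return round(width * scale), round(height * scale)

for out_w, out_h in ((3840, 2160), (5120, 1440)):
    for mode, scale in DLSS_SCALE.items():
        w, h = internal_resolution(out_w, out_h, scale)
        print(f"{out_w}x{out_h} {mode:>17}: renders at ~{w}x{h}")

# Ultra Performance at 5120x1440 works out to roughly 1707x480 internally,
# which fits the jaggy post-processing and aggressive look described above.
```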
 
I started to notice some visual downgrades when DLSS turned super aggressive going outside with it set to Auto, so I need to do some tweaking of the settings to be able to run it on Balanced. Ray tracing looks very well done here, but it starts to look weird when DLSS goes into Ultra Performance mode. Some post-processed effects and objects also start to look jaggy.

And by the way, it seems like the game is defaulting to the Ultra preset for almost everyone no matter their hardware configuration. I don't think the game is detecting your hardware to automatically set recommended settings.
I wondered if autodetect was working right. Seemed a little strange considering the performance issue reports. Really looking forward to digging into it this weekend.

edit: I did notice some texture pop-in or changes happening as I looked around the landscape. Could be a little related to what you saw with DLSS as well.
 
I wondered if autodetect was working right. Seemed a little strange considering the performance issue reports. Really looking forward to digging into it this weekend.

edit: I did notice some texture pop-in or changes happening as I looked around the landscape. Could be a little related to what you saw with DLSS as well.
We have a discussion going on in the PC Gaming thread about asset loading probably being tied to your drive speed. It seems like there are glitches with assets loading in even when the game is installed on a PCIe NVMe drive, and it gets worse the slower your drive is. The options menu has a setting for "Slow HDD", if that is any indication.
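As a purely hypothetical back-of-the-envelope illustration of why streaming would scale with drive speed (every number and name below is made up for illustration, not taken from the game or its "Slow HDD" option):

```python
# Hypothetical streaming-budget math: how much asset data a drive can deliver
# per frame. All throughput numbers and the chunk size are invented for
# illustration; this is not how the game is known to work internally.

FRAME_TIME_S = 1 / 60           # time budget per frame at 60 FPS
ASSET_CHUNK_MB = 4              # hypothetical average streamed chunk size

drive_throughput_mb_s = {
    "HDD ('Slow HDD' mode)": 120,
    "SATA SSD":              500,
    "PCIe NVMe SSD":         3000,
}

for drive, mb_s in drive_throughput_mb_s.items():
    mb_per_frame = mb_s * FRAME_TIME_S
    chunks_per_frame = mb_per_frame / ASSET_CHUNK_MB
    print(f"{drive:>22}: ~{mb_per_frame:.1f} MB/frame, ~{chunks_per_frame:.1f} chunks/frame")

# A drive that can only deliver a fraction of a chunk per frame has to spread
# loads across many frames, which would show up as the pop-in mentioned above.
```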
 