Cyberpunk 2077’s Ray Tracing: Overdrive Mode Brings RTX 4090 to Its Knees with DLSS Off at 16 FPS in 4K

Peter_Brosdahl

Yesterday NVIDIA released a new video showcasing Cyberpunk 2077’s Ray Tracing: Overdrive Mode which also showed how demanding it can be. In a don't-blink-or-you'll-miss-it moment, NVIDIA showed how even its flagship graphics card struggles with the new mode.

The GeForce RTX 4090 is brought to its knees when DLSS is turned off with the game running at 4K, dropping to a staggering 16 FPS. However, when DLSS 3, aka Frame Generation, is turned on, the RTX 4090 recovers gracefully to impressive averages ranging from over 110 FPS to 131 FPS in 4K.

See full article...
 
Not surprised here. This is looking like a next gen feature.
 
It's path tracing on a brand-new, already-demanding game. I'm surprised it isn't in the single digits.

It is likely pretty close to playable with just DLSS Performance mode at 4K.
 
It's path tracing on a brand-new, already-demanding game. I'm surprised it isn't in the single digits.

It is likely pretty close to playable with just DLSS Performance mode at 4K.
So it's likely playable at a lower resolution with fake frames injected. Got it. Just a matter of what upscaler you prefer.
 
The GeForce RTX 4090 is brought to its knees when DLSS is turned off with the game running at 4K, dropping to a staggering 16 FPS. However, when DLSS 3, aka Frame Generation, is turned on, the RTX 4090 recovers gracefully to impressive averages ranging from over 110 FPS to 131 FPS in 4K.

Shame they can't use all the processing power they spend on DLSS for real frames, even if it came out lower than DLSS.
 
I kind of had a feeling this was going to be the case back when they announced this a month or so ago. I hoped I was wrong, but oh well. I'm going to be keeping an eye out for the reviews, though.
 
I kind of had a feeling this was going to be the case back when they announced this a month or so ago. I hoped I was wrong, but oh well. I'm going to be keeping an eye out for the reviews, though.
Given how much impact path tracing had on Quake 2, one should not be surprised to see modern games struggle a bit more.
 
So it's likely playable at a lower resolution with fake frames injected. Got it. Just a matter of what upscaler you prefer.
I don't understand what you mean by this.

Frame Generation generates frames between frames (fake frames)
DLSS reconstructs images from lower resolutions using multiple points of information (reconstructed real frames)

They can be combined but are separate solutions.

NVIDIA really shouldn't have used DLSS 3 branding for Frame Generation. It should have been called DLFG and treated as its own specific tech.
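
To make the distinction concrete, here's a toy sketch in plain Python/NumPy. The stand-in math is mine and is nothing like the real implementations; the point is just that upscaling reconstructs each rendered frame at a higher resolution, while frame generation inserts new frames between rendered ones, and the two steps stack.

```python
import numpy as np

# Toy stand-ins only -- not NVIDIA's API. Real DLSS also uses motion
# vectors and frame history; real Frame Generation uses optical flow.

def upscale(frame, factor=2):
    """Reconstruct a higher-resolution frame from a lower internal one
    (here: naive nearest-neighbor repeat)."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

def generate_intermediate(prev_frame, next_frame):
    """Synthesize a frame between two rendered frames -- the 'fake'
    frame (here: a plain average)."""
    return (prev_frame + next_frame) / 2

rendered = [np.random.rand(4, 4) for _ in range(3)]  # low-res "real" frames
upscaled = [upscale(f) for f in rendered]            # DLSS-style step

output = []
for a, b in zip(upscaled, upscaled[1:]):
    output += [a, generate_intermediate(a, b)]       # FG-style step
output.append(upscaled[-1])

print(len(rendered), "rendered frames ->", len(output), "displayed frames")
```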
 
I don't understand what you mean by this.

Frame Generation generates frames between frames (fake frames)
DLSS reconstructs images from lower resolutions using multiple points of information (reconstructed real frames)

They can be combined but are separate solutions.

NVIDIA really shouldn't have used DLSS 3 branding for Frame Generation. It should have been called DLFG and treated as its own specific tech.

It's probably just me, but I HATE the marketing that DLSS is this great feature that unlocks games... it's playing at a lower resolution and upscaling the image, just doing it in a way that tries to maintain or increase fidelity in the game. It in no way is actually gaming at the higher resolution with increased frame rates.

ALSO frame generation is simply generating additional frames to make something FEEL smoother that isn't actually being rendered more smoothly.
 
Expected this. We don't have desktop hardware capable of real time path tracing.
 
I think that's DLSS running at 720p and upscaling to 4k. I'm sure it looks FIIIIIINE....
It's 1080p internally, and all video footage was captured using 4K DLSS Performance, so what you see is what you get.
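
For context on what Performance mode means in raw pixels (simple arithmetic, nothing vendor-specific):

```python
# 4K DLSS Performance renders internally at 1080p, so the GPU shades
# a quarter of the pixels it would at native 4K.
native = 3840 * 2160    # 8,294,400 pixels
internal = 1920 * 1080  # 2,073,600 pixels
print(native / internal)  # 4.0
```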
 
It's 1080p internally, and all video footage was captured using 4K DLSS Performance, so what you see is what you get.
I would debate that. You're talking about taking a 1080p image, upscaling it to 4K, THEN putting it through YouTube's compression algorithm. I would say what you see is to be taken with a grain of salt, as it may be better or worse than what you get depending on a lot of things... basically, if you can't test it in person, maybe don't use this to guide your purchase.
 
I would debate that. You're talking about taking a 1080p image, upscaling it to 4K, THEN putting it through YouTube's compression algorithm. I would say what you see is to be taken with a grain of salt, as it may be better or worse than what you get depending on a lot of things... basically, if you can't test it in person, maybe don't use this to guide your purchase.
Hard agree - though it's probably as good as you're going to get unless you have access to the losslessly compressed captures that were used to generate the YouTube video. I hear over and over that rendering 1080p to 4K and uploading that to YouTube provides better results than just uploading a 1080p source video.
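
For anyone who wants to try that trick on their own captures, the usual approach is an upscale pass with ffmpeg before uploading. A rough sketch via Python's subprocess (file names and quality settings here are placeholders, and it assumes ffmpeg is installed):

```python
import subprocess

# Upscale a 1080p capture to 4K before uploading; YouTube assigns
# higher-resolution uploads a higher-bitrate tier, which is why this
# can look better than uploading the 1080p source directly.
subprocess.run([
    "ffmpeg",
    "-i", "capture_1080p.mp4",               # placeholder source file
    "-vf", "scale=3840:2160:flags=lanczos",  # Lanczos upscale to 4K
    "-c:v", "libx264", "-crf", "18",         # high-quality H.264 encode
    "-c:a", "copy",                          # pass the audio through
    "upload_4k.mp4",
], check=True)
```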

Expected this. We don't have desktop hardware capable of real time path tracing.
Well, not when bolted on to an existing game engine, at least. Where we're at now, it is probably possible to do real-time path tracing with a dedicated engine, but as we've seen that doesn't seem economical to do (yet).

So it's likely playable at a lower resolution with fake frames injected. Got it. Just a matter of what upscaler you prefer.
If you're running a 600W 4090 with sub-ambient cooling (these people exist), then yeah, I get not really liking the algorithmic and duplicative shortcuts used to increase framerate.

But if you're not one of those folks, then these techniques improve the end-user experience. They're not perfect, but neither are the alternatives of significantly lowering resolution, detail, frame rates, or some combination thereof.
 
If you're running a 600W 4090 with sub-ambient cooling (these people exist), then yeah, I get not really liking the algorithmic and duplicative shortcuts used to increase framerate.

But if you're not one of those folks, then these techniques improve the end-user experience. They're not perfect, but neither are the alternatives of significantly lowering resolution, detail, frame rates, or some combination thereof.

Actually, looking at this... it depends on how you built your system and where you are in the upgrade lifecycle.

I suspect many of us upgrade major pieces at a time, not in an all-at-once sort of pattern. Like my setup: I'll upgrade CPU/motherboard/RAM and potentially storage, then do the video card, sell the old video card, and put that toward a monitor, sort of thing.

That's literally how I moved into 1440p gaming. That, and getting this 32-inch AOC monitor through Craigslist for a decent deal.

Now I'm looking at a 4k oled as my next big upgrade... maybe. I'm still a sucker for hardware.

Maybe I'll do a 4090 or something like that... I'm kind of split and it depends on what the wife and I have going on at the time.

But my point... I'm at a happy medium right now where my video card output fits what my display technology supports. Gaming is entertaining and fun on my setup. And I don't feel a NEED for better... yet.
 
After thinking about this for a while:

I'm not upset about DLSS, even DLSS 3. As a lot of folks here have pointed out - it's frames and having a better experience, no matter how you get there, that should be important. And, in select titles where it's implemented well, DLSS can do that.

But -- my beef: nVidia has locked it to their brand, and, furthering that beef, they are now generation-locking different parts of it. If I were a developer, I would run far, far away from DLSS just because of this - even though nVidia has the lion's share of the market.

That said, I'm not sure you can make a valid comparison between DLSS and FSR. You ~should~ expect DLSS to produce a better result when both are available, because DLSS has super-secret first-party access to all parts of the video card from the manufacturer, whereas FSR is a generic solution that is vendor agnostic. That doesn't necessarily make DLSS "better" - try running DLSS on the 7900XTX or Arc 770 and see how it does compared to FSR - you'd see those charts look a lot different when you have to stick big goose eggs into your geometric average of frame rates.
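
To spell out the goose-egg point with toy numbers (made up purely to illustrate the math):

```python
import math

# One zero entry (a card that can't run DLSS at all) drags the whole
# geometric mean to zero, where an arithmetic mean would just dip.
fps_with_dlss = [120, 95, 0]   # third card has no DLSS support
fps_with_fsr = [100, 90, 85]   # FSR runs on everything

def geo_mean(xs):
    return math.prod(xs) ** (1 / len(xs))

print(geo_mean(fps_with_dlss))  # 0.0
print(geo_mean(fps_with_fsr))   # ~91.5
```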

Is DLSS exciting? Well, kind of. It's spun off competition, FSR and XeSS, so I think there's something there. I'm not all that excited about it because it requires developer implementation, so it's not universally faster, it has tradeoffs, and it's vendor locked ~and~ generation locked. I just hate these proprietary, locked solutions. As a customer they make me feel trapped by a purchase, not glad I made the purchase, nor do I feel like I'm missing out.

Also, it makes for some really bad marketing claims. The whole "Ada is 3x faster" crap, when that only applies when a game can use DLSS 3. And now we are already starting to see hints of that with the 5000 series - I just saw a "2.6x faster" rumor, and I can't wait to find out it only holds when a game can use DLSS 4, which will almost certainly be locked to the 5000 series, and they will use that leverage to only manufacture high-end, high-cost cards for as long as they can milk it. And that capitalistic milking leaves a really bad taste in my mouth as well - I hate as a customer knowing I'm getting taken advantage of.
 