Unreal Engine 5.6 Makes Its Debut with New Footage from the Witcher IV Tech Demo, Showcasing Improved 60 FPS Ray Traced Global Illumination

Peter_Brosdahl

CDPR unveiled tech demo footage for The Witcher IV at Unreal Fest 2025, showing off the latest engine's capabilities on a PS5. Basically, UE 5.6 sounds like it has had a major overhaul intended to decrease development time for games, movies, and other animation tasks, and it now includes device-specific profiles.

See full article...
 
“First and foremost — what we presented at the State of Unreal 2025 is not gameplay footage from The Witcher 4. It’s a standalone tech demo, built to showcase the Unreal Engine 5 features and tools we’re using to develop the game’s open world.”
Why do they insist on making life difficult for themselves? Many are already dunking on them for this disclaimer, and rightly so. The majority of viewers will never read the disclaimer and will feel like they were misled when the gameplay looks nothing like this.

If it's not representative of what the game will be, just don't use the Witcher branding and characters for it. Make it an off-brand tech demo.
 
Yeah, I think it's a little dangerous setting such a high bar ("within our 60FPS budget!") given just how rough CDPR console releases have been performance-wise.

On the other hand, it's also a bit promising, isn't it?
 
I don't mind if they have "a" target for current gen consoles as long as there's a clear path for PC and not just a straight console port. The last two games so obviously targeted the PC demographic that each needed years before hardware could catch up, and I really learned to enjoy that. As much as I've enjoyed Horizon and GoW, I'll be very bummed if that's the peak of what to expect. I think this tech demo does look good, but it does, somewhat, give me that vibe. Still, I'm hoping for the best.

The amount of time I've spent on this franchise is ridiculous, well over 1,000 hours in all, and I've used pretty much every TW3 Halk Hogan mod since version 8 or 9 (I honestly don't remember anymore), so I'm pretty confident I know what I'm looking for. I was able to spot some very familiar-looking textures in this demo video, but I was also impressed with the lighting and shadows. We'll just have to wait and see, and I, like many others I'm sure, have high expectations for this next game.

Meanwhile, the hype for UE 5.6 seems promising, and I hope it delivers on its promises to make it easier for developers to do more while being less taxing on hardware. I'm not against UE unless it does the opposite, and that has seemed to be more true than not. You pretty much have to hit it with the biggest hammer in the shed to get the most out of each game release, and although it can be nice watching games age like a fine wine, there are also many where it just doesn't seem optimal to begin with. To be fair, UE is far from the only engine with that problem.
 
We still can't run Cyberpunk fully cranked at 4K120, right?

I expect that TW4 will be slimmed to just barely hit their target on consoles - and that there'll be plenty of knobs to crank the insanity (and framerate and resolution and detail) on PC
 
We still can't run Cyberpunk fully cranked at 4K120, right?

I expect that TW4 will be slimmed to just barely hit their target on consoles - and that there'll be plenty of knobs to crank the insanity (and framerate and resolution and detail) on PC
I mean one can only hope. :) And I love the Cyrix reference in your sig... many don't remember that company and the many patents they held.
 
This is nothing new; every now and then we get a tech demo, and everyone goes crazy expecting it to become a real game. That never happens, at least not with all the bells and whistles.

I recall multiple times when "console games" were actually running on ultra high-end PCs; at least this one is running on an actual console. I'm pretty sure there's some dynamic LOD, upscaling and then some.
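By "dynamic LOD, upscaling and then some" I mean the usual feedback loop: measure the GPU frame time, then nudge the internal render resolution up or down to stay inside the 60 FPS budget and let the upscaler fill in the rest. A toy sketch of that idea (my own illustration with made-up numbers, not anything from UE or CDPR):

Code:
// Hypothetical dynamic-resolution loop; not actual engine code.
#include <algorithm>
#include <cstdio>

struct DynResScaler {
    float scale = 1.0f;                       // fraction of native res per axis
    static constexpr float targetMs = 16.6f;  // 60 FPS frame budget

    void Update(float gpuFrameMs) {
        // Nudge the render scale toward whatever hits the frame budget.
        scale = std::clamp(scale * (targetMs / gpuFrameMs), 0.5f, 1.0f);
    }
};

int main() {
    DynResScaler scaler;
    for (float ms : {14.0f, 18.5f, 21.0f, 16.4f}) {  // fake GPU timings
        scaler.Update(ms);
        std::printf("gpu %.1f ms -> render at %.0f%% of native\n",
                    ms, scaler.scale * 100.0f);
    }
}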
 
I mean one can only hope
What if I hope against it? I mean UE has been cranking up polygon count and detail in places where it matters very little for image quality, when the same or better result could be achieved on a smaller computation budget by optimizing. UE's approach is brute force, using tricks to make up for what the hardware lacks. They are hand in hand in this with nvidia, and I think it is a terrible direction, especially for PC gaming. You spend more on graphics hardware for less benefit than ever before compared to consoles.
 
What if I hope against it? I mean UE has been cranking up polygon count and detail in places where it matters very little for image quality, when the same or better result could be achieved on a smaller computation budget by optimizing. UE's approach is brute force, using tricks to make up for what the hardware lacks. They are hand in hand in this with nvidia, and I think it is a terrible direction, especially for PC gaming. You spend more on graphics hardware for less benefit than ever before compared to consoles.
Actually, polygon count is the least of the problems; GPUs have had polygon-crunching power to spare for several generations now. It's shaders, textures, and more recently RT that take the cake. Do you remember when anisotropic filtering and AA brought GPUs to their knees?
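The reason poly count scales better than people assume is that Nanite-style systems never draw more triangles than the screen can resolve: roughly, each simplification level stores a geometric error, and the coarsest version whose error projects to under about a pixel gets picked. A toy version of that selection (my own illustration with made-up numbers, not Epic's code):

Code:
// Toy screen-space-error LOD pick; illustrative only, not Nanite.
#include <cmath>
#include <cstdio>
#include <vector>

struct Lod {
    int   triangles;
    float errorMeters;  // geometric error of this simplification
};

// Project a world-space error to a pixel count at a given distance.
float ErrorInPixels(float errMeters, float distMeters) {
    const float fovY = 1.0f, screenH = 2160.0f;  // assumed camera/display
    return errMeters / (distMeters * std::tan(fovY * 0.5f)) * (screenH * 0.5f);
}

// Coarsest LOD whose error stays under ~1 pixel on screen.
const Lod& PickLod(const std::vector<Lod>& lods, float dist) {
    for (const Lod& l : lods)                // ordered coarse -> fine
        if (ErrorInPixels(l.errorMeters, dist) < 1.0f) return l;
    return lods.back();                      // fall back to the finest
}

int main() {
    std::vector<Lod> lods = {{500, 0.1f}, {4000, 0.01f}, {32000, 0.001f}};
    for (float d : {5.0f, 50.0f, 500.0f})
        std::printf("%.0f m away -> %d triangles\n", d, PickLod(lods, d).triangles);
}

The point being: the triangle count you pay for at runtime stays tied to how many pixels the object covers, not to the raw source poly count.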
 
I'm pretty sure there's some dynamic LOD, upscaling and then some.
They go into some of that in the video, and this is just basic optimization stuff these days, right?

They are hand in hand in this with nvidia, and I think it is a terrible direction, especially for PC gaming
This was done on a console with an aging AMD APU... which means that if ported to PC, it'll run well on just about anything.
 
They go into some of that in the video, and this is just basic optimization stuff these days, right?


This was done on a console with an aging AMD APU... which means that if ported to PC, it'll run well on just about anything.

Yeah, that's what logic would lead you to believe... and then reality hits hard. ;) :D :p
 
Actually, polygon count is the least of the problems; GPUs have had polygon-crunching power to spare for several generations now. It's shaders, textures, and more recently RT that take the cake.
That's a myth; you can't seriously believe that you can increase poly count ad infinitum with no consequences. Polygon topology very much matters and affects a ton of other things, like lighting and texturing, as well.

And that is exactly what Nanite is doing: adding a ton of complexity to scenes. Polygons don't just exist in a vacuum, separate from every other rendering pass.
Do you remember when anisotropic filtering and AA brought GPUs to their knees?
From the first days I used to crank AF up to 16x because it had very little impact on performance, and I was always dumbfounded why anyone would go without it, so I can't say I remember AF ever bringing GPUs to their knees. But this is beside the point. Am I understanding correctly that you're saying there's no need for optimization because AA used to tank performance 20 years ago?
 