Console Graphics Have “Plateaued,” Former PlayStation President Says: “Only Dogs Can Hear the Difference”

Tsing

The PlayStation 6, and whatever Microsoft decides to call its next-generation Xbox console, will not deliver graphics any better than what the current generation of hardware already offers, according to the latest thoughts from Shawn Layden, an industry veteran known in part for having helped lead the PlayStation brand as President and CEO of Sony Interactive Entertainment America from April 2014, before leaving years later for what he admitted was "exhaustion."

See full article...
 
This is one of those nonsense statements that these suit types put out when they crave attention. Just go back to counting your money and leave the technical stuff to people who know what they're talking about, please.

How dumb can you be for a CEO? How would natural light in your room interact with the in-game scene on screen, you absolute muppet?

Further proof that most rich people don't get rich because they are clever; they get rich because they were lucky.
 
I think you could say "graphics" have plateaued. There's not really any reason to differentiate between console and PC.

I mean, sure, they have gotten better recently, but by extremely small increments. The latest advances have mostly been in frame generation, which hasn't really been about fidelity or better resolution; it's been about boosting frame rate.
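As a rough illustration of that point, here is a minimal sketch of what frame generation does structurally (Python/NumPy, a naive blend; real implementations such as DLSS Frame Generation and FSR 3 use motion vectors and optical flow, which this deliberately omits):

```python
import numpy as np

# Toy stand-in for frame generation: synthesize an in-between frame by
# blending two fully rendered frames. Every rendered pair yields an extra
# displayed frame, so frame rate roughly doubles while fidelity does not
# improve - no new scene information is created.
def interpolated_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    return (1.0 - t) * frame_a + t * frame_b

frame_n  = np.zeros((720, 1280, 3), dtype=np.float32)  # rendered frame N
frame_n1 = np.ones((720, 1280, 3), dtype=np.float32)   # rendered frame N+1
generated = interpolated_frame(frame_n, frame_n1)      # the "free" frame
print(generated.mean())  # 0.5: halfway between the two rendered frames
```

A plain blend like this ghosts badly in motion, which is exactly why the real techniques lean on motion vectors instead.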

Years ago, it was a GPU for playing at 640x480 vs 1024x768 vs ... all the way up to 4K - and that was the game, all the way until maybe the last two generations. You chased resolution, that increased fidelity, and you found some middle ground with an acceptable frame rate. There were some toys that played into that along the way - anti-aliasing was probably the biggest one - but by and large, playable screen resolution was the biggest difference between a budget GPU and a top-tier GPU.
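To put rough numbers on that chase (plain Python arithmetic, nothing vendor-specific):

```python
# Back-of-the-envelope: raw pixels per frame at the resolutions named above.
resolutions = {
    "640x480":        (640, 480),
    "1024x768":       (1024, 768),
    "1920x1080":      (1920, 1080),
    "3840x2160 (4K)": (3840, 2160),
}
base = 640 * 480
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>15}: {px:>9,} px/frame ({px / base:5.1f}x the 640x480 workload)")
# 4K pushes ~27x the pixels of 640x480, so the resolution chase was
# largely a fill-rate chase.
```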

Today, there are pretty much just two resolutions. Others exist (1440p, ultrawides, 8K, etc.), but they are definitely in the minority. Pretty much everyone assumes it's either going to be ultra-high-frame-rate 1080p or 4K, and it's just a matter of how well we can cheese the AI/frame generation to get frame rates to something playable.
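For what that "cheese" looks like structurally, here is a toy upscaler (Python/NumPy; real upscalers such as DLSS, FSR, and PSSR use temporal data and learned or hand-tuned filters, so treat this as a sketch of where the saved work comes from, not of how they actually filter):

```python
import numpy as np

# Shade fewer pixels internally, then resample to the display resolution.
def upscale_nearest(frame: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    in_h, in_w = frame.shape[:2]
    ys = np.arange(out_h) * in_h // out_h   # map each output row to a source row
    xs = np.arange(out_w) * in_w // out_w   # map each output col to a source col
    return frame[ys[:, None], xs[None, :]]

internal = np.random.rand(1080, 1920, 3).astype(np.float32)  # shaded at 1080p
output = upscale_nearest(internal, 2160, 3840)               # presented at 4K
print(output.shape)  # (2160, 3840, 3)
print(f"shading cost: {1920 * 1080 / (3840 * 2160):.0%} of native 4K")
```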

The game shifted from actual fidelity to cheating fidelity.

So yeah, I think they are right.
 
The game shifted from actual fidelity to cheating fidelity.
This is not a new thing; most rendering methods are about cheating in some way. Shadow maps are the cheat for real-time shadows, cube maps are the cheat for real-time reflections. Consoles have been using upscaling and checkerboard rendering for a while now too. AI upscaling is just the newest cheat (see the shadow-map sketch at the end of this post).
So yeah, I think they are right.
If you read the whole statement, not just the quote here, it is nonsense. The guy literally says ray tracing has no effect if you play in a room with sunlight.

There is some truth to the diminishing-returns statement, but it is not the result of a plateau; it is the result of developers getting incompetent and lazy and using less efficient code, which results in worse performance with no real improvement in graphics fidelity.
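To make the shadow-map cheat above concrete, here is a minimal sketch of the per-fragment depth test it boils down to (Python standing in for shader code; the depth-buffer values and bias are made-up toy numbers):

```python
import numpy as np

# Shadow mapping: render the scene's depth from the light's point of view
# once, then shade each pixel by comparing its light-space depth against
# that stored depth, instead of tracing a ray back to the light.
def shadow_test(shadow_map: np.ndarray, u: float, v: float,
                fragment_depth: float, bias: float = 1e-3) -> bool:
    """True if the fragment is lit, False if it is in shadow."""
    h, w = shadow_map.shape
    tx = min(int(u * w), w - 1)   # nearest-neighbour texel lookup
    ty = min(int(v * h), h - 1)
    return fragment_depth <= shadow_map[ty, tx] + bias

depth_from_light = np.array([[0.2, 0.9],
                             [0.9, 0.9]])  # tiny fake light-depth buffer
print(shadow_test(depth_from_light, 0.1, 0.1, 0.5))  # False: occluder at 0.2 blocks the light
print(shadow_test(depth_from_light, 0.9, 0.9, 0.5))  # True: nothing between fragment and light
```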
 
There is some truth to the diminishing-returns statement, but it is not the result of a plateau; it is the result of developers getting incompetent and lazy and using less efficient code, which results in worse performance with no real improvement in graphics fidelity.
How much do most developers have to do with graphics at this point? Isn't something like 90%+ of the industry using either Unreal Engine 5 or Unity? I don't do any game development, but I understand that Unity is supposed to sacrifice performance for cross-platform compatibility and ease of use, while UE5 is for dedicated professionals on larger teams.
 
Brute-force rendering is long gone. You can double compute shaders/texture units/memory/bandwidth, but you get diminishing returns. Also, there's only so much you can do with "optimizations" and "tricks" like tile rendering, color/memory/Z compression, culling, etc. (a toy culling example is at the end of this post). Upscaling/frame gen is here to stay.

I just wish those were optional and not required to get 4K/60+ in most current games.
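As a toy illustration of the culling trick mentioned above (one assumed clip plane rather than a full six-plane frustum, with made-up coordinates):

```python
import numpy as np

# Frustum culling: skip objects whose bounding sphere lies entirely outside
# a camera plane, so the GPU never spends work on them. Each plane is
# (normal, d), with the visible side satisfying dot(normal, p) + d >= 0.
def is_visible(center: np.ndarray, radius: float, planes) -> bool:
    for normal, d in planes:
        if np.dot(normal, center) + d < -radius:
            return False  # sphere is fully behind this plane: cull it
    return True

# Single toy "near plane" facing +z, located at z = 1
planes = [(np.array([0.0, 0.0, 1.0]), -1.0)]
print(is_visible(np.array([0.0, 0.0, 5.0]), 1.0, planes))   # True: in front of the plane
print(is_visible(np.array([0.0, 0.0, -3.0]), 1.0, planes))  # False: behind it, culled
```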
 
How much do most developers have to do with graphics at this point? Isn't something like 90%+ of the industry using either Unreal Engine 5 or Unity? I don't do any game development, but I understand that Unity is supposed to sacrifice performance for cross-platform compatibility and ease of use, while UE5 is for dedicated professionals on larger teams.
They might not code the renderer itself, but if it were all up to the engine, then performance and graphics fidelity would be similar across all games using the same engine. We know that's not the case. An engine is just a foundation, like a car manufacturer's platform that will see all kinds of cars built on it.
 
Partly I want the 5090 for better game performance... but I also want it to run my own local AI better. Same reason my next build will have a minimum of 96 GB of RAM, 128 if I can find a decent speed-matched pair of 64 GB DDR5 sticks.
 