4K Gaming Is a “Useless” Gimmick Designed to Sell New Tech, Says Playdead Co-founder

Tsing
The FPS Review · Staff member · Joined May 6, 2019 · Messages: 12,871 · Points: 113
Resolution and graphics fidelity are paramount to the gaming experience, right? Not so, according to Dino Patti. During last week's Reboot Develop conference, the Playdead (Limbo, Inside) co-founder suggested that 4K is merely a gimmick for forcing hardware upgrades: "I think it's useless," he said. "It's something that's being pushed by tech vendors to push the next console.

I want to select a path by myself and not be drawn in by money from tech vendors. There’s a lot of people that see something shiny and follow that. You can definitely see all the herd walking one way and falling.

[Manufacturers] don’t respect the natural evolution of games and you force technology. Microsoft with Kinect, they paid studios to make a lot of ****ty games.


Other industry veterans at the event had similar thoughts. Klang Games co-founder Guðmundur Hallgrímsson, who is working on a new title called Seed, claimed that graphics don't matter because the most dedicated gamers "lower all settings in order to get the maximum performance." And Chet Faliszek, a Valve writer credited on Portal, Left 4 Dead, and Half-Life 2, said graphics don't "change the game in a meaningful way."
 
I would partially agree in certain circumstances. The people who turn all settings down are usually e-sports competitors looking for an extra edge when no rules specify which resolution and settings must be used, or players using low resolutions to make hitboxes appear bigger. Personally, I think there is a lot to be said for the immersion and atmosphere a better-looking scene can create. I don't know about 4K, but I enjoy playing at 1440p, and being able to make out specific shadows, brush, and so on is a huge part of the experience. Then there's audio: good sound can make a huge difference in the gaming experience.
 
> I would partially agree in certain circumstances. The people who turn all settings down are usually e-sports competitors looking for an extra edge when there are no rules specifying which resolution/settings should be used, or people using low resolutions to make hit boxes bigger. Personally, I think that there is a lot to be said for the immersion and atmosphere that can be created by a better looking scene, I don't know about 4k, but I enjoy playing at 1440p, and being able to see specific shadows, brush, etc, are a huge part of the experience; then there are the sounds, good sound can be a huge difference in the gaming experience.
Yeah, 4K is an issue for the market as a whole. Gaming should have just jumped to 1440p first, but that comes down to performance vs. cost.
 
Isn't gaming in the first person all about immersion?
I play FPS almost exclusively, I want the scene to be as real as it can be.
We don't settle for "sort of seeing things" in real life, right? That's why we have glasses and contact lenses.
Why should the digital world be different?
 
I like my eye candy. I went from an RX 570 8GB to an RX 580, then a GTX 1080, and now an RX 5700 XT. I want as high an FPS as I can get, but eye candy takes precedence over that (unless it becomes a slideshow, then I'll sacrifice some eye candy).
 
One in every crowd, I tell ya. Honestly, I've been hearing someone rant about every new high resolution to come down the road since the '90s. Who needs anything beyond 640x480? 1024x768? 1080p? 1440p? 4K? At the moment I do agree about 8K, but only because it's so unfeasible on so many levels. For 4K, I think it depends. As more than a few in this thread have already said, sometimes frames per second are more important, and some people simply aren't that obsessed with a game's visual detail.

Myself, I love my eye candy. I love it even more when the hardware exists to drive it and the developers support it in their textures and assets. Is it for everyone? No, of course not, but I don't consider it a gimmick either. Like I said, someone's always ranting about how another high resolution isn't needed, every new generation. Seriously, go back and play Pong, Dino.
 
The only ones who ever complain about higher resolutions are game developers.
 
If you're willing to pay bleeding-edge prices for hardware to play at bleeding-edge resolutions, that's fine, but that's a niche. The majority of people game at 1080p. I would say 1440p is starting to pick up steam, at least in the PC realm, as high-refresh 1440p displays come down in price and more mid-tier cards are able to push them. 4K is still a niche; if you want to push 4K, you'll be spending $1,000+ on a card to do it.

That being said, I don't know anyone who plays online games at 4K and cares about winning. Consoles are trying to cater to it because 4K TVs are cheap as ****. It's just not realistic.
 
I'm of two minds about this.

For general desktop use - I absolutely enjoy 4K. Or, to be much more specific, I enjoy the PPI that 4K can bring to the table - text and graphics are much sharper, and it makes a big difference to me.

For gaming - I honestly can't see much difference. The higher the frame rate, the faster the motion on screen, and the less detail and resolution matter, since I'm focused more on the movement than on the small details. Sure, if you take screenshots you can drill down and notice those details again, but while the frame is moving, I honestly can't tell a whole lot of difference between a game played at 4K and one at 1080p.

I have more issues with 4K on older games than anything else. I don't regret getting a pair of 4K monitors; for what I do (a mix of gaming and productivity), they work out. But if I had to do it over again, I would have split them: one 4K for productivity and one high-refresh 1440p strictly for gaming.

That being said, I'm not about to drop from 4K to 1440p. I'll also mention that I'm gaming on a GTX 980 and play mostly MMOs and older games, so frame rate hasn't been an issue for me, and I'm willing to budge on settings as needed to keep it up. So far that hasn't required much compromise; surprisingly, it's still worked out well.
 
What a moron. Outside of competitive multiplayer games, I think this guy is way off. The people who aren't running 4K either don't want to pay for it, can't pay for it, or haven't experienced it on a large enough display.
 
> What a moron. Outside of competitive multiplayer games, I think this guy is way off. The people who aren't running 4K, don't want to pay for it, can't pay for it or haven't experienced it on a large enough display.
I'm in the "Poor white trash can't afford it right now and haven't experienced for myself yet" category :cautious:

But I'd sure like to ... someday
 
I fall under the I don't want to pay for it category. I'm happy with 1440P.
 
> I'm in the "Poor white trash can't afford it right now and haven't experienced for myself yet" category :cautious:
>
> But I'd sure like to ... someday
Been there. Took better part of twenty years before I could get to having the rigs I have now so I totally get it.

> I fall under the I don't want to pay for it category. I'm happy with 1440P.

Nothing wrong with that. That's why I have two rigs: one is 1440p and the other is 4K. Before that it was 1080p and 1440p, and so on; each higher-end rig ends up becoming the champ of a lower resolution as time goes by. TBH, though, I spent a lot of time last year looking for a 1440p TV with HDR, to no real avail.

One thing I feel the need to say is that judgment of 4K can be really skewed if it's not experienced in a quality way. The diversity of panel quality is sometimes beyond belief. Presently I have four different panels in the house, and only one really sold me on the tech. Looking at displays in stores is pretty much useless, since there's no real guarantee how one will look until it's home, properly hooked up (the right cables are a must), and calibrated. I don't necessarily mean calibrated by a professional, since only you really know what you want out of it, and sometimes that can take weeks to dial in. Having a pro do it can help, but that's not always the end of it.

1. HiSense 55" with HDR. At moments this can look good, but HDR is only available through an HDMI device that pushes it; the smart apps lost the ability after numerous updates. Tried it with my 4K rig and the quality wasn't really there. This was a cheap TV a little over two years ago at $400.

2. LG 55" non-HDR with 3D. Our first 4K TV, which we got at Sam's around 3-4 years ago for ~$500. Overall not bad, but each HDMI port has bizarre limitations between 4K/HDCP/color depths. We still use it, and it will even display 4096x2160 4:4:4 8-bit, but contrast levels are mixed.

3. LG 31.5" 4K monitor, my first and probably only 4K monitor. Native 4096x2160, true 10-bit color, 60Hz, non-HDR. I love it for its clarity and colors, and for a while it was my favorite for gaming, but it doesn't even compare to the last one on the list. Paid $1,099 for it.

4. Sony Z9D 65" HDR with 3D. This has become the reference display in the house and is actually the only display now connected to my 4K rig. This one showed me how good 4K really could be (once I figured out all the calibration). It's a 2016 model that Sony only stopped making in January 2019. Got one on sale for $1,700, down from $5,000.
 
I have a 4K monitor, and it kind of sucks for fast-paced shooters, but I don't really care about those. I just lower the resolution scale to 75-85% to get 60 FPS in most games and go from there.
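For anyone curious why a 75-85% resolution scale helps so much, a rough sketch (not from the thread, just arithmetic): the scale applies per axis, so the number of pixels actually rendered, and with it roughly the shading cost, falls with the square of the scale.

```python
# Rough sketch: how "resolution scale" reduces rendered pixel count.
# The scale applies per axis, so pixel count drops with the square of it.

def scaled_pixels(width: int, height: int, scale: float) -> int:
    """Pixels actually rendered at a given per-axis resolution scale."""
    return round(width * scale) * round(height * scale)

native = scaled_pixels(3840, 2160, 1.0)    # 8,294,400 pixels at native 4K
at_75 = scaled_pixels(3840, 2160, 0.75)    # 2880 x 1620 = 4,665,600 pixels

print(f"75% scale renders {at_75 / native:.0%} of native 4K pixels")
# → 75% scale renders 56% of native 4K pixels
```

So dropping the slider to 75% nearly halves the pixel workload while keeping the 4K UI and output resolution, which is why it is such an easy lever for hitting 60 FPS.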
 
> I have a 4K monitor and it kind of sucks for any fast paced shooters, but I don’t really care about those. I just lower resolution scale to 75-85% to get 60FPS in most games and go from there.

That was the thing for me. I mostly play shooters and 60Hz without G-Sync / FreeSync was getting annoying. I hated having to run V-Sync to avoid the screen tearing. Mostly, I just wasn't getting a good experience at 4K playing modern shooters. On the 4K TV, what I saw was certainly immersive, but it was just off.

I've always said that the main thing about monitors and gaming is that a better monitor won't make you better, but that's not entirely true. In multiplayer or even in single player, it certainly can. My game improved by switching to G-Sync / 120Hz etc. However, I still don't think a good monitor will make a bad player good or a bad monitor will make a good player suck. It helps, but it's only part of the equation.
 
How would 1080p look and perform on a high-refresh, high-pixel-density monitor? Buying a high-pixel-density monitor basically means buying a 4K (or similar) monitor, right? But if you game at 1080p on a higher-end, less cheap monitor, wouldn't you get most of the detail but better FPS? Asking for a friend.
 
> How would 1080p look and work with a high refresh monitor, high pixel density monitor? I know to buy a high pixel density monitor means buying a 4k or similar monitor right? But if you game at 1080p on a higher end monitor, or nicer/less cheap monitor... wouldn't you get tons of the details, but better fps? I am asking for a friend.
At that point, something that is both high-refresh and high-contrast, along with the closest to 100% Adobe RGB/sRGB coverage, is going to show the most detail. Color accuracy and contrast levels are often overlooked by gamers, but I've seen huge differences when comparing monitors with the same resolution and refresh rate but different contrast/color specs. My dream monitor would be something along the lines of a 32" 3440x1440 / HDR1000 / 144Hz / 100% Adobe RGB.
 