4K Gaming Is a “Useless” Gimmick Designed to Sell New Tech, Says Playdead Co-founder

How would 1080p look and work on a high refresh rate, high pixel density monitor? I know buying a high pixel density monitor basically means buying a 4K or similar monitor, right? But if you game at 1080p on a higher-end, less cheap monitor... wouldn't you get most of the detail, but better fps? I'm asking for a friend.
I don't have high refresh, but I've often gamed at a lower resolution on a 4K monitor. I haven't compared it side by side with a 1080p monitor running at native resolution, but just comparing 1080p upscaled vs. 4K native, the native resolution is always noticeably sharper.

"Better" though depends on a lot of things - graphics settings and FPS and such play into it. Usually I upscale because some older game can't scale its UI properly, and everything is so small at 4K native that it can't be seen.
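For what it's worth, the raw pixel math is easy to sanity check. A quick Python sketch (my own rough numbers, assuming GPU-bound cost scales with pixel count, which real games only approximate):

Code:
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # pixels per frame at 1080p

for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name}: {w}x{h} = {px:,} pixels ({px / base:.2f}x the work of 1080p)")

# Showing 1080p on a 4K panel is exactly 2x per axis, so clean integer scaling
# is possible; 1440p on a 4K panel is 1.5x per axis and always interpolated.
print("1080p -> 4K per-axis scale:", 3840 / 1920)
print("1440p -> 4K per-axis scale:", 3840 / 2560)

So 4K native is about 4x the pixels of 1080p, which is where the fps goes - and it also happens to be an exact 2x per-axis scale, so 1080p on a 4K panel can look decent if the scaler does integer scaling instead of a soft bilinear stretch.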
 
I have an old ASUS VG248 24" 1080p 144Hz 1ms monitor (you used to be able to swap out the mainboard for a G-Sync compatible unit). I was fiddling around and saw that I could set the resolution to 2560x1440 instead of its recommended 1920x1080. I changed it and it took the setting... :unsure:

Ok .. well .. let's fire up Hunt: Showdown .. Hunt comes up in a smaller window (1080p) .. hmm .. ok .. so I set Hunt to 2k resolution .. it takes it .. wait .. wha?!

Oh my goodness .. it's noticeably clearer .. my fps dropped by about 20, but I'm still over 60 with everything maxed out on my RX 5700 XT. I tried going back to 1080p and I just can't do it .. I'll take the fps hit for the sharper detail. I thought 1080p was pretty nice, but man .. what a difference. Played some Destiny 2 with my daughter (she has a decent little gaming setup in her room) and the difference from 1080p to 2k was very noticeable.

This is my first taste of 2k gaming .. and I'm only on a 24" monitor that I picked up from a buddy for $25 about a year ago... I can only imagine how nice 4K is on bigger setups .. 40" and above? Far from a "useless" gimmick.

I don't know if I'm running true 2k resolution or not, but it sure does look way better/crisper.
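If I had to guess, the Radeon driver is rendering at 2560x1440 and filtering each frame down to the panel's 1920x1080 before it goes out over the cable - AMD calls that feature Virtual Super Resolution. The rough math of what that costs (my numbers, not anything official):

Code:
# My guess at the arithmetic behind GPU downsampling (AMD calls it Virtual
# Super Resolution) - nothing here is confirmed by the driver, just the math.
NATIVE = (1920, 1080)   # the VG248 panel
RENDER = (2560, 1440)   # the resolution selected in Windows

native_px = NATIVE[0] * NATIVE[1]
render_px = RENDER[0] * RENDER[1]

print(f"Panel pixels per frame:    {native_px:,}")
print(f"Rendered pixels per frame: {render_px:,}")
print(f"Extra GPU work: ~{render_px / native_px:.2f}x native 1080p")

# If the GPU were the only bottleneck, fps would scale with that ratio; in
# practice part of the frame time is CPU-bound, so the real hit is smaller.
for fps_1080 in (80, 100, 144):
    print(f"{fps_1080} fps at 1080p -> worst case ~{fps_1080 * native_px / render_px:.0f} fps at 1440p")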
 
Ehhhh, you sure you're not using a "pseudo-resolution"? My Aorus AD27QD is 2560x1440 native, but will "simulate" 4K. It's not real 4K. Even my old QNIX QX2710 would simulate 4K.

I mean, if it works for you, that's cool. I think one of the downsides to those simulated resolutions is horrible response times.
 
I don't think a 1080p monitor can automagically become a 2k monitor .. I just selected 2560x1440 in Windows, and gaming looks way crisper. When I initially launched Hunt: Showdown it even popped up in a smaller (1080p) window until I changed the settings to 2560x1440, so I don't know. Gameplay feels identical .. it just looks way better, like the AA is set to 10x instead of the in-game 2x. I had tried forcing 8x AA in the Radeon software at 1080p; switching back to "let the game decide what AA it runs" and running the display at 1440p looks way better/smoother/crisper, etc.

Whatever is actually going on .. I like it :oops:
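Whatever the exact mechanism, the effect is the same as rendering big and filtering down, and you can fake it on a screenshot. A tiny Pillow sketch - the file names are made up, point it at any 2560x1440 capture:

Code:
from PIL import Image  # pip install Pillow

# "hunt_1440p.png" is a made-up name - use any 2560x1440 screenshot you have.
hi = Image.open("hunt_1440p.png")
native = hi.resize((1920, 1080), Image.LANCZOS)  # filter down to the panel size
native.save("hunt_downsampled_1080p.png")

# Each output pixel is a weighted average of several rendered samples, which
# smooths jagged edges the same way supersampling AA does - hence the
# "looks like higher AA" impression at roughly 1.78x the render cost.
print(hi.size, "->", native.size)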
 
I remember finding out my G500 would do 2048x1600 sometime in the early 00s; one of my buddies had just dumped a bunch of cash on a new monitor that did 1024x768@75Hz, which got me looking up the specs on mine.

I got the G500 for cheap because it had a burn-in pattern from a medical app on it, but I was able to get rid of it over a couple of months by hitting it with a UV light.

The video cards of the day were not happy with it, tho.
My TNT2 had serious issues with full resolution. :)

The HD7970 now using it does not, lol. Its biggest complaint is the VGA adapter.
 
4K has been a gimmick for years in PC gaming unless you wanted to spend large amounts of cash upgrading regularly to drive it. Most people put their money elsewhere or can't afford it; 4K for actual work is fine and understandable. Given the GPU market of the past 5 years, I've not regretted staying at 1080p. I'd like to go to 1440p, but with current-gen cards I don't see a lot of longevity at 1080p, let alone 1440p/2K, from cards that now cost $350+ for decent framerates. 192-bit bus cards with 6GB of VRAM for $250, which were $279-310 six months ago, are not awe inspiring, and they're not really 1440p cards despite what some want to believe.

Also, what caused monitors to only have two inputs? Even some adaptive sync ones come with just one DisplayPort. I'd like to hook up two systems to one screen, and maybe a console too.
 