Is more FPS better FPS?

Grimlakin
Forum Posting Supreme
Joined: Jun 24, 2019 · Messages: 8,944 · Points: 113
So I've been around a long time in computer gaming. My first exposure was playing Wing Commander on a 286, then discovering what a sound card could do to make a game better... through umpteen iterations of gaming experiences to where we are today, with a video card as expensive as a full computer with monitor, keyboard, and mouse was back in the day. Heck, even today, considering...

And I have to wonder at it. I'm fine with games that need to push a video card to hit around 100 FPS at my chosen 1440p. What I don't get is people driving their video cards to push out 2x or even 4x the frames their monitor can actually support.

Is there an actual in-game or experience benefit to pushing 400 FPS in an older, less detailed game on a monitor that can only display 150 Hz, as an example?

Personally, if I find games driving unlimited FPS in menus and such, I put on a frame limiter at the driver level. From watching games in recent years, it's those menus and simple scenes that just run all-out that destroy video cards. So I set my limiter to the max supported sync rate of my display, trusting that I'm getting an optimized experience in game.
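
If anyone's curious what a limiter is actually doing, here's the general idea in a few lines of Python. This is just a sketch of the concept, not any vendor's actual driver code; render_frame is a stand-in for whatever the game does per frame.

```python
import time

def render_frame():
    # stand-in for the game's real per-frame work; pretend the scene
    # only takes ~2 ms to draw (a "500 FPS capable" menu, say)
    time.sleep(0.002)

def limited_loop(cap_hz=144, frames=60):
    budget = 1.0 / cap_hz  # ~6.94 ms per frame at 144 Hz
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render_frame()
        left = budget - (time.perf_counter() - t0)
        if left > 0:
            time.sleep(left)  # idle here instead of churning out frames nobody sees
    elapsed = time.perf_counter() - start
    print(f"effective FPS: {frames / elapsed:.0f}")

limited_loop()
```

A real limiter would busy-wait for precision instead of sleep(), but the point is the same: the cap turns wasted frames into idle time, which is exactly why it keeps cards from cooking themselves in menus.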

Yet I get the gut feeling that MANY people are driving far more generated frames than their display can output. So do those frames count? Is there any tangible benefit to pushing more than your display can show you?
 
This depends on how the game is set up. Higher framerates can, for instance, decrease game-world input lag, even though most of those frames are either partially displayed as 'tears' (i.e., four or five 'torn' frames making up one output frame on the monitor) or dropped entirely, as with V-Sync.
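
Quick back-of-the-envelope on the 400 FPS / 150 Hz example from the original post, just to put numbers on it:

```python
fps, hz = 400, 150                # the example numbers from the question above
render_ms = 1000 / fps            # ~2.5 ms between rendered frames
refresh_ms = 1000 / hz            # ~6.7 ms per displayed refresh
slices = fps / hz                 # ~2.7 torn slices per output frame, on average
print(f"render: {render_ms:.1f} ms, refresh: {refresh_ms:.1f} ms, "
      f"~{slices:.1f} frame slices per refresh")
# The newest pixel data on screen is at most ~2.5 ms old instead of ~6.7 ms,
# which is the input-lag win, paid for with tearing.
```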

We also see monitors supporting 500Hz+ refresh rates, and these do have a benefit when they can be driven that fast, which current hardware can manage in a lot of competitive MOBA / FPS games. And it's not about being able to perceive changes in 1/500th of a second or less, but rather the overall smoothness and minimized reaction times that running so fast can enable.

Of course, you have to be playing those games competitively for that to matter. Most of us, I'd think, would rather have something in the 120Hz+ range with higher quality all around.
 
If the monitor can handle it, there absolutely can be a visual benefit, based on some A/B difference demos that I've done at CES a couple of times. The two main ones that stood out:

1. Overwatch - the demo showed fast side-to-side scrolling in the world. It was visibly smoother on the stupid-high-FPS monitor.
2. Counter-Strike - demos where you aim down a small door opening and try to hit someone running across. Reflexes tend to be better on the higher refresh rate, though some of the improvement is probably from... practicing on the slow monitor first...
 
On those old drunk-test reaction timers back when I was in high school, my reaction time was half a second. I wonder what it would be today... off to the web! According to that quick test, on a 144Hz monitor my reaction time is about 165-180 ms. So 0.18 seconds, give or take.

I wonder what reaction times pro gamers have?

Here is the URL I tested on if you're curious.

https://humanbenchmark.com/tests/reactiontime
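
If you'd rather not trust a browser tab, the same test is a few lines of Python (my own quick sketch, not what that site runs):

```python
import random, time

input("Press Enter to arm the test, then hit Enter again when you see GO! ")
time.sleep(random.uniform(2.0, 5.0))  # random delay so you can't anticipate it
print("GO!")
t0 = time.perf_counter()
input()
print(f"reaction time: {(time.perf_counter() - t0) * 1000:.0f} ms")
# Terminal input adds its own latency, so expect numbers a bit above the browser test.
```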
 
I've had my driver set to a 140 FPS frame limit for a long time. The advice I was following may be outdated now, though - I seem to recall you wanted to be just a hair under your refresh rate for G-Sync.
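
For what it's worth, the guidance I remember (Blur Busters, I think, so treat this as a recollection, not gospel) was to cap roughly 3 FPS under refresh so the framerate never collides with the top of the VRR range:

```python
def gsync_cap(refresh_hz, margin=3):
    # rule of thumb: stay a few FPS under the refresh ceiling so VRR stays
    # engaged and the framerate never hands off to V-Sync at the top
    return refresh_hz - margin

for hz in (120, 144, 165, 240):
    print(f"{hz} Hz -> cap at {gsync_cap(hz)} FPS")
```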
 
When CRTs were a thing, each frame lingered and faded due to phosphor decay, so there was a natural motion blur built in. LCDs/OLEDs work differently: they are sample-and-hold, so each frame sits static on screen for the full refresh interval, and as your eye tracks motion, the held image smears. The only real fix is more, shorter frames. So yes, for games, higher framerates genuinely make things feel smoother, look better, and have less input lag, purely from the way the display draws everything. But that means the need isn't to go beyond the refresh rate of your monitor. The need is for higher-refresh-rate monitors to exist, like 500Hz-1000Hz, and for games to run at those refresh rates/framerates; then they will look really good and feel smoother.
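
That sample-and-hold smear is easy to put rough numbers on, assuming the simple model where perceived blur is how far your eye tracks while one frame stays held:

```python
def blur_px(speed_px_per_s, refresh_hz):
    # sample-and-hold: each frame persists for the full refresh interval,
    # so a tracked object smears across (speed * hold time) pixels
    return speed_px_per_s / refresh_hz

for hz in (60, 144, 500, 1000):
    print(f"{hz:>4} Hz: ~{blur_px(1000, hz):.1f} px of smear at 1000 px/s")
```

That's ~16.7 px of smear at 60 Hz down to ~1 px at 1000 Hz for the same pan speed, which is why the 500Hz-1000Hz push isn't crazy.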
 
So overall, you don't want/need framerates higher than your monitor's supported refresh rate?
 
That's my opinion.

If I had a 144Hz display, I'd want all my games to constantly provide 144 FPS, without dips or changes in consistency, in an ideal world.

And I'd argue that you want at least a 120Hz display for gaming today.

Now, I have a 240Hz display, and of course I don't achieve those framerates, because I prefer image quality settings turned to max, with ray tracing. So I'm not hitting that. /shrug 🤷‍♂️
 
Looking at the data and what people are sharing, I just think this pursuit of the next-highest frame number has really been driving the hardware industry (much to the enjoyment of AMD, Intel, Nvidia, and others), yet we've fallen away from the quality side of it all.

Yes, my system will not peg every game I have today at 144 FPS (or pick your monitor's max refresh rate) at my display's native resolution (1440p). But it is very good for what I play and how I play (everything maxed out, as you do, Brent... generally with upscaling tech disabled).

I've realized I need to read into articles/forums to see what actual resolution is being tested/reviewed/commented on. Because most of the time, if it's 4K gaming WITH DLSS/FSR/XeSS turned on... I'm pretty comfortable saying: so, 1440p. Great!
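
For reference, the upscaler scale factors are published, so the real render resolution is easy to work out. The numbers below are the standard DLSS presets; FSR and XeSS use very similar factors:

```python
presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
out_w, out_h = 3840, 2160  # "4K" output resolution
for name, scale in presets.items():
    print(f"{name:>11}: renders ~{round(out_w * scale)}x{round(out_h * scale)}")
# Quality at 4K -> 2560x1440, which is exactly the "so... 1440p" point above.
```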

Kind of why I posted this thread. I think it's important for those who haven't been in DIY gaming for an extended period to understand where some of us 'old' gamers come from in understanding our rigs. We know what our systems can and can't do, and for the vast majority of us, we enjoy the build/design as much as, if not MORE than, the actual games. The games are, much of the time, a fun way to validate the performance of our builds.
 