Why Are High FPS So Important for Gaming?

I've been researching gaming PCs for a few months to upgrade from PS4, and I noticed there is a big emphasis on fps, usually aiming to get above 60fps. Most of the games I've played on PS4 are 30fps, although maybe a few are 60fps. They all feel super-smooth, and I've never thought I needed higher fps. I also watch a lot of YouTube in 30fps, occasionally 60fps, and it also looks very smooth.

Maybe this is based on experience, but would you think 60fps is high enough for most games, or do you think you notice a big difference above 60fps?

Comments

    • All you have to do is move your mouse back and forth rapidly in Windows and notice the gaps between the mouse pointer arrows to know that fps really affects your target tracking. The gaps get smaller when you increase your refresh rate, but they are still pretty disjointed. When we have 1000Hz monitors and video cards, I think that will be the sweet spot for gaming. Only thing is you will need to mortgage your house to buy the NVIDIA 9090 Ti.
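The gap maths here is simple: the distance between successive cursor "ghost" positions is cursor speed divided by refresh rate, so doubling the refresh rate halves the gap. A quick sketch (the 3000 px/s cursor speed is an illustrative assumption, not from the comment):

```python
def pointer_gap_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Pixels the cursor travels between two consecutive screen refreshes."""
    return speed_px_per_s / refresh_hz

# Gap between cursor positions at a fast-ish 3000 px/s swipe:
for hz in (60, 120, 240, 1000):
    print(f"{hz:4d} Hz -> {pointer_gap_px(3000, hz):.1f} px between cursor positions")
```

At 60Hz the pointer jumps 50px between refreshes; at 1000Hz it's 3px, which is why the trail looks nearly continuous.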

    • Frame rate has nothing to do with audio latency; audio buffer size does (but games don't give you control of it).

  • For me, I'm so used to 120Hz from years of phone usage and gaming that going to 60Hz/30Hz is very noticeable and gives me a headache when transitioning. The main thing that gets me is fast camera movements or scrolling on a page; I can see the animation skipping along.

    If I transition to 60Hz, after a while I can adjust to it (e.g. 60fps-capped games), but I really do prefer 120Hz and would mod a game to allow for it where possible, e.g. Elden Ring.

  • Apart from improving reaction time, the less info your brain has to smooth out, the less fatigue, headaches and motion sickness.

  • Any mildly competitive gamer over the age of 30 would have had exposure to high refresh rate CRT monitors in the old days (early 2000s), and once your eyes are trained to that, it literally makes your eyes water and gives you a headache when you go back to 60Hz.

    As a middle aged very casual gamer now I don’t even want high refresh rate for the competitive edge, I want it because I like it and I’m used to it.

    I upgraded my work laptop to a 120Hz display simply because it was the only device I used that wasn't high refresh rate, and it was annoying me.

  • 30 FPS is literally unplayable in so many games…
    I use 300Hz at home, but most games can't do that kind of FPS at 1440p.
    I think for me, above 90FPS things exponentially lose their noticeability, but in other games I can tell, and the higher FPS becomes more important, occasionally netting me a victory royale over an 8-year-old on a 60FPS screen.


    You see this drama more within PC gaming because there are so many variables, like Windows itself, antivirus, background processes and so on, while on a console 30fps means 30fps and 60fps means 60fps.
    You won't notice much difference between 30 and 60fps, but you will notice a big difference jumping to 120fps.

    Even if you say you cannot see it, play a shooting game like Call of Duty at 60fps and then at 120fps; you will easily die 50% less and get 50% more kills because of the extra frames.

    "Maybe this is based on experience, but would you think 60fps is high enough for most games, or do you think you notice a big difference above 60fps?"

    For general games, 60fps is plenty, but if we are talking about shooting games like Call of Duty, Battlefield and the like, you do want 120fps at least or you are dead meat.
    I play my Xbox Series X on my 65" QLED TV; at 4K 60fps I get destroyed, but at 1440p 120fps I get PC players mad, accusing me of aim assist, which I have disabled haha

    • This is a pretty simplistic view. Stable frame rates are achievable on PC (even with background processes running) provided you're well within your hardware spec.

      And I'm pretty sus on doubling your frame rate equating to a doubling of your K:D. 120 is much nicer to play than 60, sure, but there is only so much your brain can process and react to in a second. Generally you have to be pretty close to the skill ceiling of a game for frame/refresh rate (provided it's not terrible) to matter to your actual performance. Most gamers are casual gamers, and it's not going to matter to their competitive performance whether they're playing at 60 or 600FPS.

    • Won't notice much difference between 30 and 60? I remember when GTA 3 came out, and it took me forever to get used to the stuttering due to the low frame rate. There's a huuuuge difference between the two.

  • 30fps = 1 frame per 33ms

    33ms is about the limit of playable latency. I assume 30fps is the minimum to cover all the bits you don't want to miss.
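The conversion above is just the reciprocal: frame time in milliseconds is 1000 divided by the frame rate. A minimal sketch of that arithmetic:

```python
def frame_time_ms(fps: float) -> float:
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# 30fps -> ~33.3ms, 60fps -> ~16.7ms, 120fps -> ~8.3ms
for fps in (30, 60, 120, 144):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

Going from 60fps to 120fps halves the frame time from ~16.7ms to ~8.3ms, which is where the latency argument for high refresh rates comes from.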

  • "Maybe this is based on experience, but would you think 60fps is high enough for most games, or do you think you notice a big difference above 60fps?"

    1. You do not play "most games", you play a very limited selection of games. Some games are so casual they simply don't need a high refresh rate.
    2. Maybe this is based on experience? Definitely; I can no longer go back to 60 for competitive games, and I dislike the 60fps desktop (just office and web browsing) experience.
    3. Would you think 60fps is high enough for most games? I think having 60fps as a standard is arbitrary, based on many historical factors that are less relevant now. Ideally everyone gets as much fps as possible, as cheaply as possible.
    4. Do you think you notice a big difference above 60fps? Yes, even comparing good 60Hz display panels vs dogshit 120Hz panels.

    Video and (competitive) games are not the same; video just plays, and usually has none of the lag, jitter, or image problems that games have.

    We're on OzBargain, so I'd say 120Hz+ is common and cheap nowadays; you should get it where you can, in TVs, phones and monitors. Many TVs interpolate frames to a higher framerate because the display panel is 120Hz.
    There are studies (Blur Busters) showing that human vision is capable of perceiving differences at least up to 1000Hz.

  • As a 40-year-old casual gamer, I can tell you to go for the highest refresh rate you can afford without blowing your budget.

    A few years ago I went from a 60Hz monitor to a 75Hz one, and I can 100% tell the difference in just those 15Hz. I'm now looking at 144Hz, as I don't think my GPU could get much higher at 1440p for the games I play. 30fps is unplayable to me, and 60 is the minimum.

    Some people turn the graphics way down to get higher fps. Stuff that! I want it to look pretty and smooth at the same time.

  • Depends on the game.

    Typically 60fps is fine for 99% of games.

    Even 30fps is fine for some games.

    People want more than 60 for some competitive games.

    It just depends.

    For movies, tv, animation, you actually generally want a low frame rate.

    Animation might need a low frame rate, like 12 or even fewer fps, and movies look awful at anything over 24fps.

    It's interesting

  • They're not, unless you're playing competitively and every small advantage matters.

    I have a 144Hz monitor and have no problem setting my console to Picture mode (or whatever the game calls the opposite of performance mode) and playing at 30fps.
