There's simply no going back
(startrek.website)
!gaming is a community for everyone from gaming noobs to gaming aficionados. Unlike !games, we don't take ourselves quite as seriously. Shitposts and memes are welcome.
1. Keep it civil.
Attack the argument, not the person. No racism/sexism/bigotry. Good faith argumentation only.
2. No sexism, racism, homophobia, transphobia or any other flavor of bigotry.
I should not need to explain this one.
3. No bots, spam or self-promotion.
Only approved bots, which follow the guidelines for bots set by the instance, are allowed.
4. Try not to repost anything posted within the past month.
Beyond that, go for it. Not everyone is on every site all the time.
Logo uses joystick by liftarn
All you FPS kids are just doing the new version of "eww that game has 2d graphics; polygons or bust!!" from the PlayStation era.
Yes, progress is cool and good, but no, it's not the end-all be-all, and no, not every game has to have bleeding-edge FPS to be good.
Like, we've literally already been through this shit, guys; can't we just learn from the past?
My brother or sister in pixels, this is not the same. I'm not a graphics snob. I still play pixelated, barely discernible nonsense games. When I upgraded from 30 to 144, it was a whole new world. Now even 60 can feel sluggish. This is not a graphical fidelity argument. It's input and response time and motion perception. Open your mind, man. Accept the frames.
And that matters for certain games, a lot. But it doesn't functionally matter at all for others. Same as the transition to polygons. My point, which I thought I stated clearly, was not "FPS BAD!!", it was "FPS generally good, but stop acting like it's the single most important factor in modern gaming."
Simply put, if everything was 144fps then it would be easier on the eyes and motions would feel more natural. Even if it's just navigating menus in a pixel style game.
Real life has infinite frames per second. In a world where high fps gaming becomes the norm, a low 24 fps game could be a great art style and win awards for its 'bold art direction'.
Not really. Real life is as many FPS as your eyes can perceive, which is about 60 (though it can vary somewhat between people). See: https://www.healthline.com/health/human-eye-fps#how-many-fps-do-people-see
That article states people can perceive images displayed for as little as 13 milliseconds, which works out to roughly 77 fps (1000 ms ÷ 13 ms), about 28% higher than 60.
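For anyone who wants to check that math, here's the back-of-the-envelope conversion (just the arithmetic, not anything from the article itself):

```python
# Convert a display time into an equivalent frame rate: fps = 1000 ms / time per frame.
for display_ms in (13, 16.7):  # 13 ms from the article; 16.7 ms is one frame at 60 fps
    print(f"{display_ms} ms per frame ≈ {1000 / display_ms:.1f} fps")

# 13 ms per frame ≈ 76.9 fps
# 16.7 ms per frame ≈ 59.9 fps
```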
Looking at the study itself, they were testing whether participants could pick out a picture that displayed for 13-80 ms when “masked” by other brief pictures, with a focus on whether it made a difference if the participant was told what image they were looking for before or after seeing the images. What they found was that participants could pick out the image as low as the 13 ms mark (albeit with less accuracy) and could generally do so better if told what to look for beforehand.
What this tells me is that your source has nothing to say about anything above roughly 77 fps. It was also testing a fundamentally different situation from a video game, where your brain constantly expects an image similar to, and stemming from, the image before it, rather than a completely different picture. If you were to draw conclusions from the study despite those differences, it would suggest that knowing what to look for, as your brain does while gaming, makes you better at picking out individual frames. That makes me think your source does not support your assertion, and that in a game you could perceive frame rates higher than 77 fps at a minimum.
From my own knowledge, there's also a fundamental difference between perceiving reality and computer screens: motion blur. Objects moving in real life leave a faint blur in your perception that your brain uses to fill in anything it missed, making reality appear smoother than it is. For an example of this, wobble a pencil back and forth to make it "bend." Movies filmed at 24 fps capture this minute motion blur as they film, which makes it easier for our brains to watch them despite the lower frame rate. Real-time rendered video games do not have this effect, as there are no afterimages to fill in the blanks (and turning on motion blur doesn't do a good job of emulating it).
This means video games need to compensate, and the best way to do that is more frames per second, so your brain doesn't have to fill in the blanks with the motion blur it's used to seeing in the real world. You'll obviously get diminishing returns from each successive increase, but there will still be returns.
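To put rough numbers on the diminishing returns (my own back-of-the-envelope figures, not from any source), here's what each jump in refresh rate actually buys you in frame time:

```python
# Frame time (ms) at each refresh rate, and how much each jump saves per frame.
rates = [30, 60, 144, 240]  # common refresh rates in Hz (illustrative only)

for low, high in zip(rates, rates[1:]):
    low_ms, high_ms = 1000 / low, 1000 / high
    print(f"{low} -> {high} Hz: {low_ms:.1f} ms -> {high_ms:.1f} ms "
          f"(saves {low_ms - high_ms:.1f} ms per frame)")

# 30 -> 60 Hz: 33.3 ms -> 16.7 ms (saves 16.7 ms per frame)
# 60 -> 144 Hz: 16.7 ms -> 6.9 ms (saves 9.7 ms per frame)
# 144 -> 240 Hz: 6.9 ms -> 4.2 ms (saves 2.8 ms per frame)
```

Each jump saves less absolute time per frame than the one before it, which is exactly the diminishing-returns curve described above.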
Yeah, as much as I can give a shit about ray tracing or better shadows or whatever, as a budget gamer, frame rate is really fucking me up. I have a very low-end PC, so 60 is basically max. Moving back to 30 on the PS4 honestly feels like I'm playing a PS2. I had the [mis]fortune of hanging out at a friend's house and playing his PC rig with a 40-series card, 240 Hz monitor, etc., and suffice it to say it took a few days before I could get back to playing on my shit without everything feeling broken.
That's more or less the placebo effect at work, though. Most people cannot see "faster" than 60 FPS; the only actual upside of running at a higher frame rate is that you don't drop below 60 if the game starts to lag for whatever reason. Now, you may be one of the few who actually perceive changes better than normal, but for the vast majority, it's more or less just placebo.
You can literally see the difference between 60 and 144 when moving the cursor or a window on your desktop. What are you on about?
That's just wrong. I couldn't go back to my 60Hz phone after getting a 120Hz new one. It's far from placebo, and saying otherwise is demonstrably false.