• CH3DD4R_G0B-L1N@sh.itjust.works
    11 months ago

    My brother or sister in pixels, this is not the same. I’m not a graphics snob. I still play pixelated, barely discernible nonsense games. When I updated from 30 to 144, it was a whole new world. Now even 60 can feel sluggish. This is not a graphical fidelity argument. It’s input and response time and motion perception. Open your mind, man. Accept the frames.

    • Vespair@lemm.ee
      11 months ago

      And that matters for certain games, a lot. But it doesn’t functionally matter at all for others. Same as the transition to polygons. My point, which I thought I stated clearly, was not “FPS BAD!!”, it was “FPS generally good, but stop acting like it’s the single most important factor in modern gaming.”

      • And009@reddthat.com
        11 months ago

        Simply put, if everything were 144 fps it would be easier on the eyes and motion would feel more natural. Even if it’s just navigating menus in a pixel-style game.

        Real life has infinite frames per second. In a world where high fps gaming becomes the norm, a low 24 fps game could be a great art style and win awards for its ‘bold art direction’.

          • jaycifer@kbin.social
            11 months ago

            That article states people can perceive images as rapidly as once every 13 milliseconds, which they math out to 75 fps, 25% higher than 60.
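As a sanity check on that arithmetic (a minimal sketch; the function name is my own, not from the article or the study):

```python
# Convert a per-image display interval in milliseconds to the
# equivalent frame rate. Illustrative helper, not from the study.
def interval_ms_to_fps(interval_ms: float) -> float:
    return 1000.0 / interval_ms

print(round(interval_ms_to_fps(13)))  # prints 77
```

Note that 1000/13 rounds to 77, not 75, so the article’s figure is already slightly rounded down.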

            Looking at the study itself, they were testing whether participants could pick out a picture that displayed for 13-80 ms when “masked” by other brief pictures, with a focus on whether it made a difference if the participant was told what image they were looking for before or after seeing the images. What they found was that participants could pick out the image as low as the 13 ms mark (albeit with less accuracy) and could generally do so better if told what to look for beforehand.

            What this tells me is that your source has nothing to say about anything over 75 fps. It was also testing in a fundamentally different environment from a video game, where your brain constantly expects an image similar to, and stemming from, the one before it rather than a completely different image. If you were to draw conclusions from the study despite those differences, it would suggest that knowing what to look for, as your brain does while gaming, makes you better able to pick out individual frames. This makes me think that your source does not support your assertion, and that in a game you could perceive frame rates higher than 75 fps at a minimum.

            From my own knowledge, there’s also a fundamental difference between perceiving reality and a computer screen: motion blur. Objects moving in real life leave a faint blur behind that your brain uses to fill in any blanks it may have missed, making reality appear smoother than it is. For an example of this, wobble a pencil back and forth to make it appear to “bend.” Movies filmed at 24 fps capture this minute motion blur as they film, which makes them easier for our brains to watch despite the lower frame rate. Real-time rendered video games have no such effect, as there are no after-images to fill in the blanks (unless you turn on motion blur, which doesn’t do a good job of emulating this).

            This means video games need to compensate, and the best way to do that is more frames per second so your brain doesn’t need to fill in the blanks with the motion blur it’s used to seeing in the real world. You’ll obviously get diminishing returns from the same increase, but there will still be returns.
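The diminishing-returns point falls out of simple frame-time arithmetic: each equal-sized jump in refresh rate shaves fewer milliseconds off the time each frame sits on screen. A quick sketch (illustrative numbers; the function name is my own):

```python
# Time each frame stays on screen, in milliseconds, at a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

rates = [30, 60, 144, 240]
for lo, hi in zip(rates, rates[1:]):
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    # 30->60 saves 16.7 ms, 60->144 saves 9.7 ms, 144->240 only 2.8 ms
    print(f"{lo} -> {hi} fps saves {saved:.1f} ms per frame")
```

So the step from 30 to 60 fps buys far more per frame than the step from 144 to 240, which matches the intuition that returns diminish but never vanish.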

    • whofearsthenight@lemm.ee
      11 months ago

      Yeah, as much as I can give a shit about ray tracing or better shadows or whatever, as a budget gamer, frame rate is really fucking me up. I have a very low-end PC, so 60 is basically my max. Moving back to 30 on the PS4 honestly feels like I’m playing PS2. I had the [mis]fortune of hanging out at a friend’s house and playing his PC rig with a 40-series card, 240Hz monitor, etc, and suffice it to say it took a few days before I could get back to playing on my shit without everything feeling broken.

    • nevemsenki@lemmy.world
      11 months ago

      Now even 60 can feel sluggish.

      That’s more or less the placebo effect at work, though. Most people cannot see “faster” than 60 FPS; the only actual upside of running at a higher FPS is that you don’t drop below 60 if the game starts to lag for whatever reason. Now, you may be one of the few who actually perceive changes better than normal, but for the vast majority, it’s more or less just placebo.

      • V0lD@lemmy.world
        11 months ago

        That’s more or less the placebo effect at work, though. Most people cannot see “faster” than 60FPS;

        You can literally see the difference between 60 and 144 when moving the cursor or a window on your desktop. What are you on about?

      • CommanderCloon@lemmy.ml
        11 months ago

        That’s just wrong. I couldn’t go back to my 60Hz phone after getting a 120Hz new one. It’s far from placebo, and saying otherwise is demonstrably false.