Had this realization that 144Hz screens were the only type of screen I knew of that wasn't a multiple of 60: 60Hz - 120Hz - 240Hz - 360Hz.

And in the middle sits 144Hz. Is there a reason why they all follow this rule of 60, and if so, why does 144Hz exist?

  • JustEnoughDucks@feddit.nl · 25 points · edited · 1 year ago

    72 Hz was used as a refresh rate for CRT monitors back in the day, specifically because it was the average threshold at which users stopped reporting discomfort from CRT flicker. And 144 is 72 * 2.

    It is likely a holdover from that era. I think it also being a multiple of 24 Hz meant movie content scaled smoothly without tearing before vsync? Last part is a guess.
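
    A quick way to sanity-check that guess is to divide the common rates by 24 (my own back-of-the-envelope numbers, not from any spec):

    ```python
    # Which common refresh rates divide evenly by 24 fps film content?
    for hz in (60, 75, 120, 144, 240, 360):
        ratio = hz / 24
        verdict = "even multiple" if ratio.is_integer() else "uneven (needs 3:2-style pulldown)"
        print(f"{hz:3d} Hz = {ratio:5.2f} x 24 fps -> {verdict}")
    ```

    60 and 75 come out uneven, while 120, 144, 240 and 360 are whole multiples, which fits the idea that 144 plays 24 fps content cleanly.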

    • ZephrC@lemm.ee · 10 points · 1 year ago

      Old reel projectors actually flashed their light at 72Hz. They had to turn off the light while advancing the film to the next frame so you couldn't see the picture moving up off the screen, and human eyes are better at spotting quickly flashing lights than they are at spotting microstuttery motion, so flashing the bulb once per frame at 24Hz in a dark room was headache inducing. The solution they came up with was just to flash the bulb 3 times per frame, which is 72Hz.

    • astraeus@programming.dev · 6 points · 1 year ago

      144Hz is not a holdover in the case of computer monitors. It's roughly the maximum refresh rate you can push through dual-link DVI-D at 1080p, which was the only standard that could support that refresh rate when they began producing LCD monitors built to run 144Hz.

        • TheRealKuni@lemmy.world · 9 points · 1 year ago

          Movies stuck with 24, which was good enough and close enough to all the others. They still use this framerate today, which is a joke considering you can get 8K resolution video that still has the frame rate of a lantern show from the last century.

          “But when I saw The Hobbit with 48fps it looked so cheap and fake!”

          😑

            • TheRealKuni@lemmy.world · 8 points · 1 year ago

              Yep! Not the only issue with it, but certainly one of them.

              We also have everyone associating smooth motion with soap operas because of cheap digital television cameras (IIRC).

              I like higher framerates. Sweeping shots and action scenes in 24fps can be so jarring when you’re used to videogames.

            • TheRealKuni@lemmy.world · 2 points · 1 year ago

              Of course it did; Weta had no lead time at all. They had years for the original LotR trilogy. They were set up for failure.

              But unfortunately it ruined the industry perception of 48fps movies for years. To the point that when the new Avatar came out last year they were like “it’s 48fps, but we promise we double up frames for some scenes so it’s only 24fps for those ones, don’t worry!”

      • smallaubergine@kbin.social · 2 points · 1 year ago

        It’s actually 23.976 and yes it’s because of NTSC frame rates. But increasingly things are shot now at a flat 24p since we’re not as tied down to the NTSC framerate these days.
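
        As I understand it, the NTSC-related rates are just the round numbers scaled by 1000/1001:

        ```python
        # NTSC-derived rates: nominal rate * 1000/1001
        for nominal in (24, 30, 60):
            print(f"{nominal} * 1000/1001 = {nominal * 1000 / 1001:.3f}")
        # 23.976, 29.970, 59.940
        ```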

  • Laser@feddit.de · 9 points · edited · 1 year ago

    The reason 60Hz was so prominent has to do with the power line frequency. Screens originated as cathode ray tube (CRT) TVs that were only able to use a single frequency, which was the one chosen by TV networks. They chose the power line frequency because recording under lights powered at the same frequency as the camera minimizes flicker, and you want to play content back at that same frequency for normal viewing.

    This however isn't as important for modern monitors. You have image sources other than video content produced for TV, which benefit from higher rates but don't need to match a multiple of 60. So nowadays manufacturers go as high as their panels allow; my guess is 144 exists because that's 6 * 24Hz (the latter being the "cinematic" frequency). My monitor for example is 75Hz, which is 1.5 * 50Hz, the European power line frequency, but the refresh rate is variable anyways, so it can match whole multiples of the content frequency dynamically if desired.
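
    As a toy sketch of what "matching whole multiples dynamically" could mean (my own illustration, not any particular monitor's logic, assuming a VRR range like 48-75 Hz):

    ```python
    # Toy picker: highest refresh rate inside a VRR range that is a whole
    # multiple of the content frame rate. Purely illustrative.
    def best_refresh(content_fps: float, vrr_min: float, vrr_max: float) -> float | None:
        multiple = int(vrr_max // content_fps)
        while multiple >= 1:
            candidate = content_fps * multiple
            if vrr_min <= candidate <= vrr_max:
                return candidate
            multiple -= 1
        return None

    print(best_refresh(24, 48, 75))  # 72.0 -> 3 x 24 fps film
    print(best_refresh(25, 48, 75))  # 75.0 -> 3 x 25 fps PAL content
    print(best_refresh(30, 48, 75))  # 60.0 -> 2 x 30 fps
    ```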

    • JohnEdwa@sopuli.xyz · 3 points · 1 year ago

      Fun fact, quite a few monitors can be overclocked simply by creating a custom resolution. I have a 32" Thinkvision that officially only supports 1440p at 60Hz, but it's fine running at 70Hz when asked to.
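
      On Linux/X11 the custom-resolution trick can be done with cvt and xrandr, roughly like this (a sketch only: the output name DP-1 is an assumption, check yours with xrandr --query, and the panel may simply blank out if it rejects the mode):

      ```python
      # Sketch: add a 2560x1440 @ 70 Hz custom mode via cvt + xrandr (Linux/X11).
      # "DP-1" is an assumed output name; the monitor may refuse the mode.
      import subprocess

      def add_custom_mode(output: str, width: int, height: int, hz: int) -> None:
          # cvt prints a line like: Modeline "2560x1440_70.00" <clock> <timings...> -hsync +vsync
          cvt = subprocess.run(["cvt", str(width), str(height), str(hz)],
                               capture_output=True, text=True, check=True)
          modeline = next(l for l in cvt.stdout.splitlines() if l.startswith("Modeline"))
          parts = modeline.split()
          name = parts[1].strip('"')
          subprocess.run(["xrandr", "--newmode", name, *parts[2:]], check=True)
          subprocess.run(["xrandr", "--addmode", output, name], check=True)
          subprocess.run(["xrandr", "--output", output, "--mode", name], check=True)

      add_custom_mode("DP-1", 2560, 1440, 70)
      ```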

  • HubertManne@kbin.social · 7 points · 1 year ago

    The numbers are a maximum; software can set the refresh rate lower or split it up. I worked in a visualization lab and we would often mess with the refresh rates. That being said, sometimes you could change it and the screen would not respond (show an image), so there must be some limitations.

      • Overzeetop@kbin.social · 1 point · 1 year ago

        Funny effect, though - many cheap electronics (think coffee makers and microwave ovens) use the line frequency as a time base. Taking a 60Hz or 50Hz appliance and plugging it into the other causes the clock to be off.

        • SanguinePar@lemmy.world · 1 point · 1 year ago

          Huh, now that is interesting - our microwave’s clock continually edges forward until it’s a few minutes out from the oven clock right next to it. I wonder if that’s why. I’m in the UK and as far as I know, all our appliances are too, but maybe not?

          • Overzeetop@kbin.social · 2 points · 1 year ago

            That's probably just fluctuations in the line frequency, plus the two clocks keeping time differently (one might use a crystal that drifts). Being on the "wrong" frequency will have it shift by hours every day. I had a (US/60Hz origin) microwave in my apartment in Bonaire (50Hz) last year that never seemed to have the right time, and when I did the math I realized it was the frequency: it only counted 50/60 of each real day, so it fell behind by ~4 hours every day.
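
            The arithmetic, spelled out (my numbers):

            ```python
            # A clock built for 60 Hz counts 60 mains cycles as one "second".
            # On a 50 Hz supply it only sees 50 cycles per real second,
            # so it runs at 50/60 of real speed.
            real_hours = 24
            shown_hours = real_hours * 50 / 60   # 20 hours displayed per real day
            print(f"behind by {real_hours - shown_hours:.0f} hours per day")  # ~4
            ```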

              • Overzeetop@kbin.social · 2 points · 1 year ago

                Yeah, that's just a shitty (or out of spec) time base. My Seiko watch gains 1-2 minutes a day, but it's completely mechanical, so it depends on temperature and winding/mechanism tension for accuracy. There are electronic timing circuits which are resistance- and capacitance-based, and as the resistance and capacitance of the system drift (with age and temperature), they also drift. A crystal, made to vibrate at high frequency (piezoelectrically, IIRC), will provide a much more stable time base and be accurate to within seconds over many days.

                Interesting aside: timekeeping is how ships at sea used to determine where they were in the ocean. Latitude can be found from the stars, but longitude can't, so it needs a reference time standard. The book Longitude tells the story of the search and the competing methods for determining location prior to the invention of crystal/electronic time bases and modern GPS. I won't say that the storytelling is particularly gripping, but the actual path to discovery is fascinating.

  • astraeus@programming.dev · 3 points · 1 year ago

    60Hz was the original clock rate, determined by the US power line frequency way back in the day. This was 50Hz in some countries.

    With LCD screens, higher refresh rates became easier to achieve. Manufacturers began to advertise 120Hz TVs and monitors, which set a new bar for frame rates. Some advertise 75Hz monitors, which is slightly better than 60Hz on paper; 75Hz is achieved by overclocking standard 60Hz control boards, and most can reach that refresh rate if the manufacturer allows it. Later HDMI standards, DisplayPort, and DVI-D all support this frame rate at least up to 2K.

    144Hz is the same trick as 75Hz, this time with a 120Hz control board. The true standard frame rate is 120Hz; it is clocked higher to achieve 144Hz. Why 144 exactly? Most likely because of the lack of standards that originally supported higher frame rates: dual-link DVI-D was the only one that could push 144Hz at 1080p. Any higher frame rate (or resolution) and the signal would exceed its bandwidth. Now 144Hz is simply a new standard number, and plenty of 1440p monitors are set to this frame rate.
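
    A rough back-of-the-envelope check of that bandwidth limit (my own numbers: I'm assuming reduced-blanking-style timings and the usual 2 x 165 MHz dual-link DVI pixel clock ceiling, so treat the figures as approximate):

    ```python
    # Approximate pixel-clock check against dual-link DVI-D (~330 MHz).
    # Blanking values are assumed reduced-blanking-ish, not exact standard timings.
    DUAL_LINK_DVI_MAX = 2 * 165e6  # pixel clock ceiling in Hz

    def pixel_clock(w, h, hz, h_blank=80, v_blank=31):
        # (active + blanking) pixels per frame, times frames per second
        return (w + h_blank) * (h + v_blank) * hz

    for w, h, hz in ((1920, 1080, 120), (1920, 1080, 144), (1920, 1080, 165), (2560, 1440, 144)):
        clk = pixel_clock(w, h, hz)
        verdict = "fits" if clk <= DUAL_LINK_DVI_MAX else "exceeds dual-link DVI"
        print(f"{w}x{h} @ {hz} Hz: ~{clk / 1e6:.0f} MHz -> {verdict}")
    ```

    With those assumptions, 1080p at 144Hz lands just under the ceiling, while pushing the refresh rate or the resolution any higher overshoots it.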

    • r00ty@kbin.life · 3 points · 1 year ago

      Just to point out: I had 120Hz on a CRT monitor back in the late 90s/early 2000s. The resolution was terrible though (either 640x480 or 800x600). At good resolutions (1024x768 or 1280x960) you were generally stuck with 75 to 90Hz at best.

      60Hz LCD screens were one of the reasons there was resistance among game players to move to LCD. Not to mention earlier units took a VGA input, so the picture quality was usually bad compared to CRT and there was added latency. People buying LCDs did it for the aesthetics when they first became available. Where I worked, for example, only the reception had an LCD screen.

      Also, on a more pedantic point: 50Hz is the power line frequency in the majority of the world.

      • Meho_Nohome@sh.itjust.works · 2 points · edited · 1 year ago

        That proves that the USA is 10 better than the rest of the world.

        (Except American Samoa, Anguilla, Antigua, Aruba, Bahamas, Belize, Bermuda, Brazil, Canada, Cayman Islands, Colombia, Costa Rica, Cuba, Dominican Republic, Ecuador, El Salvador, Guam, Guatemala, Guyana, Haiti, Honduras, South Korea, Mexico, Micronesia, Montserrat Islands, Nicaragua, Okinawa, Palmyra Atoll, Panama, Peru, Philippines, Puerto Rico, St. Kitts & Nevis Islands, Saudi Arabia, Suriname, Tahiti, Taiwan, Trinidad & Tobago, Venezuela, Virgin Islands, and western Japan)

      • astraeus@programming.dev · 1 point · 1 year ago

        On your pedantic point, I can’t argue. However, I can say 60Hz power cycles are what set in stone the 60Hz standard. This is in spite of the fact that a lot of countries didn’t even have 60Hz screens until screen controller clock rates were decoupled from power line frequencies.

    • lukewarmtuna@lemmy.world · 3 points · 1 year ago

      Display refresh rates, measured in hertz (Hz), significantly impact the smoothness of motion on screens. A display with a 60Hz refresh rate updates the image 60 times per second, each update representing a frame. Thus, at its maximum capacity, a 60Hz display shows 60 frames in one second. In contrast, a 360Hz display updates its image 360 times per second, allowing for the potential display of 360 frames in the same duration. This rapid succession of frames results in a markedly smoother visual experience, as the human eye perceives motion more fluidly when more frames are displayed per second.

      Conversely, a display with a lower refresh rate, like 24Hz, refreshes the image just 24 times per second. This lower frequency results in a more ‘choppy’ or stuttered visual experience due to the fewer number of frames presented each second.

      Analogous to a film projector, increasing the frame rate for smoother motion requires the film to move faster through the projector. However, without additional frames in the source material, this would simply speed up the playback. To maintain normal playback speed while achieving a higher frame rate, the source material must contain more frames. For instance, to sustain standard playback speed on a 360Hz display (which is 6 times faster than a 60Hz display), the source needs to provide six times as many frames per second.
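
      Put as numbers (my own figures), the frame interval and the source frames needed if every refresh is to show a new frame:

      ```python
      # Frame interval per refresh rate, and source frames needed per second
      # of video if every refresh shows a new frame.
      for hz in (24, 60, 144, 360):
          print(f"{hz:3d} Hz: a new frame every {1000 / hz:5.2f} ms, "
                f"needs {hz} source frames per second")
      print(f"360 Hz needs {360 // 60}x the frames of 60 Hz content")
      ```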

    • thenewred@lemmy.world · 2 points · 1 year ago

      The unit “hertz” means “per second”. A higher value still refers to the same single second, just with more events happening within it.

    • BilboTBaggin@lemmy.world · 1 point · 1 year ago

      All of the frames in the number (30, 60, 144, 360, etc.) are shown within one second. So at 360Hz you're seeing a new frame every 1/360 = 0.0028 seconds, versus 1/60 = 0.017 seconds at 60Hz, which gives a smoother transition from frame to frame.