Had this reflection that 144 Hz screens were the only type of screen I knew of that was not a multiple of 60: 60 Hz, 120 Hz, 240 Hz, 360 Hz...

And in the middle, 144 Hz. Is there a reason why they all follow this rule of 60, and if so, why does 144 Hz exist?

  • MxM111@kbin.social · +27 / -2 · 1 year ago

    I have a 160 Hz screen.

    Also, 144/24 = 6, and 24 fps is the original frame rate of movies. So 160 is more puzzling from this perspective: it is not divisible by 24 or 30.
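
    A quick sketch of that arithmetic (just an illustration), checking the refresh rates mentioned in this thread against common content frame rates:

        # Which refresh rates are an even multiple of typical content frame rates?
        refresh_rates = [60, 75, 120, 144, 160, 240, 360]
        content_fps = [24, 25, 30, 60]

        for hz in refresh_rates:
            matches = [fps for fps in content_fps if hz % fps == 0]
            print(f"{hz} Hz divides evenly by: {matches if matches else 'none'}")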

  • JustEnoughDucks@feddit.nl · +21 / -2 · 1 year ago

    72 Hz was used as a refresh rate for CRT monitors back in the day, specifically because it was the average threshold at which users stopped reporting discomfort from CRT flicker. And 72 * 2 = 144.

    It is likely a holdover from that era. I think from there it was kept a multiple of 24 Hz so movie content scaled smoothly without tearing before vsync? The last part is a guess.

    • ZephrC@lemm.ee · +10 · 1 year ago

      Old reel projectors actually flashed their light at 72 Hz. They had to turn off the light while moving the reel to the next frame so you couldn’t see the picture moving up off the screen, and human eyes are better at spotting quickly flashing lights than they are at spotting microstuttery motion, so flashing the bulb once per frame at 24 Hz in a dark room was headache-inducing. The solution they came up with was just to flash the bulb 3 times per frame, which is 72 Hz.

    • astraeus@programming.dev · +6 · 1 year ago

      144 Hz is not a holdover in the case of computer monitors. It’s the maximum refresh rate you can push through dual-link DVI-D at 1080p, which was the only standard that could carry that refresh rate when manufacturers began producing LCD monitors built to run at 144 Hz.

  • Laser@feddit.de · +7 · 1 year ago

    The reason 60 Hz was so prominent has to do with the power line frequency. Screens originated as cathode ray tube (CRT) TVs that could only use a single frequency, which was the one chosen by TV networks. They chose the power line frequency because recording at the same frequency as the lighting minimizes flicker, and you want to play back at the same frequency for normal content.

    This, however, isn’t as important for modern monitors. You have image sources other than video content produced for TV which benefit from higher rates but don’t need to match a multiple of 60. So nowadays manufacturers go as high as their panels allow; my guess is 144 exists because that’s 6 * 24 Hz (the latter being the “cinematic” frequency). My monitor, for example, is 75 Hz, which is 1.5 * 50 Hz, the European power line frequency, but the refresh rate is variable anyway, so it can match whole multiples of the content frequency dynamically if desired.
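
    A rough sketch of that last point (just an illustration): how evenly 24 fps film maps onto a few refresh rates. An integer ratio means every film frame is shown for the same number of refreshes; a fractional one means uneven cadence (like 3:2 pulldown at 60 Hz) unless the refresh rate is variable.

        # How many refresh cycles does each 24 fps film frame get at a given refresh rate?
        film_fps = 24
        for hz in (60, 75, 120, 144):
            ratio = hz / film_fps
            cadence = "even" if ratio.is_integer() else "uneven"
            print(f"{hz} Hz: {ratio:g} refreshes per film frame ({cadence})")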

    • JohnEdwa@sopuli.xyz · +1 · 1 year ago

      Fun fact: quite a few monitors can be overclocked simply by creating a custom resolution. I have a 32" ThinkVision that officially only supports 1440p at 60 Hz, but it’s fine running at 70 Hz when asked to.

  • HubertManne@kbin.social · +6 / -1 · 1 year ago

    The numbers are a maximum, and software can set them lower or split them up. I worked in a visualization lab and we would often mess with the refresh rates. That being said, you could alter it such that the screen would not respond (show an image), so there must be some limitations.

  • astraeus@programming.dev · +1 / -1 · 1 year ago

    60 Hz was the original refresh rate, determined by the US power line frequency way back in the day. It was 50 Hz in some countries.

    With LCD screens, higher frame rates became easier to achieve. Manufacturers began to advertise 120 Hz TVs and monitors, which set a new bar for frame rates. Some advertise 75 Hz monitors, slightly better than 60 Hz when crunching numbers; 75 Hz is achieved by overclocking standard 60 Hz control boards, and most can reach this refresh rate if they allow it. Later HDMI standards, DisplayPort, and DVI-D support this frame rate at least up to 2K.

    144 Hz is the same trick as 75 Hz, this time with a 120 Hz control board. The true standard frame rate is 120 Hz; it is clocked higher to achieve 144 Hz. Why 144 exactly? This was most likely due to the lack of standards that originally supported higher frame rates: dual-link DVI-D was the only one which could push 144 Hz at 1080p. Any higher frame rate (or resolution) and the signal would exceed its bandwidth. Now 144 Hz is simply a new standard number, and plenty of 1440p monitors are set to this frame rate.
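
    A back-of-the-envelope sketch of that claim (the link limits and the ~10% blanking overhead below are rough assumptions, not exact figures): dual-link DVI-D tops out around a 330 MHz pixel clock (2 x 165 MHz links), while DisplayPort 1.2 carries roughly 720 Mpixels/s at 24-bit color.

        # Approximate pixel rate of each mode (including ~10% blanking overhead)
        # compared against rough per-link pixel-rate ceilings.
        BLANKING = 1.10            # assume ~10% of the signal is blanking intervals
        DUAL_LINK_DVI = 330e6      # ~2 x 165 MHz pixel clock (approximation)
        DISPLAYPORT_1_2 = 720e6    # ~17.28 Gbit/s at 24 bits per pixel (approximation)

        for w, h, hz in [(1920, 1080, 120), (1920, 1080, 144), (2560, 1440, 144)]:
            rate = w * h * hz * BLANKING
            dvi = "fits" if rate <= DUAL_LINK_DVI else "exceeds"
            dp = "fits" if rate <= DISPLAYPORT_1_2 else "exceeds"
            print(f"{w}x{h} @ {hz} Hz: ~{rate / 1e6:.0f} Mpx/s ({dvi} dual-link DVI, {dp} DP 1.2)")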

    • r00ty@kbin.life · +0 · 1 year ago

      Just to point out: I had 120 Hz on a CRT monitor back in the late 90s/early 2000s. The resolution was terrible though (either 640x480 or 800x600). At good resolutions (1024x768 or 1280x960) you were generally stuck with 75 to 90 Hz at best.

      60 Hz LCD screens were one of the reasons there was resistance among gamers to moving to LCD. Not to mention that earlier units took a VGA input, so the picture quality was usually bad compared to CRT and they added latency. People buying LCDs did it for the aesthetics when they first became available. Where I worked, for example, only the reception desk had an LCD screen.

      Also, on a more pedantic point: 50 Hz is the power line frequency in the majority of the world.

      • Meho_Nohome@sh.itjust.works · +1 / -1 · 1 year ago

        That proves that the USA is 10 better than the rest of the world.

        (Except American Samoa, Anguilla, Antigua, Aruba, Bahamas, Belize, Bermuda, Brazil, Canada, Cayman Islands, Colombia, Costa Rica, Cuba, Dominican Republic, Ecuador, El Salvador, Guam, Guatemala, Guyana, Haiti, Honduras, South Korea, Mexico, Micronesia, Montserrat Islands, Nicaragua, Okinawa, Palmyra Atoll, Panama, Peru, Philippines, Puerto Rico, St. Kitts & Nevis Islands, Saudi Arabia, Suriname, Tahiti, Taiwan, Trinidad & Tobago, Venezuela, Virgin Islands, and western Japan)

  • MeanEYE@lemmy.world · +8 / -9 · 1 year ago

    Wait until you find out why 23.976 fps became a standard and still is for most movies. Logical at the time, completely absurd today.