  • Sekoia · 1 year ago

    In C, 0 is cold, 10 is mild, 20 is comfy, 30 is hot (European climate). And sure, Fahrenheit is more granular. But you can’t actually measure down to the Fahrenheit degree (in weather forecasts, say, and even on a thermometer it’s iffy), so that granularity is useless (and in fact adds noise). Also, having “negative numbers mean it might freeze” is really convenient for weather.
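
    For reference, a minimal sketch (Python, purely illustrative) of where those Celsius landmarks land in Fahrenheit, using the standard conversion F = C × 9/5 + 32:

    ```python
    # Celsius landmarks from above, converted with F = C * 9/5 + 32
    landmarks = [(0, "cold"), (10, "mild"), (20, "comfy"), (30, "hot")]
    for c, label in landmarks:
        f = c * 9 / 5 + 32
        print(f"{c:>2} °C ({label}) = {f:.0f} °F")
    # 0 °C = 32 °F, 10 °C = 50 °F, 20 °C = 68 °F, 30 °C = 86 °F
    ```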

    • Tarcion@sh.itjust.works · 1 year ago

      I can happily confirm the granularity is not useless. I can definitely tell when my house is 73 and when it is 74. And I live where it is very hot. There is a very noticeable difference between like 89 and 91. The individual degrees of difference are pretty useful. The vast majority of terrestrial weather also fits nicely into this very simple 0–100 typical range, and you can still easily summarize by describing weather in the 50s or the 80s, for example. It’s just better imo.

      • Sekoia · 1 year ago

        We don’t have precise temperature control anyway, so that’s a moot point. But does being able to distinguish 1 degree change anything? If you’re warm at 74, you’ll still be warm at 73. You won’t be able to notice the difference outside your home either.

        And that “very noticeable difference” happens to be… 1.1 °C. So 1 °C is a noticeable difference.
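
        A minimal sketch (Python, illustrative) of that arithmetic: a temperature *difference* converts with just the 5/9 factor, no +32 offset.

        ```python
        # A temperature *difference* converts with only the scale factor (5/9)
        delta_f = 91 - 89                  # the "very noticeable" 2 °F gap
        delta_c = delta_f * 5 / 9
        print(f"{delta_f} °F difference ≈ {delta_c:.1f} °C")  # 2 °F ≈ 1.1 °C
        ```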

        You can easily summarize in C by rounding to the nearest 5; that takes little if any extra time. And that kind of summarizing isn’t so useful when “the 30s” in Fahrenheit can mean anything from “roads are icy” to “just cold”.
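
        A quick sketch (Python, illustrative) of why the Fahrenheit “30s” straddle freezing, using C = (F − 32) × 5/9:

        ```python
        # Where the Fahrenheit "30s" land in Celsius: C = (F - 32) * 5/9
        for f in (30, 39):
            c = (f - 32) * 5 / 9
            print(f"{f} °F ≈ {c:.1f} °C")
        # 30 °F ≈ -1.1 °C, 39 °F ≈ 3.9 °C: the range spans the freezing point
        ```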

        Finally, actual temperature is only part of “weather”. Wind and cloud cover have a massive impact on how to dress and how hot/cold it feels.