• Kairos@lemmy.today
    8 months ago

No, the correct way is to use the proper fucking metric standard. Use Mi or Gi if you need it. We have computers that can divide large numbers now; we don’t need bit shifting.
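A minimal sketch of the distinction being argued over, formatting one byte count with both decimal (SI) and binary (IEC) prefixes. The function name and structure here are illustrative, not any standard API:

```python
def format_bytes(n, binary=False):
    """Format a byte count with SI (kB/MB/GB) or IEC (KiB/MiB/GiB) prefixes."""
    base = 1024 if binary else 1000
    units = ["B", "KiB", "MiB", "GiB", "TiB"] if binary else ["B", "kB", "MB", "GB", "TB"]
    value = float(n)
    for unit in units:
        if value < base or unit == units[-1]:
            return f"{value:.2f} {unit}"
        value /= base

# The same count of bytes, two spellings:
print(format_bytes(1_000_000_000))               # 1.00 GB
print(format_bytes(1_000_000_000, binary=True))  # 953.67 MiB
```

Note that a round decimal number is deliberately ugly in binary prefixes, and vice versa — which is the whole disagreement.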

    • nybble41@programming.dev
      8 months ago

      The metric standard is to measure information in bits.

      Bytes are a non-metric unit. Not a power-of-ten multiple of the metric base unit for information, the bit.

      If you’re writing “1 million bytes” and not “8 million bits” then you’re not using metric.

      If you aren’t using metric then the metric prefix definitions don’t apply.

      There is plenty of precedent for the prefixes used in metric to refer to something other than an exact power of 1000 when not combined with a metric base unit. A microcomputer is not one one-thousandth of a computer. One thousand microscopes do not add up to one scope. Megastructures are not exactly one million times the size of ordinary structures. Etc.

Finally: this isn’t primarily about bit shifting. It’s about computers being based on binary representation, and the fact that memory addresses are stored and communicated using whole numbers of bits, which naturally leads to memory sizes (for entire memory devices or smaller structures) that are powers of two. Though the fact that no one is going to do something as idiotic as introducing an expensive and completely unnecessary division by a power of ten on every memory access, just so you can have 1000-byte MMU pages rather than 4096-byte ones, also plays a part.
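The MMU-page point can be made concrete: with a power-of-two page size, splitting an address into page number and offset is a shift and a mask, while a hypothetical 1000-byte page would need a genuine division. A sketch (4096-byte pages assumed, as in the comment; the names are illustrative):

```python
PAGE_SIZE = 4096   # 2**12, the common page size mentioned above
PAGE_SHIFT = 12

def split_address(addr):
    """Split an address into (page number, offset) using only shift and mask."""
    return addr >> PAGE_SHIFT, addr & (PAGE_SIZE - 1)

def split_address_decimal(addr, page_size=1000):
    """The hypothetical 1000-byte page: requires a real division and modulo."""
    return addr // page_size, addr % page_size

addr = 0xDEADBEEF
print(split_address(addr))          # (912091, 3823)
print(split_address_decimal(addr))  # (3735928, 559)
```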

      • Kairos@lemmy.today
        8 months ago

        If you aren’t using metric then the metric prefix definitions don’t apply.

Yes it does, wtf?

      • firefly@neon.nightbulb.net
        8 months ago

        The metric system is fascist. It was invented by aristocratic elitist control freaks. It is arbitrary and totalitarian.

        https://archive.ph/EB5Qu

        “The colorfulness and descriptiveness of the imperial system is due to the fact that it is rooted in imagery and analogies that make intuitive sense.”

        I’ll save my own rant until after I’ve seen the zombies froth.

        • firefly@neon.nightbulb.net
          8 months ago

The metre is a French fascist measurement made by the court jester.

          “Since 2019 the metre has been defined as the length of the path travelled by light in vacuum during a time interval of
          1/299792458 of a second …” [Wikipedia]

          What is wrong with this definition?

          The metre claims to be a ‘non-imperial’ basis of measurement.

But the basis of the metre is the imperial, or ephemeris, second, which is the ultimate imperial measurement. Seconds are an imperial unit. The measurement of time is fundamental to the ruler … get it?

          So the arbitrarily devised metre is founded upon the imperial second. Oops. Now why again did you say the metric system is ‘superior’ to the imperial system?

          Metric supremacists are fascist rubes who don’t realize they were pwnd by the empire before their rebellion even had a name or gang sign. They wanted to overthrow the king and based their coup on the king’s fundamental unit of regal measurement: time. Oops. This is a case of killing the baby in the cradle.

Imperial units of measurement are based upon things found in nature. The second is a division of the solar and astronomical day. A second is 1/86400th of a day, and is based again on sexagesimal math, which is found EVERYWHERE in nature.

          Every good programmer should already know where this is going.

Day: 86400 seconds.
Day: 24 hours.
Hour: 3600 seconds (60 squared).
Minute: 60 seconds.
60 seconds * 60 minutes = 3600 seconds = 1 hour.
3600 seconds * 24 hours = 86400 seconds = 1 day.
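The breakdown above is just arithmetic; a quick sketch confirming the numbers:

```python
SECONDS_PER_MINUTE = 60
MINUTES_PER_HOUR = 60
HOURS_PER_DAY = 24

seconds_per_hour = SECONDS_PER_MINUTE * MINUTES_PER_HOUR  # 3600, i.e. 60 squared
seconds_per_day = seconds_per_hour * HOURS_PER_DAY        # 86400

print(seconds_per_hour, seconds_per_day)  # 3600 86400
```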

          There is nothing arbitrary about this. The imperial measurement is neatly aligned to solar and astronomical cycles and to the latitudes and longitudes of the earth. In short, the imperial system of measurement had already measured the equatorial and tropical circuits of the earth and the sun’s path over 3000 years ago, and based measurements upon that.

          Then along came the metric aristocrats, who pretended this had never been done before, speculated a _false_ circumference of the earth, and came up with a flawed metre based on that false measurement, then changed it decades later to the distance traveled by light in an imperial second, unaware that no constant speed of light has yet been proved conclusively, but only assumed.

          Whereas the imperial system is based upon measurements which have been observed unchanged, verifiable, and reproducible, FOR THOUSANDS OF YEARS.

Tell me again why the metric system is ‘superior’?

The metre is merely a speculation and the so-called speed of light has NOT been conclusively proven, considering special relativity and all that other aristocratic bollocks. Also complicating the matter is the specific definition of “light traveling in a vacuum.” OK, sparky, how are you going to locate a laboratory in a vacuum at least 1 light second in length to conduct this experimental measurement and prove it?

This fallacy is called an ‘unfalsifiable’ claim. Yup, the metric system is based upon a pseudo-scientific conjecture and fallacy. Whereas the imperial system is based upon thousands of years of repeatable observation. And yet ‘scientists’ somehow are confused about the reality of the situation.

          As I’ve said elsewhere, worldwide science and academia have been growing progressively more delusional for the past couple of centuries.

          In the end the aristocrats will bow to the king they hate. Thank God Americans have refused to bow to this dumb idol. Stay strong Murrikanz.

          Here’s a shout out to the limeys who still weigh in stones! Long live the king’s foot!

          academicchatter@a.gup.pe

    • PowerCrazy@lemmy.ml
      8 months ago

Hey, how is “bit shifting” different from division? (The answer may surprise you.)

        • PowerCrazy@lemmy.ml
          8 months ago

Interesting, so does the computer have a special “base 10” ALU that somehow implements division without bit shifting?

          • nybble41@programming.dev
            8 months ago

            In general integer division is implemented using a form of long division, in binary. There is no base-10 arithmetic involved. It’s a relatively expensive operation which usually requires multiple clock cycles to complete, whereas dividing by a power of two (“bit shifting”) is trivial and can be done in hardware simply by routing the signals appropriately, without any logic gates.
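A sketch of the difference: binary long division (roughly the shift-and-subtract loop a simple hardware divider iterates over many cycles) versus a single shift for a power-of-two divisor. This is an illustration, not any particular CPU’s divider:

```python
def divide(dividend, divisor):
    """Unsigned integer division by binary long division (shift and subtract)."""
    assert divisor > 0
    quotient, remainder = 0, 0
    for i in reversed(range(dividend.bit_length())):
        remainder = (remainder << 1) | ((dividend >> i) & 1)  # bring down next bit
        quotient <<= 1
        if remainder >= divisor:       # compare and conditionally subtract
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

print(divide(1_000_000, 1000))  # (1000, 0) — one iteration per bit of the dividend
print(1_000_000 >> 10)          # 976 — dividing by 1024 is a single shift
```

The loop does use shifts internally, but each step also needs a full-width compare and subtract; the power-of-two case needs none of that, which is why it is essentially free in hardware.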

            • PowerCrazy@lemmy.ml
              8 months ago

              In general integer division is implemented using a form of long division, in binary.

The point of my comment is that division in binary IS bit shifting. There is no other way to do it if you want the real answer. You can estimate, you can round, but the computational method of division is done via bit shifting of binary expansions of numbers in an ALU.

    • ursakhiin@beehaw.org
      8 months ago

      This is such a weird take to me. We don’t even colloquially discuss computer storage in terms of 1000.

The Greek prefixes were used from the beginning of computing; the kibi and mebi (etc.) terms were only added in 1998, when members of the IEC got upset. But despite that, most personal computers still report sizes the binary way. Decimal is only used on boxes as a marketing term.
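The marketing point in numbers: a drive sold as “1 TB” (decimal) shows up smaller in an OS that reports binary units. A quick sketch:

```python
marketed_bytes = 1 * 1000**4        # "1 TB" as printed on the box
in_tib = marketed_bytes / 1024**4   # what a binary-reporting OS displays
print(f"{in_tib:.2f} TiB")          # 0.91 TiB
```

Same number of bytes either way; only the prefix convention differs.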