• phobiac@lemmy.world · 1 year ago

    Yeah, the main lesson I’ve taken away from the last decade of cryptocurrency instability, NFTs, and things like algorithmically generated judicial sentencing guidelines (which perpetuated existing racial biases while making them seem more legitimate, because “the computer can’t be wrong”) is that we should run our whole society with them.

    • sturlabragason@lemmy.world · 1 year ago

      Sure.

      Algocracy uses algorithms to inform societal decisions, while blockchain is a transparent, decentralized ledger system. People often conflate cryptocurrencies with the underlying blockchain technology, even though they serve different purposes.

      Comparing the challenges of Algocracy to the volatility of cryptocurrencies is like assessing the potential of online commerce based on early internet connectivity issues.

      Biases in Algocracy are the result of poor design. With meticulous design and continuous oversight, the potential of Algocracy can be fully realized.

      • floofloof@lemmy.ca · 1 year ago

        > Biases in Algocracy are the result of poor design.

        You can’t design a neutral algorithm. An algorithm has to be designed to optimize something, and deciding what that something should be is a political and philosophical choice. Government by algorithm is indirect government by whoever’s values shaped the design of the algorithm.

        Algorithms can no doubt assist in regulating systems, but they don’t resolve any of the deeper political questions about values, goals, and what constitutes improvement.

        Of course, tech bros will claim they can sell you a neutral algorithm that will run things better than people, but that’s just because tech bros’ political philosophy is basically “just do it my way because obviously I’m smarter than you.” They won’t even notice how their algorithms are biased, because they’re not even interested in that question.