• roofuskit@lemmy.world · 21 points · 6 months ago

    The algorithms are toxic. Negativity gets more engagement so the algorithms push that content more.

  • stoy@lemmy.zip · 3 points · 6 months ago

      This is too simple; groups of people radicalizing themselves is a very well known phenomenon, one that has existed since humans started forming groups.

      Good examples are the terrorist groups of the Cold War era: Baader-Meinhof, the Japanese Red Army, and similar.

      A group may start out working towards a new political system through peaceful means; then someone starts an informal competition over who is the “best” and most “pure” member, subtly putting other members down for not doing as much as they are.

      Then it becomes a feedback loop, and soon you have a group that condemns its own initial goals as counter-revolutionary and constantly moves the goalposts.

      The algorithm itself doesn’t introduce this behaviour, but it turbocharges it, making people self-radicalize much faster by showing them an endless stream of people telling them how to be even “better” and even more “pure”.