My parents are both in the far-right misinformation gutter on YouTube, TikTok, Instagram, and maybe some other apps I don’t know about. I recently had the idea to try to surreptitiously deradicalize their algorithms by logging into their accounts and watching content that’ll steer them toward credible information, but I’m not entirely sure what the best approach is. I know that if their feeds suddenly become left-leaning, they’re going to notice something is up. They might even think the platform is trying to do a DEI on them, and may try switching to Truth Social.

Does anyone know if there are resources out there explaining how to do something like this? I’m sure I’m not the first person to have this idea. I don’t use any of these sites/apps myself, so I don’t feel like I know them well enough to come up with a solid plan. I don’t wanna fuck it up lol

  • immortalluna@lemm.ee · 34 points · 4 days ago

    I did it with my grandmother’s Facebook a few years back by blocking most of the worst of the pages that were spewing that crap. Her feed got boring for her, and she switched to reading erotica.

  • Grimy@lemmy.world · 24 points · edited · 5 days ago

    A lot of misinformation is served through ads, so setting up a Pi-hole would go a long way, I think; there’s a quick scripting sketch at the end of this comment.

    https://support.google.com/My-Ad-Center-Help/answer/12155451?hl=en

    You can customize the profile Google keeps on you; I’m not sure whether it’s the same one YouTube uses. YouTube may have a similar section, and you can do the same with TikTok somewhere in its settings. They will notice if you aren’t subtle, so I would do it little by little over a few weeks.
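
    If you want to script the Pi-hole part, here’s a minimal sketch. It assumes Pi-hole v5’s gravity.db layout (an adlist table with address, enabled, and comment columns), and the list URLs are placeholders; swap in whatever blocklists you actually trust, then run `pihole -g` to pull them into the blocklist.

    ```python
    #!/usr/bin/env python3
    """Add extra blocklists to a Pi-hole install (run with sudo).

    Assumes Pi-hole v5's gravity.db schema: an `adlist` table with
    `address`, `enabled`, and `comment` columns.
    """
    import sqlite3

    GRAVITY_DB = "/etc/pihole/gravity.db"  # default Pi-hole v5 location

    # Placeholder list URLs -- replace with blocklists you actually trust.
    EXTRA_LISTS = [
        "https://example.com/ads-and-trackers.txt",
        "https://example.com/misinformation-domains.txt",
    ]

    def add_lists(db_path: str, urls: list[str]) -> None:
        con = sqlite3.connect(db_path)
        with con:  # commits on success, rolls back on error
            for url in urls:
                # INSERT OR IGNORE so re-running the script is harmless.
                con.execute(
                    "INSERT OR IGNORE INTO adlist (address, enabled, comment) "
                    "VALUES (?, 1, ?)",
                    (url, "added by script"),
                )
        con.close()

    if __name__ == "__main__":
        add_lists(GRAVITY_DB, EXTRA_LISTS)
        print("Lists added; run `pihole -g` to rebuild gravity.")
    ```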

  • dormedas@lemmy.dormedas.com · 17 points · 5 days ago

    I would doubt the efficacy of watching stuff on their accounts to try to shift the algorithm. For one, the algorithm appears to select for controversial (and more often right-leaning) content on its own when left unchecked and without significant history to the contrary. Secondly, and this is the part I can’t help with, your parents are choosing that content; that’s the root problem. You can watch all the left-leaning stuff you want, but they’re going to counteract it, and the algorithm will back them up. Which matters more to the algorithm: a small, recent interest in content entirely unrelated to what it’s accustomed to serving, or hundreds of hours of watch time on channels the user is still subscribed to and still watches?

    • LemmyFeed@lemmy.dbzer0.com · 11 points · 5 days ago

      Exactly. Without a change in viewing habits on their part, the algorithm will just drift back to insaneo land.

      I constantly have to battle YouTube’s attempts to serve me right-wing trash.

      • ms264556@beehaw.org · 3 points · 4 days ago

        Sooo this. My YouTube account has had history and tracking turned off since the day I signed up, and all of the promoted content is divisive hysteria, mostly alt-right fascist rage-bait.

    • Zachariah@lemmy.world · 10 points · 4 days ago

      One thing kids like is to be tricked.

      For instance, I was going to take my little nephew to Disneyland, but instead I drove him to an old burned-out warehouse.

      “Oh, no,” I said. “Disneyland burned down.” He cried and cried, but I think that deep down, he thought it was a pretty good joke.

      I started to drive over to the real Disneyland, but it was getting pretty late.