TikTok has to face a lawsuit from the mother of 10-year-old Nylah Anderson, who “unintentionally hanged herself” after watching videos of the so-called blackout challenge on her algorithmically curated For You Page (FYP). The “challenge,” according to the suit, encouraged viewers to “choke themselves until passing out.”

TikTok’s algorithmic recommendations on the FYP constitute the platform’s own speech, according to the Third Circuit Court of Appeals. That means it’s something TikTok can be held accountable for in court. Tech platforms are typically protected by a legal shield known as Section 230, which prevents them from being sued over their users’ posts, and a lower court had initially dismissed the suit on those grounds.

  • stardust@lemmy.ca · 4 months ago

    Uhhh… I don’t think you got my point about why I also included Facebook and Twitter at the end, as examples of domestic companies willingly allowing harmful societal trends.

    Money being the reason doesn’t absolve companies or provide a convenient out that lets them do whatever they want without consequence or criticism. I put them all in the camp of willingly selling out a worse society for profit, and whether a country sees that as a win for them or not doesn’t change that.

    • Yoruio@lemmy.ca · 4 months ago

      This is just how capitalism works: you have to appeal to your audience more than your competition does, and guess which kind of content teenagers want to watch more. Hell, even adults want fun content rather than educational content.

      They’re not willingly selling a worse society for profit; that’s just the only way to stay competitive.

      Any platform that pushes educational content in North America would just not get any customers and would go bankrupt.

      edit: there are plenty of educational video platforms out there, like Khan Academy. Try to get your kids to scroll through that during their free time instead; I bet they won’t.

      • stardust@lemmy.ca · 4 months ago

        I know how capitalism works… I was just sharing my thoughts on a company knowingly adjusting its algorithm in a positive direction for one demographic but a negative one for another, which shows a clear awareness of the impact. Not sure why you’re so worked up about TikTok getting criticized too. Whatever.

        • Yoruio@lemmy.ca · 4 months ago

          In the US, publicly traded companies have a legal obligation to make as much money for their shareholders as legally possible (see Ford getting sued by shareholders after giving workers raises). It would be borderline illegal for a company to adjust its algorithm in a way that makes it less competitive.

          This needs to be regulated by the government, not the companies themselves. That way the companies would all be forced to change their algorithms at the same time, without affecting their competitiveness.

          So the government going after TikTok is a good first step, IF it does the same thing to Facebook / Instagram / YouTube / Snapchat. But I’m betting it won’t, because those companies spend an absurd amount of money on lobbying.

          • t3rmit3@beehaw.org · 4 months ago

            This is a false narrative that stock traders push. Fiduciary duty is just one of several duties that executives have, and it does not outweigh the duty to the company’s health or to its employees. Obviously shareholders will try to argue otherwise, or even sue to get their way, because they only care about their own interests, but they won’t prevail in most cases if there was a legitimate business interest and justification for the actions.