This might beat OpenAI

  • kitnaht@lemmy.world · 133 points · 6 months ago (edited)

    Everyone is abandoning their safety advisory council just before elections… I wonder why…

    • shneancy@lemmy.world · 20 points · 6 months ago

      I’m not much into American politics and have never read a single paragraph of a TOS in my life. Could you please explain?

      • paraphrand@lemmy.world · 29 points · 6 months ago

        It’s because doing the right thing when it comes to algorithmic social spaces means destroying the golden goose.

      • Viking_Hippie@lemmy.world · 27 points · 6 months ago (edited)

        There’s tons of disinformation being spewed onto social media and all other online platforms leading up to an election.

        That disinformation, in the form of paid ads and engagement-driving viral posts, is officially against the rules everywhere, but if the rules aren’t enforced, they’re meaningless.

        TL;DR: less effective enforcement against disinformation = more profits for the platforms

      • Chainweasel@lemmy.world · 10 points · 6 months ago

        To them, misinformation = money.
        By disbanding safety and ethics councils and committees, they are allowing greater volumes of misinformation to flow through their services.
        Misinformation tends to increase closer to elections.

      • TheGalacticVoid@lemm.ee · 3 points · 6 months ago

        None of the other comments explain why misinformation makes money.

        It’s because misinformation, lies by omission, and rage bait all tend to provoke strong emotional responses in people. People then engage with the post via likes, retweets, dislikes, watch time, etc., which tells the algorithm to push the post to an even wider audience. Platforms want to consume as much of people’s time as possible for monetization purposes, so misinformation really helps them (a rough sketch of that feedback loop is below).

        That being said, all of this is pure speculation. If I were to guess why they disbanded their T&S team strictly from the headline, it would be because of incompetence.
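
        The feedback loop described above can be sketched in a few lines of Python. This is purely illustrative: the Post fields, weights, and widen_reach function are invented for the example and are not any real platform’s ranking code; the point is only that a ranker rewarding raw engagement treats an angry reaction the same as an approving one.

        ```python
        # Hypothetical sketch of an engagement-weighted ranker (invented names and
        # weights, not taken from any real platform): every reaction counts as
        # attention, and more attention widens the post's reach on the next pass.
        from dataclasses import dataclass

        @dataclass
        class Post:
            likes: int
            dislikes: int
            shares: int
            watch_seconds: float
            reach: int  # how many feeds the post has been pushed into so far

        def engagement_score(post: Post) -> float:
            # Dislikes and hate-watching count the same direction as likes:
            # the score measures attention, not accuracy.
            return post.likes + post.dislikes + 2.0 * post.shares + post.watch_seconds / 60.0

        def widen_reach(post: Post, base_audience: int = 1_000) -> int:
            # More engagement -> shown to more people -> more engagement next round.
            return post.reach + int(base_audience * engagement_score(post) / 100.0)

        rage_bait = Post(likes=50, dislikes=400, shares=120, watch_seconds=9_000, reach=1_000)
        print(widen_reach(rage_bait))  # reach grows even though most reactions are negative
        ```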

  • MajorHavoc@programming.dev · 41 points · 6 months ago (edited)

    I mean, I keep saying that literally a random person can do many tasks better than the average current-generation AI.

    I should have been more specific. I just meant stuff like knowing how many limbs a human should have…

    • sp3ctr4l@lemmy.zip · 7 points · 6 months ago

      Anybody want to make a petition to get Kwebblekop’s AI persona on the council?

  • Chozo@fedia.io · 19 points · 6 months ago

    Ah cool, now I get to watch Mizkif do unban requests for the whole platform.

  • kbal@fedia.io · 14 points · 6 months ago (edited)

    Not that I think the new team is likely to do a great job, but it can’t get much worse than the past few years of Twitch streamers living in fear of being randomly banned for undisclosed reasons.

    • beefbot · 6 points · 6 months ago

      Please, I beg you, don’t make someone out there say “hold my beer.”