AI projects like OpenAI’s ChatGPT get part of their savvy from some of the lowest-paid workers in the tech industry—contractors often in poor countries paid small sums to correct chatbots and label images. On Wednesday, 97 African workers who do AI training work or online content moderation for companies like Meta and OpenAI published an open letter to President Biden, demanding that US tech companies stop “systemically abusing and exploiting African workers.”

A typical workday for African tech contractors, the letter says, involves “watching murder and beheadings, child abuse and rape, pornography and bestiality, often for more than 8 hours a day.” Pay is often less than $2 per hour, it says, and workers frequently end up with post-traumatic stress disorder, a well-documented issue among content moderators around the world.

  • stembolts@programming.dev · 6 months ago
    Be grateful to be a slave. A slave is alive. You could be dead. Remember that you exist to create labor but it is someone else’s job to claim the value of that labor.

    Sorry, you were born into the wrong family, now please accept the status quo and do not revolt, rebel, or look into history and how the lower classes have handled power imbalances in the past.

  • Uriel238 [all pronouns] · 6 months ago

    Fuck! Is AI mechanical Turks all the way down?

    It’s a horror story that never ends is what it is.

    • linearchaos@lemmy.world · 6 months ago

      Nah, all the original data came from humans. If it were all good and happy and properly tagged, there’d be no need for intervention.

      Unfortunately they scraped it from wherever in the hell they could get it, and it’s not all tagged correctly.

      I’m sure they use more AI to pre-grade it, but at some point a set of real eyes needs to verify that something is what it’s supposed to be.

      This is more of a blood diamonds or fair trade coffee thing; US legislation isn’t going to have anything to do with it. You need to expose the places using the data.