‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma: More than 20% of the staff Meta hired to review the violent content of Facebook and Instagram are on sick leave due to psychological trauma.

  • brsrklf@jlai.lu · 87 points · 1 year ago

    People who are completely desensitized to that kind of stuff would probably not be very good at moderating it.

    Also, this is a terrible job, and I’d be very worried if a company were paying and enabling people who find it fun. It’s horrible, but trauma is the normal outcome.

        • Fisch@lemmy.ml · 17 points · 1 year ago

          Maybe they still have the content that was removed for that reason; you might be able to train an AI on just that. That way it doesn’t need to be checked manually, since it’s already been done.

          • webghost0101@sopuli.xyz · 13 points · 1 year ago (edited)

            Exactly. And even if the uploader disagrees and requests human oversight, that’s just one image that needs to be checked rather than all of them. AI may even be able to blur the most brutal and extreme parts of the footage and create written transcripts of the audio. You don’t need 4K resolution and audible screaming to understand that someone is being murdered or raped.

            • Fisch@lemmy.ml · 1 point · 1 year ago

              True, I didn’t think about that. Blurring and transcription are also far simpler, more reliable tasks, so they should work correctly almost every time.
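The blur-before-review idea discussed above can be sketched in a few lines of pure Python: assume some detector has already flagged a rectangular region of a frame (the detector, the coordinates, and the tiny frame below are all hypothetical), and flatten that region to its mean value so a human reviewer sees only a blank patch instead of the graphic detail. A real pipeline would use proper image/video tooling; this is only an illustration of the principle.

```python
def blur_region(frame, top, left, height, width):
    """Replace a rectangular region of a grayscale frame (a list of
    lists of ints) with the region's mean value, hiding detail while
    preserving overall brightness. Illustrative sketch only."""
    region = [frame[r][left:left + width] for r in range(top, top + height)]
    mean = sum(sum(row) for row in region) // (height * width)
    blurred = [row[:] for row in frame]  # copy so the original frame is untouched
    for r in range(top, top + height):
        for c in range(left, left + width):
            blurred[r][c] = mean
    return blurred

# Example: a 4x4 frame where a (hypothetical) detector flagged the
# top-left 2x2 patch as graphic content.
frame = [
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [90, 100, 110, 120],
    [130, 140, 150, 160],
]
out = blur_region(frame, top=0, left=0, height=2, width=2)
```

After the call, the flagged patch is a uniform block while the rest of the frame is unchanged, which is exactly the property that would let a reviewer confirm a takedown without being exposed to the worst of the material.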

      • brsrklf@jlai.lu · 9 points · 1 year ago

        I’m usually very wary about what should or should not be an AI’s job, and you know what, in this very particular case, I think I agree.

        At least as a first filter, anyway.
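The "first filter" division of labour could be sketched like this: the model handles only the cases it is confident about, and routes the uncertain middle band to a human reviewer. The threshold values and names below are made-up illustrations, not anything Meta is known to use.

```python
def triage(violence_score, auto_remove_at=0.95, auto_pass_at=0.05):
    """Route a piece of content based on a model's violence score in [0, 1].
    Only the uncertain middle band reaches a human, shrinking the volume of
    traumatic material reviewers see. Thresholds are hypothetical."""
    if violence_score >= auto_remove_at:
        return "auto_remove"   # model is confident: take it down
    if violence_score <= auto_pass_at:
        return "auto_pass"     # model is confident: leave it up
    return "human_review"      # uncertain: escalate (ideally blurred first)

# Three example scores: clearly violent, ambiguous, clearly benign.
decisions = [triage(s) for s in (0.99, 0.50, 0.01)]
```

The design choice here is that tightening the two thresholds trades reviewer workload against model error: the wider the middle band, the more a human sees, but the fewer mistakes the automation makes on its own.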

      • ThePrivacyPolicy@lemmy.ca · 4 points · 1 year ago

        Huge industries are emerging in this field right now, for everything from this type of social media moderation to fighting CSAM more effectively, so humans aren’t having to be the front line for that type of material. This is one area where I can really, really get behind AI and see a very valid use case that isn’t just marketing hype like so many others. I know there’s some great stuff happening, just based on my own field of employment and being close to a few things in the works this year.

    • fadingembers · 20 points · 1 year ago

      Honestly, I don’t see an issue with it. If they can tell the difference between an image that should be moderated and one that shouldn’t, they can do the job, and I seriously doubt the vast majority of people desensitized to that kind of content can’t tell the difference. That’s like the argument that we shouldn’t make graphic games or movies because people won’t be able to tell the difference between them and reality. Not everyone can do every job; these people would be the perfect fit for it, and we would spare others from getting hurt.

      • odelik@lemmy.today · 15 points · 1 year ago

        Desensitized doesn’t necessarily mean somebody doesn’t have reactions to something. It just means they can compartmentalize those reactions and move forward and deal with the ramifications later.

        EMTs, ER doctors, and nurses are largely desensitized to graphic trauma and can press through and get the job done. But that doesn’t mean they don’t process those scenes later, in both healthy and unhealthy ways (there are a few studies out there showing that ER staff have higher rates of alcoholism and substance abuse than the general public).

        Trauma is trauma, whether you’re desensitized or not.

      • ParsnipWitch@feddit.de · 7 points · 1 year ago (edited)

        It would be highly unethical but interesting research to see whether those people experience long-term consequences nevertheless, or whether being desensitized really does give someone immunity.

      • brsrklf@jlai.lu · 5 points · 1 year ago

        Except, you know, we’re talking about people who are progressively desensitized to reality. So no, that’s not comparable at all.

      • Mr_Dr_Oink@lemmy.world · 2 points · 1 year ago

        Exactly. If they couldn’t tell the difference, then how could they know which content to seek out for their own enjoyment? It might not affect them much, if at all, anymore, but they know what ‘it’ looks like.

        Can you imagine them watching a cute cat video over and over and wondering why they aren’t getting the rush they must feel when watching gore?

        I remember in the early days of the internet, I clicked a link on a forum and ended up watching a video of some guy being decapitated. I have never forgotten that image, 20+ years later, and I know I would be checking into a mental hospital if I had the job these Facebook staff have had to do. But there are people who like this sort of stuff, and it’s not because they have forgotten what decapitation looks like.