• rustyfish@lemmy.world

    “Many girls were completely terrified and had tremendous anxiety attacks because they were suffering this in silence,” she told Reuters at the time. “They felt bad and were afraid to tell and be blamed for it.”

    WTF?!

    • sigmaklimgrindset@sopuli.xyz

      Spain is a pretty Catholic country, and even if religious attendance is dropping off, the ingrained beliefs can still remain. The Madonna/whore dichotomy is still very prevalent in certain parts of society there.

      • sam@lemmy.cafe

        pretty Catholic

        I don’t know what led you to believe that, but just look at Wikipedia: only 56% of the population is Catholic, with 37.5% non-practising (and, in my experience as a Spaniard, agnostic) and 16% actually practising.

        https://en.m.wikipedia.org/wiki/Spain

        • sigmaklimgrindset@sopuli.xyz

          You read the first 6 words of my comment and just ignored the rest of it. Tell me why Holy Week is one of the biggest events in Spain even though “only” half the population is Catholic.

          The whole point I was making was that even if people identify as atheists, agnostics, or non-practicing, the remnants of the Catholic mindset and culture remain, including the misogyny inherent to most organized religions.

  • IllNess@infosec.pub

    They are releasing stories like this to promote the new law that requires adults to log in to porn sites and to limit their use of them.

  • AutoTL;DR@lemmings.world [bot]

    This is the best summary I could come up with:


    A court in south-west Spain has sentenced 15 schoolchildren to a year’s probation for creating and spreading AI-generated images of their female peers in a case that prompted a debate on the harmful and abusive uses of deepfake technology.

    Police began investigating the matter last year after parents in the Extremaduran town of Almendralejo reported that faked naked pictures of their daughters were being circulated on WhatsApp groups.

    Each of the defendants was handed a year’s probation and ordered to attend classes on gender and equality awareness, and on the “responsible use of technology”.

    Under Spanish law minors under 14 cannot be charged but their cases are sent to child protection services, which can force them to take part in rehabilitation courses.

    In an interview with the Guardian five months ago, the mother of one of the victims recalled her shock and disbelief when her daughter showed her one of the images.

    “Beyond this particular trial, these facts should make us reflect on the need to educate people about equality between men and women,” the association told the online newspaper ElDiario.es.


    The original article contains 431 words, the summary contains 181 words. Saved 58%. I’m a bot and I’m open source!

    • Zeratul@lemmus.org

      What does this have to do with the equality of men and women? Girls are more at risk of this kind of abuse? That’s a good point, but it’s not brought up here. This parent is trying to make something political out of an issue that simply isn’t. Not that gender equality should be political in the first place.

      • 0laura@lemmy.world

        There’s a good chance that these behaviors originated from misogyny/objectifying women.

  • kibiz0r@midwest.social

    “Absolutely no way to prevent this”, says internet full of banners offering to “Undress your classmates now!”

    “Tools are just tools, and there’s no sense in restricting access to undress_that_ap_chemistry_hottie.exe because it wouldn’t prevent even a single case of abuse and would also destroy every legitimate use of any computer anywhere”, said user deepfake-yiff69@lemmy.dbzer0.com

    • spamfajitas@lemmy.world

      It’s possible I just haven’t come across those types of comments you’re making fun of, but I usually just see people making the case that we don’t need new, possibly overreaching, legislation to handle these situations. They want to avoid a disingenuous “think of the children” kind of situation.

      a youth court in the city of Badajoz said it had convicted the minors of 20 counts of creating child abuse images and 20 counts of offences against their victims’ moral integrity

      I’m not familiar with their legal system but I would be willing to bet the crimes they’ve committed were already illegal under existing laws.

  • Sensitivezombie@lemmy.zip

    Why not also go after these software companies for allowing such images to be generated in the first place, i.e. for allowing AI-generated nude bodies to be produced from uploaded photos of real people?

    • Duamerthrax@lemmy.world

      How? How could you make an algorithm that correctly identifies what nude bodies look like? Tumblr couldn’t differentiate between nudes and sand dunes back when it enforced its new policies.
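
      A minimal sketch (in Python, purely illustrative and not Tumblr’s actual filter) of the kind of naive skin-tone heuristic that produces exactly this dune-vs-nude confusion: sand and skin occupy overlapping colour ranges, so a pixel-counting rule flags both.

        from PIL import Image

        def skin_like(r, g, b):
            # Crude RGB rule of thumb for "skin-ish" colours; desert sand matches it too.
            return (r > 95 and g > 40 and b > 20 and r > g and r > b
                    and (max(r, g, b) - min(r, g, b)) > 15)

        def looks_nsfw(img, threshold=0.4):
            # Flag the image if a large share of its pixels fall in the "skin-ish" range.
            pixels = list(img.convert("RGB").getdata())
            ratio = sum(1 for p in pixels if skin_like(*p)) / len(pixels)
            return ratio > threshold

        # A flat sand-dune-coloured image trips the filter just like skin would.
        dune = Image.new("RGB", (100, 100), (194, 178, 128))
        print(looks_nsfw(dune))  # True: the classic false positive

      Real systems use learned classifiers rather than hand-written rules, but they inherit the same weakness: anything that statistically resembles skin at the pixel level gets swept up.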

    • KairuByte@lemmy.dbzer0.com

      This sounds great, but it’s one of those things that is infinitely easier to say than do. You’re essentially asking for one of two things: Manual human intervention for every single image uploaded, or “the perfect image recognition system.” And honestly, the first is fraught with its own issues, and the second does not exist.
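
      A minimal sketch (hypothetical, not any real platform’s pipeline) of the triage this comment describes: an imperfect automated score blocks or allows the easy cases, and everything in between lands in a human review queue, which is where the cost explodes.

        from dataclasses import dataclass

        @dataclass
        class Upload:
            name: str
            nsfw_score: float  # assumed output of some imperfect classifier, 0.0-1.0

        def triage(upload: Upload, block_at: float = 0.9, allow_below: float = 0.1) -> str:
            if upload.nsfw_score >= block_at:
                return "blocked automatically"
            if upload.nsfw_score <= allow_below:
                return "allowed automatically"
            return "queued for human review"  # the expensive middle bucket

        for u in (Upload("holiday.jpg", 0.05), Upload("dune.jpg", 0.55), Upload("abuse.jpg", 0.97)):
            print(u.name, "->", triage(u))

      With realistic score distributions, the middle bucket dominates, which is why “a human checks every upload” and “the perfect recognition system” end up being the only two honest options.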