The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan “Undress anybody with our free service!”

  • them@lemmy.world · 1 year ago

    Yes, let’s name the tool in the article so everybody can participate in the abuse

      • DarkThoughts@kbin.social · 1 year ago

        Considering that AI services typically cost money, especially those advertising adult themes, naming it kinda does support the hosts of such services.

        • RaivoKulli@sopuli.xyz · 1 year ago

          Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.

          • DarkThoughts@kbin.social · 1 year ago

            Of course, though that isn’t even the problem; the problem is people using the edited pictures for things like blackmail. From a technical standpoint it isn’t too dissimilar to old-fashioned photoshopping. Face swapping can probably even produce much higher quality results, especially if you have a lot of source material to pull from (you want matching angles for an accurate-looking result). AI-drawn bodies often have severe anatomical issues that make them very obvious and look VERY different from the advertising materials.

          • 30p87@feddit.de · 1 year ago

            True. Especially as just googling ‘undress AI free’ yields tons of results, which may be more or less legit.

    • Rediphile@lemmy.ca · 1 year ago

      You can literally Google ‘AI nude generation tool’ and get multiple results already. And I do sort of agree with you, as I’m not sure how naming this specific tool was necessary or beneficial here. But I don’t think withholding the name would prevent anyone interested in such a tool from finding one. The software/tool itself is (currently) not illegal.