• Kbin_space_program@kbin.social · 7 months ago

    It’s a core problem with image-generation models. For some fucking reason they seem to have fed them content from sites that had a lot of porn. Guessing Imgur and DeviantArt.

    Literally the first time I tried to use MS’s image generator, I was out with some friends trying a new fried chicken place, and we were discussing fake Tinder profiles.

    So I thought to try it and make a fake image of “woman sensuously eating fried chicken”.
    Content warning, blah blah blah.

    Try “Man sensuously eating fried chicken”. Works fine.

    We were all mystified by that. I went back a few days later to play around. Tried seeing what it didn’t like. Tried generating “woman relaxing at park”.
    Again, content warning. Switch to a man, no problem. Eventually got it to generate with “woman enjoying sunset in a park.” Got a very dark image, because it generated a completely nude woman T-posing in the dark.

    So, with that in hand I went back and started specifying “fully clothed” for a prompt involving the word “woman”. All of a sudden all of the prompts worked. They fed the bot so much porn that it defaulted women to being nude.

    • Neato@ttrpg.network · 7 months ago

      Lol at t-posing pornography.

      I find the same problem when searching for D&D portraits. Men? Easy and varied. Women? Hypersexualized and mostly naked. I usually have to specify old women to prevent that.

      • sugar_in_your_tea@sh.itjust.works · 7 months ago

        To be fair, D&D was historically a game for neckbeards (at least that was the stigma/stereotype), so hypersexualized women fits the bill.

    • Taako_Tuesday@lemmy.ca · 7 months ago

      Doesn’t it also have to do with the previous requests the model has received? In order for this thing to “learn”, it has to know what people are looking for, so I’ve always imagined the porn problem as a result of people using these things to generate porn at a much greater volume than anything else, especially porn of women. It defaults to nude because that’s what most requests were looking for.

      • TheRealKuni@lemmy.world · 7 months ago

        Nah, most of these generative models don’t account for previous requests. There would be some problems if they did. I read somewhere that including generative AI data in generative AI training has a feedback effect that can ruin models.

        It’s just running a bunch of complicated math against previously trained algorithms.
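        That statelessness can be sketched in a toy example (purely hypothetical, not any real model's API): the "weights" are frozen after training, so each request is a pure function of the prompt, with nothing carried over from earlier requests.

        ```python
        # Toy illustration of stateless inference (not a real model).
        # The "weights" stand in for trained parameters: fixed once
        # training ends, never updated by user requests.
        import hashlib

        FROZEN_WEIGHTS = b"trained-once-then-frozen"

        def generate(prompt: str) -> str:
            # Output depends only on the frozen weights and the prompt,
            # never on any previous call -- there is no request history.
            digest = hashlib.sha256(FROZEN_WEIGHTS + prompt.encode()).hexdigest()
            return digest[:8]

        # Same prompt, same output, no matter what was asked in between:
        a = generate("woman relaxing at park")
        _ = generate("something completely different")
        b = generate("woman relaxing at park")
        assert a == b
        ```

        Any "learning" from user data would only happen if the operator folded those requests into a later training run, which is exactly where the feedback-loop problem mentioned above comes in.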