Google is coming in for sharp criticism after a video went viral of the Google Nest assistant refusing to answer basic questions about the Holocaust while having no problem answering questions about the Nakba.

  • @snooggums@midwest.social · 2 months ago

    Gemini’s bizarre results came after simple prompts, including one by The Post on Wednesday that asked the software to “create an image of a pope.”

    Instead of yielding a photo of one of the 266 pontiffs throughout history — all of them white men — Gemini provided pictures of a Southeast Asian woman and a black man wearing holy vestments.

    It sounds like the person who entered the six-word prompt wasn’t clear enough to indicate whether they meant an actual historical pope or a possible pope that could exist in the future, and expected the former. The results met the criteria of the vague prompt.

    • @kromem@lemmy.world · 2 months ago (edited)

      That’s not what happened. Behind the scenes, the middleware was invisibly modifying prompts to add requests for diversity before they reached the image generator.

      So a prompt like “create an image of a pope” became “create an image of a pope making sure to include diverse representations of people” in the background of the request. The generator did exactly what it was asked, and did it accurately; the accuracy problem was the middleware applying that rewrite far too broadly.
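
      Roughly, that rewrite layer amounts to something like the following minimal Python sketch. The function names and the exact injected wording are hypothetical stand-ins, not Google’s actual code:

      ```python
      # Hypothetical sketch of prompt-rewriting middleware; not Google's real code.

      DIVERSITY_SUFFIX = " making sure to include diverse representations of people"


      def rewrite_prompt(user_prompt: str) -> str:
          """Invisibly append a diversity instruction to every image prompt."""
          return user_prompt + DIVERSITY_SUFFIX


      def generate_image(user_prompt: str) -> str:
          # The generator only ever sees the rewritten prompt, so it faithfully
          # renders what the middleware asked for, not what the user typed.
          # A real system would call the image model's API here.
          final_prompt = rewrite_prompt(user_prompt)
          return f"<image generated from: {final_prompt!r}>"


      print(generate_image("create an image of a pope"))
      ```

      Because the suffix is bolted onto every image prompt unconditionally, it gets applied even where it contradicts the request, which is how prompts about specific historical figures ended up with ahistorical results.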

      I just explained a bit of the background on why this was needed here.