Google is coming in for sharp criticism after a video went viral of the Google Nest assistant refusing to answer basic questions about the Holocaust, while having no problem answering questions about the Nakba.
It sounds like the person who entered the six-word prompt wasn't clear enough to indicate whether they meant "actual historical pope" or "a possible pope that could exist in the future," and expected the former. The results met the criteria of the vague prompt.
That's not what happened. Behind the scenes, the system was invisibly modifying prompts to add requests for diversity.
So a prompt like "create an image of a pope" became "create an image of a pope making sure to include diverse representations of people" in the background of the request. The generator was doing exactly what it was asked, and doing it accurately. The accuracy problem was in the middleware, which was too broad in how it applied the rewrite.
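For illustration, here's a minimal sketch of what that kind of prompt-rewriting middleware might look like. This is purely hypothetical; the keyword list, suffix, and function name are invented for the example and are not from Google's actual pipeline:

```python
# Hypothetical sketch of prompt-rewriting middleware, not Google's real code.
# If a prompt seems to depict people, append a diversity instruction.

PEOPLE_KEYWORDS = ("person", "people", "pope", "king", "soldier")  # invented list
DIVERSITY_SUFFIX = " making sure to include diverse representations of people"

def rewrite_prompt(prompt: str) -> str:
    """Append the diversity instruction when the prompt mentions people."""
    if any(word in prompt.lower() for word in PEOPLE_KEYWORDS):
        return prompt + DIVERSITY_SUFFIX
    return prompt

print(rewrite_prompt("create an image of a pope"))
# -> "create an image of a pope making sure to include diverse
#     representations of people"
```

The crude keyword match is exactly where the over-breadth comes in: a check like this can't tell a generic "person" from a historically specific subject like a pope, so the instruction gets bolted onto prompts where it distorts the result.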
I just explained a bit of the background on why this was needed here.
There's been at least one female pope, so it's not technically wrong.
That's never been definitively proven.
It’s a religious thing. Belief is everything.