@pavnilschanda@lemmy.world (mod) to AI Companions@lemmy.world • 3 months ago
[News] Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
cross-posted to: technology@lemmy.world
@holycrap@lemm.ee • 3 months ago
Has anyone been able to reproduce this? The guy could have typed all that in a previous prompt.
Yes, word for word.