pavnilschanda@lemmy.world to AI Companions@lemmy.world · 9 months ago
[News] Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
cross-posted to: technology@lemmy.world
holycrap@lemm.ee · 9 months ago
Has anyone been able to reproduce this? The guy could have typed all that in a previous prompt.
Yes, word for word