Godric@lemmy.world to Lemmy Shitpost@lemmy.world · 7 months ago
Ah, Yes! AI Will Surely Save Us All! (image post)
Daxtron2@startrek.website · 7 months ago
How can text ever possibly be CSAM when there's no child or sexual abuse involved?

Jimmyeatsausage@lemmy.world · 7 months ago
I didn't say anything about text?

Daxtron2@startrek.website · 7 months ago
What exactly do you think erotic roleplay means?
Jimmyeatsausage@lemmy.world · 7 months ago
Well, I honestly hadn't considered someone texting with an LLM; I was thinking more about AI-generated images.
weker01@feddit.de · 7 months ago
Text, even completely fictional text, can be CSAM depending on the jurisdiction.
Daxtron2@startrek.website · 7 months ago
I've seen no evidence of that. There are cases tried under obscenity laws, but CSAM has a pretty clear definition of being visual.
weker01@feddit.de · 7 months ago
Internationally? I know that in Germany there are cases.