corb3t@lemmy.world to Technology@lemmy.ml · 1 year ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
cross-posted to: technology@lemmy.world, ITN, social_issues@fedinews.net, fediverse@lemmy.ml, technology@beehaw.org, technews@radiation.party, news_tech@lemmy.link
balls_expert · 1 year ago
There is a database of known CSAM files and their hashes. Mastodon could implement a filter against it at the posting interaction and when federating content. Shadow-banning those users would be nice too.
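The filter described above amounts to a hash-set lookup at upload time. A minimal sketch of the idea (the hash set and function names here are hypothetical; real deployments use perceptual-hash databases such as PhotoDNA/NCMEC via vetted APIs, not a local list of cryptographic digests):

```python
import hashlib

# Hypothetical blocklist of known-bad SHA-256 digests. The single entry is the
# hash of the empty byte string, included purely so the demo below matches.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_match(data: bytes) -> bool:
    """Return True if the upload's hash appears in the blocklist."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# At the posting or federation step, matches would be rejected or quarantined:
print(is_known_match(b""))    # empty payload is in the demo set
print(is_known_match(b"ok"))  # unknown content passes through
```

A cryptographic hash only catches byte-identical files; resized or re-encoded copies evade it, which is why production systems rely on perceptual hashing instead.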
diffuselight@lemmy.world · 1 year ago
They are talking about AI-generated images. That's the volume part.