@corb3t to [email protected] • 1 year ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
cross-posted to: [email protected], [email protected], [email protected]
@[email protected] • 1 year ago
There is a database of known CSAM files and their hashes; Mastodon could implement a filter against those at the posting interaction and when federating content. Shadow banning those users would be nice too.
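A minimal sketch of what that kind of hash check could look like at the upload step, assuming the instance has loaded a vetted hash list locally; the names here are illustrative, and real deployments typically match perceptual hashes (e.g. PhotoDNA or PDQ) rather than SHA-256, since cryptographic hashes only catch byte-identical files:

```python
import hashlib

# Hypothetical hash list, e.g. loaded at startup from a vetted export.
# Real systems use perceptual hashes so re-encoded copies still match.
KNOWN_BAD_HASHES: set[str] = set()


def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of the uploaded media bytes."""
    return hashlib.sha256(data).hexdigest()


def should_block_upload(media_bytes: bytes) -> bool:
    """True if the media matches a known hash and the post/federation should be rejected."""
    return sha256_of(media_bytes) in KNOWN_BAD_HASHES
```

The same check could run on media fetched while federating remote content, with matches flagged for moderator review rather than silently dropped.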
@diffuselight • 1 year ago
They are talking about AI-generated images. That's the volume part.