• @lepinkainen
    43 days ago

    Yep, it’s a legal “think of the children” requirement. They’ve been doing CSAM scanning for decades already and nobody cared.

    When Apple built a system that required MULTIPLE HUMAN-VERIFIED matches of actual CSAM before even a hint would be sent to the authorities, it was somehow the slippery slope to a surveillance state.

    The stupidest ones were the ones who went “a-ha! I can create a false match with this utter gibberish image!”. Yes, you can do that. Now you’ve inconvenienced a human checker for 3 seconds, and only after the threshold of locally matching images had already been reached. Nobody would EVER have gotten swatted by your false matches.
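    To make the point concrete, here’s a minimal sketch of that kind of threshold gate. This is illustrative only, not Apple’s actual implementation: the `review_queue` function and the event format are invented for the example, and the threshold of 30 matches is an assumption based on Apple’s published figure.

    ```python
    # Illustrative sketch (NOT Apple's real pipeline): nothing reaches a
    # human reviewer until an account accumulates enough hash matches.

    THRESHOLD = 30  # assumption: roughly the figure Apple published


    def review_queue(match_events, threshold=THRESHOLD):
        """Return accounts whose match count crosses the threshold.

        match_events: iterable of (account_id, matched: bool) pairs.
        Only threshold-crossing accounts are queued for human review;
        a lone forged collision stays invisible to everyone.
        """
        counts = {}
        flagged = []
        for account, matched in match_events:
            if not matched:
                continue
            counts[account] = counts.get(account, 0) + 1
            if counts[account] == threshold:  # fires once, at the crossing
                flagged.append(account)
        return flagged


    # One adversarial false match against a victim does nothing:
    events = [("victim", True)] + [("repeat_matcher", True)] * 30
    print(review_queue(events))  # only "repeat_matcher" is queued
    ```

    The design point the comment is making: a single crafted collision never propagates anywhere, and even a threshold-crossing account only produces a short human verification step, not an automatic report.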

    Can people say the same for Google? People get accounts taken down by “AI” or “machine learning” systems with zero recourse, and that’s not a surveillance state?