Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. The company issued a rare, detailed response on Thursday.
The feature had major authoritarian problems.
Nobody is arguing FOR CSAM, but the number of terrible things that have been done “in the name of the children” is huge, and this one risked becoming another.
Once you cross the line into peering into my photos, the possibilities for abuse of that feature are huge. I’d bet that some governments asked them to scan for other content too and Apple didn’t like what they were being asked to do.
In fact, throughout history, these kinds of “for the children” laws and rules have been created under false pretences so often that The Simpsons has a running gag about it.
“Won’t someone please think of the children!!!”
Cool, and what happens when the Chinese government feeds it hashes of Tiananmen Square photos?
Wasn’t yet on the list.
But what are the barriers to them asking and getting on the list?
And why would the government be on the “list of hashes”?
The list contains hashes of images, not keys that authenticate people.
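To make the dispute concrete, here is a minimal sketch of how hash-list matching works in principle. It uses plain SHA-256 purely for illustration; Apple's proposed system actually used NeuralHash, a perceptual hash, combined with private set intersection, and the function and variable names here are hypothetical. The point it demonstrates: whoever supplies the list decides what gets flagged, because the list is just opaque hashes that users cannot audit.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Digest of raw image bytes (an illustrative stand-in for a
    perceptual hash like NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_list(image_bytes: bytes, hash_list: set) -> bool:
    """True if the image's hash appears on the provided list."""
    return image_hash(image_bytes) in hash_list

# Whoever controls hash_list controls what gets flagged; the entries
# are opaque, so nothing distinguishes CSAM hashes from any other
# content a government might add.
known_hashes = {image_hash(b"some flagged image")}
print(matches_list(b"some flagged image", known_hashes))  # True
print(matches_list(b"an unrelated photo", known_hashes))  # False
```

Note that a real perceptual hash also matches near-duplicates (crops, re-encodes), which widens both its usefulness and its potential for abuse compared with the exact matching shown here.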
Your argument doesn’t even make sense, and then you attack my character. Observe the “fallacy fallacy” as well as “ad hominem”: just because you CAN name a fallacy does not make my argument false. Slippery slopes DO exist, even if not everything is a slippery slope. And attacking my character with condescension just distracts from a weak central point.