Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
Yes, CSAM is bad.
However, false positives from scans also have the potential to destroy lives. While I wouldn’t cry about Apple losing millions in false-positive lawsuits, it’s simply not a good thing in this case.