Why does it matter? If they share them it’s obviously bad, but if they keep them to themselves it harms no one.
It wouldn't be news if they didn't share them.
If they didn't, nobody would know and nobody could care.
Outside of the invasion of privacy, they are at a high school. There's a high chance that the material is CP, even if fake.
Well, apparently it's not CSAM according to the law.
Also, putting your public face on an AI body has nothing to do with privacy.
If anything it’s more akin to trademark or copyright on your own likeness.
Idk, it's all weird and fucked up, but CSAM and privacy-violating it's not.
It’s possible that other New Jersey laws, like those prohibiting harassment or the distribution of child sexual abuse materials, could apply in this case. In April, New York sentenced a 22-year-old man, Patrick Carey, to six months in jail and 10 years of probation “for sharing sexually explicit ‘deepfaked’ images of more than a dozen underage women on a pornographic website and posting personal identifying information of many of the women, encouraging website users to harass and threaten them with sexual violence.” Carey was found to have violated several laws prohibiting harassment, stalking, child endangerment, and “promotion of a child sexual performance,” but at the time, the county district attorney, Anne T. Donnelly, recognized that laws were still lacking to truly protect victims of deepfake porn.
Like the other commenter said, obviously they weren't just keeping them to themselves, since people found out.
Additionally, condoning actions like this contributes to the objectification of women, which should be socially condemned.