- cross-posted to:
- [email protected]
‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity: It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.
Well yeah, if you share the photo it’s messed up. Is anyone saying otherwise?
So it’s fine to violate someone’s privacy so long as you don’t share it? Weird morals you got there.
Am I violating privacy by picturing women naked?
Because if it’s as cut and dried as you say, then the answer must be yes, and that’s flat-out dumb.
I don’t see this as a privacy issue, and I’m not sure how you’re squeezing it into that. I’m not sure what it is, but you cannot violate a person’s privacy by imagining or coding an image of them. It’s weird and creepy, and because it can be mistaken for a real image, it’s not proper to share.
Can you actually stop clutching pearls for a moment to think this through a little better?
Sexualizing strangers isn’t a right or moral freedom afforded to you by society. That’s a braindead take. You can ABSOLUTELY violate someone’s privacy by coding an image of them. That’s both a moral and a legal question with an answer.
Your comment is a self report.
Projection. Since you have no room for new thoughts in your head, I consider this a block request.
Being accused by you of projection is legitimately high comedy