- cross-posted to:
- [email protected]
‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.
Legal and moral are not the same thing.
Do you also think it’s immoral to do street photography?
I think it’s immoral to do street photography to sexualize the subjects of your photographs. I think it’s immoral to then turn that into pornography of them without their consent. I think it’s weird that you don’t. If you can’t tell the difference between street photography and manipulating photos of people (public or otherwise) into pornography, I can’t fuckin help you.
If you go to a park, take photos of people, then go home and masturbate to them you need to seek professional help.
What’s so moronic about people like you is that you think anyone looking to understand an issue beyond your own current thoughts is clearly a monster, harming people in the worst way you can conjure in your head. The original person saying it’s weird you’re looking for trouble couldn’t have been more dead on.
This is an app that creates nude deepfakes of anyone you want it to. It’s not comparable to street photography in any imaginable way. I don’t have to conjure any monsters, bro. I found one, and they’re indignant about being called out as a monster.
This has been done with Photoshop for decades, and with photo collage for a hundred years before that. Nobody is arguing that it’s not creepy. It’s just that nothing has changed.