‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @TrickDacy
    English
    6
    5 months ago

    Do you also think it’s immoral to do street photography?

    • @[email protected]
      English
      -3
      5 months ago

      I think it’s immoral to do street photography in order to sexualize the subjects of your photographs. I think it’s immoral to then turn those photos into pornography without the subjects’ consent. I think it’s weird that you don’t. If you can’t tell the difference between street photography and manipulating photos of people (public or otherwise) into pornography, I can’t fuckin help you.

      If you go to a park, take photos of people, and then go home and masturbate to them, you need to seek professional help.

      • @TrickDacy
        English
        2
        5 months ago

        What’s so moronic about people like you is that you assume anyone looking to better understand an issue beyond your own current thinking is clearly a monster harming people in the worst way you can conjure in your head. The original person saying it’s weird that you’re looking for trouble couldn’t have been more dead-on.

        • @[email protected]
          English
          -1
          5 months ago

          This is an app that creates nude deepfakes of anyone you want. It’s not comparable to street photography in any imaginable way. I don’t have to conjure any monsters, bro; I found one, and they’re indignant about being called out as a monster.

          • @PopOfAfrica
            English
            2
            5 months ago

            This has been done with Photoshop for decades, and with photo collage for a hundred years before that. Nobody is arguing that it’s not creepy, just that nothing has changed.