A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: how easily they can create nonconsensual pornography of ordinary people.

  • @[email protected]
    link
    fedilink
    English
    158 months ago

    Once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves.

    Not saying these images are justified or anything, but wouldn’t people stop caring about them once they reach a critical mass? I mean, if everyone could make fakes like these, I think people would care less, since they could just dismiss them as fakes.

    • @eatthecake · 21 points · 8 months ago

      The analogy given is that it’s like watching a video the next day of yourself having sex without consent, as if you’d been drugged.

      You want a world where people just desensitise themselves, through repeated exposure, to things that make them want to die. I think you’ll get a whole lot of complex PTSD instead.

      • @[email protected]
        link
        fedilink
        English
        218 months ago

        People used to think their lives were over if they were caught alone with someone of the opposite sex they weren’t married to. That is no longer the case in Western countries, due to normalisation.

        The thing that makes them want to die is societal pressure, not the act itself. In this case, if societal pressure from having fake nudes of yourself spread is removed, most of the harm done to people should be neutralised.

        • @too_much_too_soon · 5 points · edited · 8 months ago

          Agreed.

          "I’ve been in HR since '95, so yeah, I’m old, lol. Noticed a shift in how we view old social media posts? Those wild nights you don’t remember but got posted? If they’re at least a decade old, they’re not as big a deal now. But if it was super illegal, immoral, or harmful, you’re still in trouble.

          As for nudes, they can be both the problem and the solution.

          To sum it up, as in the animated movie ‘The Incredibles’: ‘If everyone’s special, then no one is.’ If no image can be trusted, no excuse can be doubted. ‘It wasn’t me’ becomes the go-to, and nobody needs to feel ashamed or suicidal over something fake that happens to many.

          Of course, this oversimplifies things in the real world, but society will adjust. People won’t kill themselves over this. It might even be a good thing for those caught between AI fakes and improper real-world behaviour: ‘It’s not me. It’s clearly AI; I would never behave so outrageously.’

        • @eatthecake · -4 points · 8 months ago

          The thing that makes them want to die is societal pressure, not the act itself.

          That’s an assumption that you have no evidence for. You are deciding what feelings people should have by your own personal rules and completely ignoring the people who are saying this is a violation. What gives you the right to tell people how they are allowed to feel?

    • @[email protected]
      link
      fedilink
      English
      188 months ago

      I think this is realistically the only way forward: to delegitimize any nudes that might show up of a person. Which could be good. But I have no doubt that high schools will be flooded with bullies sending around porn of innocent victims. As much as we delegitimize it as a society, it’ll still have an effect. It’s like social media: even though it’s now normal for anyone to reach you at any time, it still makes cyberbullying more hurtful.

      • @afraid_of_zombies · 1 point · edited · 8 months ago

        Well, if you are sending nudes to someone in high school, you are sending porn to a minor, which I am pretty confident is already illegal. I just would rather not search for that law.