A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.

  • @SendMePhotos
9 months ago

    Is that different than wanking to clothed photos of the same people?

    • @RageAgainstTheRich
9 months ago

The difference is that the image is fake but you can't really see that it's fake. It's so easily created using these tools and can be used to harm people.

The issue isn't that you're jerking off to it. The issue is that it can create fake photos of people in situations that are incredibly difficult to deny ever happened.