A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: the easy creation of nonconsensual pornography of ordinary people.

  • @Ultragigagigantic · 48 months ago

    It’s gonna suck no matter what once the technology became available. Perhaps in a bunch of generations there will be a massive cultural shift to something less toxic.

    May as well drink the poison if I’m gonna be immersed in it. Cheers.

    • @VinnyDaCat · 38 months ago

      I was really hoping that with the onset of AI people would be more skeptical of content they see online.

      This was one of the reasons. I don’t think there’s anything we can do to prevent people from acting like this, but what we can do as a society is adjust to it so that it’s not as harmful. I’m still hoping that the eventual onset of it becoming easily accessible and usable will help people to look at all content much more closely.