Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.

But the artificial intelligence tool used to create the harmful deepfakes is still easily accessible on the internet, promising to “undress any photo” uploaded to the website within seconds.

Now a new effort to shut down the app and others like it is being pursued in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.

    • @[email protected] · 26 · 1 month ago

      Most of the time it is against females. Same as with rape, but guys get raped too. It is a quiet reality that most people don’t acknowledge. Slowly, new laws are catching up, but they are still focused on one side.

    • @[email protected] · 3 · 1 month ago

      It’s a lawsuit against the sites, so it’ll cover those too. But I don’t think that AI-generated nudes of boys have been a specific problem yet.

    • @njm1314 · 1 · 1 month ago

      Way to “all lives matter” the topic, buddy.