Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.

But the artificial intelligence tool used to create the harmful deepfakes is still easily accessible on the internet, promising to “undress any photo” uploaded to the website within seconds.

Now a new effort to shut down the app and others like it is being pursued in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.

  • @[email protected]
    link
    fedilink
    17
    3 months ago

    I can’t imagine the difficulty of resolving this, especially since most of the AI models are available for free use.

    • @teejay
      link
      English
      9
      edit-2
      3 months ago

      Yeah it’s a chimera hydra, similar to illegal movie streaming sites. Unless they solve it at the AI engine level, they’re just chasing ghosts.

      • Æther
        link
        9
        3 months ago

        Do you mean hydra?

        • @teejay
          link
          English
          4
          3 months ago

          Shit, my bad. Yep I meant hydra. I need to brush up on my Greek mythology.

      • @[email protected]
        link
        fedilink
        7
        3 months ago

        You won’t prevent it without (or even with) unacceptable restrictions on free speech. Those models have a right to exist.

        But you can raise the barrier to entry so people will need to run their own service to do it. You’ll make a crazy dent in middle school kids spreading fake nudes of their classmates if they can’t just use a managed online service.

    • @foggy
      link
      1
      3 months ago

      It’s a hydra. It’s effectively impossible.

  • @ikidd
    link
    English
    14
    3 months ago

    This is a dog and pony show. Beyond the unlikelihood of a municipality holding any sway over an internet site, it completely ignores the futility of trying to close the Pandora’s box that is AI imagery. I can download half a dozen deepfake video models that would run on a home server, let alone still-image ones.

    This is about the level of technological savvy I would expect from a city councillor.

    • @[email protected]
      link
      fedilink
      -1
      edit-2
      3 months ago

      Predictions:

      Men: “I didn’t film gay porn, that’s a deepfake.” The issue is dropped.

      Women: “Those videos you’re sharing aren’t real, they use my image without my consent, and I want them removed.” She is called a slut.

      • @ikidd
        link
        English
        -6
        3 months ago

        Oh, quit clutching your pearls.

        • @njm1314
          link
          0
          edit-2
          3 months ago

          Good Lord how naive are you that you don’t think there’s a difference? I mean honestly what fucking world do you live in? I honestly am amazed that someone could still not understand this. Where have you been the last few weeks?

          • @[email protected]
            link
            fedilink
            4
            edit-2
            3 months ago

            Those who enjoy treating women as sexual objects will always be offended by the idea that this exploitation is somehow unfair. Furthermore, they will take great offence at this being pointed out, but will never give an explanation for their offence other than by becoming more offended.

    • @[email protected]
      link
      fedilink
      26
      3 months ago

      Most of the time it’s directed at women. Same as with rape, though guys get raped too. It’s a quiet reality that most people don’t acknowledge. New laws are slowly catching up, but they’re still focused on one side.

    • @[email protected]
      link
      fedilink
      English
      3
      3 months ago

      It’s a lawsuit against the sites, so it’ll cover those too. But I don’t think AI-generated nudes of boys have been a specific problem yet.

    • @njm1314
      link
      1
      3 months ago

      Way to all-lives-matter the topic, buddy

  • @WhatsHerBucket
    link
    5
    3 months ago

    San Francisco, policing the whole internet?

    It’s a Herculean effort, and sadly there’s no way to close the box now, but every bit helps.

  • @xc2215x
    link
    3
    3 months ago

    Good for San Francisco.

  • @[email protected]
    link
    fedilink
    3
    3 months ago

    Oh thank Satan it’s just the women and girls. How else can a gay man with a fetish for twinks in clown costumes have any fun without AI-generated porn?

    (humor)

  • @[email protected]
    link
    fedilink
    1
    3 months ago

    We need a very low barrier to entry for generating gay porn from a single image of a man before this problem will be taken seriously.

  • Media Bias Fact Checker (bot)
    link
    -10
    3 months ago

    Associated Press - News Source Context

    Information for Associated Press:

    MBFC: Left-Center - Credibility: High - Factual Reporting: High - United States of America

    https://apnews.com/article/deepfake-porn-lawsuit-san-francisco-53ff0a8de1fb56cc0743eb8108b18f82
    https://apnews.com/article/generative-ai-illegal-images-child-abuse-3081a81fa79e2a39b67c11201cfd085f

    Media Bias Fact Check | bot support