Last month, a detective in a small town outside of Lancaster, Pennsylvania, invited dozens of high school girls and their parents to the police station to undertake a difficult task: one by one, the girls were asked to confirm that they were depicted in hundreds of AI-generated deepfake pornographic images seized by law enforcement.

In a series of back-to-back private meetings, Detective Laurel Bair of the Susquehanna Regional Police Department slid each image out from under the folder’s cover, so only the girl’s face was shown, unless the families specifically requested to see the entire uncensored image.

“It made me a lot more upset after I saw the pictures because it made them so much more real for me,” one Lancaster victim, now 16, told Forbes. “They’re very graphic and they’re very realistic,” the mother said. “There’s no way someone who didn’t know her wouldn’t think: ‘that’s her naked,’ and that’s the scary part.” There were more than 30 images of her daughter.

The photos were part of a cache of images allegedly taken from 60 girls’ public social media accounts by two teenage boys, who then created 347 AI-generated deepfake pornographic images and videos, according to the Lancaster County District Attorney’s Office. The two boys have now been criminally charged with 59 counts of “sexual abuse of children,” and 59 counts of “possession of child pornography,” among other charges, including “possession of obscene materials depicting a minor.”

  • @orclev
    link
27
17 hours ago

    I think the argument in this case isn’t that a crime wasn’t committed, but rather charging a minor for CSAM possession is inappropriate (particularly when the images are fake). Perhaps a different law needs to be made for these highly specific cases, as the existing CSAM laws typically carry very hefty sentences that don’t seem entirely appropriate in a case like this.

    • @[email protected]
      link
      fedilink
      8
      edit-2
      16 hours ago

Are they fake? They are the faces of real children in sexual and pornographic images.

I agree there should be more specific laws, but this still seems to fall under the current ones to me. These are not fully artificial CSAM, which is fucked up but has no living victim. These are sexual pictures of real children, that just have most of the sexual part generated. That’s much, much closer to full-on CSAM than the above, and falls under the “spirit” of the law, which is to punish people that abuse children for sex. That is what these other children did to these 60 girls.

      • @orclev
        link
15
16 hours ago

        You have to admit there is a pretty fundamental difference between manipulating an otherwise legal image to look like a minor in a sexual act vs an actual photo of that same minor engaged in a sexual act. While both might be considered a crime, the damage to the victim is of a fundamentally different nature. I think there’s a strong argument that the former bears a closer relationship to slander than it does to rape.

        • @[email protected]
          link
          fedilink
          14
          edit-2
          15 hours ago

          I agree the two are different, but not as different as you seem to think. None of these girls were raped, but this is still sexual abuse, especially because these images were shared.

Sexual abuse is complex, and far surpasses “slander,” especially in one’s formative years. This act of sexual abuse is going to change how 60 girls and soon-to-be women respond to sex, likely for the rest of their lives. These images may follow them forever, causing heartache, job loss, on and on, and the damage will be done because this is a form of CSAM of them that is in the world.

          That is not a light matter to be sidelined to a “hand slap” level of offense. I think the fact the perpetrators were also children should play heavily in their defense, but otherwise this needs to be treated as the sexually damaging event it is.

          • @[email protected]
            link
            fedilink
8
14 hours ago

Isn’t all of that still kind of true regardless of the age of the subjects? If they were 18 or 30 it isn’t magically better.

Revenge porn might be a closer analogue. CSAM laws feel like they’ll get loopholed somehow, like I don’t know if you can just ask the AI to make the person aged up or whatever and get away with it.

          • @[email protected]
            link
            fedilink
5
14 hours ago

It does seem like there needs to be a new law specifically addressing this. In the past someone could have cut out the head of a 17 year old and pasted it on top of a Playboy model. That’s an obvious fake, but I don’t think it is the same as what’s happened here. But to a degree there are similarities. Does the ability to detect a fake matter? I don’t know. There are applications that can determine if a picture is AI-generated with some level of confidence. Does that mean only human opinion matters? Again, I don’t know. Certainly there was no abuse at the time the image was taken, so there is a difference between this and CP.

          • @orclev
            link
            3
            edit-2
            14 hours ago

I don’t think this is “hand slap” level, but it also isn’t multiple-decades-behind-bars level, which is what they would be looking at for that quantity of CSAM, particularly for a couple of horny teenagers that likely weren’t even sure what they were doing was illegal. I do think you’re somewhat exaggerating the harm in this case, as fundamentally what was done isn’t much different from something like cutting out photos of these girls’ heads and pasting them into a porn magazine. It’s certainly fancier and more convincing, but at the end of the day that’s what happened: their faces got superimposed on the bodies of porn stars. That likely bothered these girls in the same way the thought of some random creep jerking off to their original photos would, and if the images were widely circulated it could cause some issues down the line (heartache certainly, but job loss certainly not), but if this bothered them enough to alter the way they feel about sex for the rest of their lives, there were already significant mental issues at play.

I honestly don’t know exactly what an appropriate level of punishment would be. My gut says something like 6 months to a year in juvenile detention plus some years of probation. I think a significant amount of weight needs to be given to the fact that these were a couple of teenagers doing something that wasn’t obviously illegal. They cannot and should not be held to the same standards as adults would be, for the same reason statutory rape is a thing: they’re incapable of reasoning about their actions to the same degree as an adult is.

          • @Serinus
            link
            -1
            edit-2
            12 hours ago

            This act of sexual abuse is going to change how 60 girls and soon to be woman respond to sex, likely for the rest of their lives. These images may follow them forever

            No, it’s not. No, it shouldn’t.

            First, it’s so, so much easier to deal with when you have the response of “that’s not me”. Second, it’s current AI. How real do these things even look?

These girls were not sexually abused. Sexual harassment is a more appropriate crime. Maybe libel. Maybe a new crime that we can call “sexual libel” or something.

            • Coskii
              link
              fedilink
-1
59 minutes ago

Current AI for generating sexual images is on the real side of the uncanny valley at this point. If you’re really looking you might be able to tell, but I don’t think most people looking for porn are going to scrutinize anything too closely in the first place… So, real enough.

However, I don’t see how 60 images of what’s effectively a face plastered on a miscellaneous body doing something sexual would follow anyone for anything. Anyone who recognizes them and outs themselves has just admitted to possessing child porn…

              Most people don’t have such unique facial features that would be something that could even follow them in the first place.

              As for the criminal aspect of it, that’s a societal thing to figure out, so here they go figuring it out.