• Rhynoplaz
      -6 points · 20 days ago

      If I’m wrong, please help me understand, but this seems excessive. Hundreds of charges over pasting someone’s face on a nude CG body. From what I can tell, they weren’t trying to defame anyone or post the images publicly. (If that’s not what happened, I’d be less sympathetic toward the boys, but for the sake of framing the argument, let’s assume there was no intent to defame anyone or make the images public.)

      I understand the toxic concept of “boys will be boys,” but this situation seems not much different from fantasizing about someone sexually and then drawing the image created in your head. Would THAT count as sexual abuse?

      I would not defend the boys if they took secret photos of the girls, or shared sexting images online, but it’s all pretend. Those aren’t photos of the girls. They are realistic drawings. (Which again, we are assuming were not created to embarrass, blackmail or defame anyone.)

      • subignition
        15 points · 20 days ago

        Commenting without reading the article?

        Hundreds of charges over pasting someone’s face on a nude CG body.

        No, you have it backwards. They took what are implied to be full-body photos of the students and used AI to make them pornographic:

        The two defendants possessed and altered “hundreds of images of Lancaster Country Day School students, which they digitally altered using an artificial intelligence application to make the images appear pornographic,” according to the district attorney’s office.

        1. the classmates were children, so it’s child pornography

        have been charged with 59 counts each of sexual abuse of children and possession of child pornography after allegedly using artificial intelligence to create hundreds of nude images of female classmates.

        2. they were distributing it

        the deepfakes […] were shared on a social platform and have caused turmoil in the private school community. Adams said the defendants had shared the images with each other on the Discord social media platform; in total, police found 347 images and videos. That platform is how the images first came to light in November 2023, when one of the defendants “sent one of the altered images to a different chat room that consisted of other Lancaster Country Day School students, apparently in error,” according to Adams’ office.

        And in addition to that, the school decided not to be proactive in reporting it because they allegedly weren’t required to:

        Despite receiving a tip about the nude images in November 2023, school officials were not required “under the letter of the law to report the incident to ChildLine,” the state’s hotline for suspected child abuse, Adams said in a statement. School officials also never notified police.

        • Rhaedas
          0 points · 19 days ago

          The problem I see here is that some underage boys got caught doing what boys will do with the tech available to them, and now have something on their record, possibly for life. Meanwhile, a few people in the administration quit rather than get penalized for their inaction after the discovery, and the system didn’t get improved to address something that will happen again, because young boys are ignorant of the law and will do things with technology that wasn’t possible at this level even a few years ago.

          I mean, if we’re okay with marking kids who get caught using easily available tech to fantasize as sexual predators, and shrugging with “nothing we could have done”, then that’s how this will continue. I do wonder how the law and the school would typically deal with intercepting a hand-drawn nude that resembled a classmate when it got passed around. Same thing?

          I empathize with the victims of the pictures too, which is why I wish there were a better answer than a criminal mark on the boys and the forced resignation of those who could have done better to confront it. Because I guarantee this is just the case where the kids got caught, not the only case where it’s actively happening. And it will keep happening if parents and teachers aren’t willing to talk about sexual topics with kids. This is just another level of porn, one that is more interactive and customizable.

          • socsa
            4 points · 19 days ago

            Ok, but if there are no consequences for this, it sends the message to every kid in the future that this is fine to do. It sucks to be the example, but it is what it is.

            • Rhaedas
              2 points · 19 days ago

              I didn’t say not to have consequences; I just question which ones were used and whether they even address the problem. Hell, part of the problem in this situation was that there were NO consequences initially, because the adults didn’t want to touch the subject of sex. They should have approached it from a therapeutic point of view instead of first ignoring it and then punishing everyone involved.

              I seriously doubt any kid doing anything like this is going to run across this news and think, crap, I’d better find another outlet for my urges. And as I said before, the tech is out there online and usable on conventional equipment in private, so kids are absolutely trying things out. This is the 21st-century version of pretending Johnny doesn’t have a Playboy hidden under his mattress because you don’t want to talk to him about anything awkward.

        • Rhynoplaz
          -3 points · 20 days ago

          This particular article was paywalled, but I found another one. The difference in phrasing may have given a different impression, but I think you may be reaching on some of your points.

          “hundreds of images of Lancaster Country Day School students, which they digitally altered using an artificial intelligence application to make the images appear pornographic”

          Hundreds? You think they had nude photos of hundreds of students? That’s not plausible. It’s much more likely they pulled photos from the yearbook and told the AI to build porn around them.

          “sent one of the altered images to a different chat room that consisted of other Lancaster Country Day School students, apparently in error”

          That doesn’t sound like intent to distribute; that sounds like someone made a HUGE mistake. Like sending an embarrassing text to the wrong person. They were both playing around with an AI porn generator and showing each other their creations. Someone let the secret slip, and now their lives are ruined.

          • subignition
            5 points · 20 days ago

            The “showing each other their creations” is distribution, regardless of whether it was in private.

            Hundreds? You think they had nude photos of hundreds of students? That’s not plausible.

            Commenting without even reading my entire post? The article literally states “police found 347 images and videos.”

            • Rhynoplaz
              3 points · 19 days ago (edited)

              I’m sorry. While reading that post, I misread and thought they were claiming that the students had ACTUAL NUDE photos of hundreds of students and were using the AI to make them MORE graphic.

              I was arguing that having that many nudes to begin with was implausible.

              I understand that they collected hundreds of publicly available photos and ran them through a porn AI, which resulted in hundreds of nude drawings.

            • AwesomeLowlander
              1 point · 19 days ago

              Not defending anybody here, just gonna touch on a single point. When dealing with AI-generated images, “hundreds of images” can be the work of a single command left to run for an hour. Unlike Photoshopped images, the quantity here is fairly meaningless.

              • subignition
                1 point · 19 days ago

                Not in the eyes of the law it isn’t.

                Separately… we don’t know how much variation there was in the source images. There is a lot of difference between your hypothetical fire-and-forget and the hypothetical on the other end, where the illegal images are mostly made from unique source images.

                It’s all hair-splitting, because at the end of the day, between the accused, their parents, and the environment around them, these kids should have been taught better than to do this.

                • AwesomeLowlander
                  1 point · 19 days ago

                  Yes, I know the law doesn’t care how they were generated. I was more just bringing up a point of consideration in the discussion.

                  Even unique source images don’t mean much. If you have the know-how, it’s one script to scrape the hundreds of images and a second one to modify them all.

                  Again, not defending the kids. I’m just adding a technical perspective to the discussion.

          • @zoostation
            1 point · 20 days ago

            You think they had nude photos of hundreds of students? That’s not plausible.

            Well, sure, you can take any conclusion you want from this article if you freely pick and choose, without evidence, which sentences you think “feel” wrong.

            In the age of social media, you think it’s not trivial to find hundreds of pictures of classmates online? Fuck off if you’re bending over backwards to find a way to defend the deepfake shitheads instead of the victims.

            • Rhynoplaz
              2 points · 19 days ago

              I misread. I do understand that they found hundreds of publicly available photos and turned them into fake nude drawings.

      • @Serinus
        5 points · 19 days ago

        Sorry, but this is a particularly terrible case of kids being interested in kids their own age, and deserves the death penalty. Can we still crucify them?

        This should be a crime, and there are victims. But it’s not on the same level as something like revenge porn or actual CSAM. It’s damaging for the victims, but it’s a hell of a lot less damaging when you can legitimately say, “that’s not me”.

        We need to not be hysterical and find a reasonable middle ground. There need to be consequences, but not ones so harsh that these boys can’t have a normal life.

  • @athairmor
    1 point · 19 days ago

    Private, mostly rich-kid school. It will be interesting to see how this plays out.