• Rhynoplaz

    This particular article was paywalled, but I found another one. The difference in phrasing may have given a different impression, but I think you may be reaching on some of your points.

    “hundreds of images of Lancaster Country Day School students, which they digitally altered using an artificial intelligence application to make the images appear pornographic,”

    Hundreds? You think they had nude photos of hundreds of students? That’s not plausible. It’s much more likely they pulled photos from the yearbook and told the AI to build porn around them.

    “sent one of the altered images to a different chat room that consisted of other Lancaster Country Day School students, apparently in error,”

    That doesn’t sound like intent to distribute; that sounds like someone made a HUGE mistake. Like sending an embarrassing text to the wrong person. They were both playing around with an AI porn generator and showing each other their creations. Someone let the secret slip, and now their lives are ruined.

    • subignition

      The “showing each other their creations” is distribution regardless of whether or not it was in private.

      Hundreds? You think they had nude photos of hundreds of students? That’s not plausible.

      Commenting without even reading my entire post? The article literally states “police found 347 images and videos.”

      • Rhynoplaz

        I’m sorry. While reading that post, I misread it and thought they were claiming that the students had ACTUAL NUDE photos of hundreds of students and were using the AI to make them MORE graphic.

        I was arguing that having that many nudes to begin with was implausible.

        I understand that they collected hundreds of publicly available photos and ran them through a porn AI, which resulted in hundreds of nude drawings.

      • AwesomeLowlander

        Not defending anybody here, just gonna touch on a single point. When dealing with AI-generated images, ‘hundreds of images’ is the work of a single command left to run for an hour. Unlike Photoshopped images, the quantity here is fairly meaningless.

        • subignition

          Not in the eyes of the law it isn’t.

          Separately… we don’t know how much variation there was in the source images. There is a lot of difference between your hypothetical fire-and-forget command and the other extreme, where the illegal images are mostly derived from unique source images.

          It’s all hair-splitting, because at the end of the day, between the accused, their parents, and the environment around them, these kids should have been taught better than to do this.

          • AwesomeLowlander

            Yes, I know the law doesn’t care how they were generated. I was just raising a point of consideration for the discussion.

            Even unique source images don’t mean much. If you have the know-how, it’s one script to scrape the hundreds of images and a second one to modify them all.

            Again, not defending the kids. I’m just adding a technical perspective to the discussion.

    • @zoostation

      You think they had nude photos of hundreds of students? That’s not plausible.

      Well sure, you can draw any conclusion you want from this article if you freely pick and choose, without evidence, which sentences you think “feel” wrong.

      In the age of social media, you think it’s not trivial to find hundreds of pictures of classmates online? Fuck off if you’re bending over backwards to find a way to defend the deepfake shitheads instead of the victims.

      • Rhynoplaz

        I misread. I do understand that they collected hundreds of publicly available photos and turned them into fake nude drawings.