A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic video footage as law enforcement led McCorkle, still in his work uniform, away from the theater in handcuffs.

  • @emmy67
-1
17 days ago

Are you stupid? Something has to be in the training model for any generation to be possible. This is just a new way to revictimise kids

    • @[email protected]
      3
      edit-2
      17 days ago

So are you suggesting they can get an unaltered facial I.D. of the kids in the images? Because that would make it regular csam with a specific victim (as mentioned), not an AI-generated illustration.

      • @emmy67
-5
17 days ago

        No, I am telling you csam images can’t be generated by an algorithm that hasn’t trained on csam

        • @[email protected]
          5
          edit-2
          17 days ago

          That’s patently false.

I’m not going to continue to entertain this discussion; instead I’m just going to direct you to the multiple other people who have already effectively disproven this argument and similar arguments elsewhere in this post’s discussion. Enjoy.

          • @emmy67
0
17 days ago

Also, if you’d like to see how the corn dog comment is absurd and wrong, go look up my comment.

          • @emmy67
-1
17 days ago

            Sure thing bud. Sure thing 🙄

    • @ameancow
2
15 days ago

      Not necessarily, AI can do wild things with combined attributes.

That said, I do feel very uncomfortable with the amount of defense of this guy: he was distributing this to people. If he were just generating fake images of fake people using legal training data in his own house for his own viewing, that would be a different story. The number of people jumping in front of the bullet for this guy when we don’t really know the details is the larger problem.