A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how readily generative AI can be put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • @[email protected] · 11 points · 2 months ago

    TV and movies should not be able to show crimes, because images depicting crimes should be illegal.

    (I’m just illustrating the slippery slope criminalizing “artistic” renderings)

    • @aesthelete · -6 points · edited · 2 months ago

      I’m not advocating for what you’re saying here at all.

      So there you go, your slope now has gravel on it.

      EDIT: This dude was arrested using today’s laws, and I’m pretty sure the series Dexter is still legal to write, direct, film, and view. So your slippery slope is a fallacious one (as most of them tend to be in my experience).

        • @aesthelete · -5 points · 2 months ago

          Sure, with some exceptions and reasonable definitions.

            • @aesthelete · -6 points · edited · 2 months ago

              Never said, wrote, or even thought any such thing.

                • @aesthelete · -7 points · edited · 2 months ago

                  CSAM is the exception, Socrates. Also as far as definitions go, computer models aren’t artists.

      • @Cryophilia · 4 points · 2 months ago

        Why should this be illegal?

        Because it’s illegal.

        • @aesthelete · -2 points · edited · 2 months ago

          It should be illegal for a number of reasons. One is a simple practical one: as the technology advances towards increasing levels of realism it’ll become impossible for law enforcement to determine what material is “regular” CSAM versus what material is “generated” CSAM.

          So, unless you’re looking to repeal all laws against possession of CSAM, you’ll have a difficult time crafting a cut-out for generated CSAM.

          And honestly, why bother? What’s the upside here? To have pedos get a more fulfilling wank to appease them and hope they won’t come after your kids for real? I really doubt the premise behind that one.

          • @Cryophilia · 6 points · 2 months ago

            And honestly, why bother? What’s the upside here?

            Allowing for victimless crimes simply because a group is undesirable is a terrible precedent. We can’t ban things just because they make us uncomfortable, or because doing so makes law enforcement’s job easier.

            • @aesthelete · -4 points · 2 months ago

              Allowing for victimless crimes simply because a group is undesirable is a terrible precedent.

              I wouldn’t even call it victimless, and we have all kinds of actual victimless crimes that are already illegal, so I don’t care about supposedly setting this “precedent” that has already been set a million times over.

              • @Cryophilia · 5 points · 2 months ago

                and we have all kinds of actual victimless crimes that are already illegal

                And we should be undoing those laws

                • @aesthelete · -3 points · edited · 2 months ago

                  We aren’t though, so it’s frankly pretty odd that you’re fixated on this one.

                  It’s frankly pretty odd that Lemmy in general seems to be fairly pro-generated-CSAM. I’m betting you guys are just afraid of the feds finding your stashes.

                  EDIT: I basically get maybe three replies a week to things I post on here, except when I post something about being okay with generated CSAM or deepfake porn being illegal (in which case I get binders full of creeps in my inbox).

                    • @Cryophilia · 6 points · 2 months ago

                    There it is. “If you disagree with me you’re a pedo”. You people always go back to that. I bet I could convince you to hack off your own arm as long as I said anyone with a left arm is a pedo.