A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • @9bananas
    116 days ago

    generally a very good point, however i feel it’s important to point out some context here:

    the pedophiles you’re talking about in your comment are almost always members of tight-knit communities that share CSAM, organize distribution, share sources, and, most importantly, indulge their fantasies/desires together.

    i would think that the correlation with molestation is not primarily driven by the CSAM itself, but rather by the community around it.

    we clearly see this happening in other similarly structured and similarly isolated communities: nazis, incels, mass shooters, religious fanatics, etc.

    the common factor in radicalization and the development of extreme views in all these groups is always isolation and the community they end up joining as a result, forming a sort of parallel society with its own rules and ideals, separate from general society. over time, people in these parallel societies get used to seeing the world in a way that aligns with the ideals of the group.

    nazis start to see anyone not part of their group as enemies, incels start to see “females” instead of women, religious fanatics see sinners…and pedophiles see objects that exist solely for their gratification instead of kids…

    I don’t see why molesters should be any different in this respect, and would therefore argue that it’s the communal aspect that should probably be the target of the law, i.e., distribution and organization (forums, chatrooms, etc.)

    the harder it is for them to organize, the less likely these groups are to produce predators that cause real harm!

    if on top of that there is a legally available outlet where they can indulge themselves in a safe manner without harming anyone, I’d expect rates of child molestation to drop significantly, because, again, there’s precedent from similar situations (overdoses in drug addicts, for example).

    i think it is a potentially fatal mistake to treat pedophiles as “special” cases rather than just another group of outcasts, because in nearly all cases of such pariahs, the solutions that prove to work best in the real world are the ones that make these groups feel less like outcasts, which limits avenues of radicalization.

    i thought these parallels were worth pointing out.