There have been a ton of CSAM and CP arrests in the US lately, especially involving cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids, or even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being's innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone's face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws needs to be put into effect asap.

How difficult would it be for AI photo apps to filter out certain words, so that someone cannot use them to make anyone appear naked?
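
On the filtering question: the keyword check itself is technically trivial, and most image apps already run something like it before a prompt ever reaches the model. Here is a minimal sketch in Python; the blocklist contents and function name are purely illustrative, not any particular app's actual filter.

```python
import re

# Illustrative blocklist only; a real service would maintain a much larger,
# curated list and pair it with trained text/image classifiers.
BLOCKED_TERMS = {"nude", "naked", "undressed", "nsfw"}

def is_prompt_allowed(prompt: str) -> bool:
    """Reject the prompt if it contains any blocked term as a whole word."""
    words = re.findall(r"[a-z]+", prompt.lower())
    return not any(word in BLOCKED_TERMS for word in words)

# Example usage
print(is_prompt_allowed("a portrait of my dog wearing a party hat"))  # True
print(is_prompt_allowed("make this photo show the person naked"))     # False
```

The hard part is not the check itself but that word lists are easy to evade with misspellings and synonyms, which is why real services layer trained classifiers on top of simple filters like this.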

  • @Adalast · 1 year ago

    Info link: https://pubmed.ncbi.nlm.nih.gov/18686026/

    The DSM-5 specifies two types: pedophilic (victim age <11) and hebephilic (victim ages 11-14). What you are describing with the grooming is generally not pedophilia, because "children" older than 15 are generally considered post-pubescent and thus anatomically adults. Their frontal lobes still have a LOT of time left to cook to completion, but that lack of impulse control exists for a reason from an evolutionary standpoint. Yes, in modern society, an "adult" who takes advantage of the still-developing prefrontal cortex of a post-pubescent adolescent is a shit human being who doesn't deserve to be a member of society, but they are technically not pedophiles, at least not clinically. Legally is a different story, but that is not the pertinent area of discussion right now.

    Pedophilic and hebephilic individuals generally never take their impulses into the realm of reality. Many of them end up feeling so much shame and remorse over merely having the thoughts that they commit suicide. They deserve pity and treatment, not stigmatization and ostracism.

    As to the OP's question about AI art depicting underage individuals in states of undress or in sexual situations: ALL depictions of underage individuals in those contexts are illegal. By the letter of the law, if you draw stick figures on a piece of paper having sex and then label them as children, you have created child pornography. No depiction is legal, no matter the medium: AI-generated, hand-drawn, sculpted, watercolor, or photographic, under the law in (I believe) every state, they are all treated identically.

    Personally, I believe this is asinine and 100% indicates that the purpose of these laws is to adjudicate morality, not to "protect the children" as the people who push for them claim, but that is just my opinion. Hand-drawn artwork that has no photographic source material and does not depict real people has virtually zero chance of having caused harm to any child, and the AI just knows what the keywords mean in the context of reversing the "vaporization" of an image. These models weren't trained on kiddy porn; they were trained on pictures of children and on pictures of adults doing their porny thing, so they are able to synthesize the two concepts together.