There have been a ton of CSAM and CP arrests in the US lately, especially involving cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being’s innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone’s face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws needs to be put into effect ASAP.

How difficult would it be for AI photo apps to filter out certain words in prompts, so that someone cannot make anyone appear naked?
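For what it’s worth, the word-filtering part by itself is not hard. Below is a minimal sketch, assuming a hypothetical app that checks the text prompt against a blocklist before it ever reaches the image model; the function name and the word list are purely illustrative, not any real service’s API:

```python
import re

# Illustrative blocklist only -- a real app would use a much larger,
# curated list maintained by a trust-and-safety team.
BLOCKED_TERMS = {"nude", "naked", "undress", "nsfw"}

# One compiled pattern with word boundaries, so "naked" matches
# but a word like "snake" does not.
_BLOCKED_RE = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in sorted(BLOCKED_TERMS)) + r")\b",
    re.IGNORECASE,
)

def is_prompt_allowed(prompt: str) -> bool:
    """Return False if the prompt contains any blocked term."""
    return _BLOCKED_RE.search(prompt) is None

if __name__ == "__main__":
    print(is_prompt_allowed("make this photo naked"))   # False -> rejected
    print(is_prompt_allowed("a portrait in a garden"))  # True  -> allowed
```

The hard part is not the filter itself but evasion: misspellings, slang, other languages, and innocuous-sounding phrasings slip past simple word lists, which is why hosted services generally layer prompt filters with classifier models that also inspect the generated images.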

  • Wugmeister

    There are two parts to this problem.

    For kids who haven’t hit puberty, there is a diagnosable pedophilic disorder. This is mostly genetic. (I’m pretty sure I’ve met an alpaca that was a pedophile once.) The molester’s brain is wired wrong, and there is nothing to be done about that. IMHO, they deserve pity as long as they keep their hands off children.

    For teenagers, the attraction is the power dynamic. Teens have a rather distorted view of what is attractive, and they tend to be naive and easily manipulated. On top of this, almost all teenagers have next to no impulse control, and many will make very, very bad decisions (even knowing the decision is bad) if doing so might result in some form of dopamine hit via sex/adrenaline rush/video games/peer approval/etc. Adults who seek out teenagers for sexual relationships are bad people who choose to be groomers. There is no genetic component to being a groomer, and they don’t deserve pity.

    Btw, I can flesh out my claim about the alpaca if you want, but it will have to come with a TW for adorable fluffy animals suffering a horrifically slow and painful death.

    • @Adalast

      Info link: https://pubmed.ncbi.nlm.nih.gov/18686026/

      The DSM-5 specifies two types: pedophilic (victim age <11) and hebephilic (victim ages 11–14). What you are describing with the grooming is generally not pedophilia, because “children” older than 15 are generally considered post-pubescent and thus anatomically adults. Their frontal lobes still have a LOT of time needed to cook to completion, but they have those impulse-control issues for a reason, from an evolutionary standpoint. Yes, in modern society, “adults” who take advantage of the still-developing prefrontal cortex of a post-pubescent adolescent are shit human beings who don’t deserve to be members of society, but they are technically not pedophiles, at least not clinically. Legally is a different story, but that is not a pertinent area of discussion right now.

      Pedophilic and hebephilic individuals generally do not ever take their impulses into the realm of reality. Many of them actually end up feeling so much shame and remorse over even having the thoughts that they commit suicide. They definitely deserve pity and treatment, not stigmatization and ostracism.

      As to the OP asking about AI art that depicts underage individuals in states of undress or sexual situations: ALL depictions of underage individuals in those contexts are illegal. By the letter of the law, if you draw stick figures on a piece of paper having sex and then label them as children, you have created child pornography. No depiction is legal, no matter the medium. AI-generated, hand-drawn, sculpted, watercolor, photographic: under the law in (I believe) every state, they are all treated identically.

      Personally, I believe this is asinine and 100% indicates that the purpose of these laws is to adjudicate morality, not to “protect the children” as all of the people who push them claim, but that is just my opinion. Hand-drawn artwork that has no photographic source material and does not depict real people has virtually zero chance of having caused harm to any children, and the AI just knows what the keywords mean in the context of reversing the noising (“vaporization”) of an image. These models weren’t trained on kiddy porn; they were trained on pictures of children and on pictures of adults doing their porny thing, so they are able to synthesize the two concepts together.