• @[email protected]
    16 months ago

    I don’t think people who uploaded pictures to Facebook consider that to be making them available for personal use.

    Then they shouldn’t have uploaded it to Facebook and made it publicly accessible.

    Just because something is made illegal doesn’t mean it will be actively pursued; it just means that if someone gets caught or reported doing it, they can be stopped.

    It’s the next logical step for the pearl clutchers and amounts to “thought crime.”

    These people aren’t doing anything to my children; they’re making their own images from images they have a right to use. It’s super creepy and I’d probably pick a fight with them if I found out, but I don’t think it should be illegal if there’s no victim.

    The Geek Squad worker could still report these people, and it would be the prosecution’s job to prove that the images were acquired or created in an illegal way.

    Do you think it’s okay for someone to have real CSAM?

    No, because that increases demand for child abuse. Those pictures are created by abusing children, and getting access to them encourages more abuse to produce more content.

    Possession itself isn’t the problem, the problem is how they’re produced.

    I feel similarly about recreational drugs. Buying from dealers is bad because it encourages smuggling and everything related to it. I have no problem with weed or whatever; I have problems with the cartels. At least with drugs there’s a simple solution: legalize it. I likewise want a legal avenue for these people who would otherwise participate in child abuse to not abuse children. Them looking at creepy AI content generated from pictures of my child doesn’t hurt my child; just don’t share those images or otherwise let me know about it.

    • @PotatoKat
      16 months ago

      It’s the next logical step for the pearl clutchers and amounts to “thought crime.”

      I seriously doubt they would create any more surveillance for that than there already is for real CSAM.

      The Geek Squad worker could still report these people, and it would be the prosecution’s job to prove that the images were acquired or created in an illegal way.

      That would just make it harder to prosecute people for CSAM, since they would all claim their material was just AI. It would end up helping child abusers get away with it.

      Possession itself isn’t the problem, the problem is how they’re produced.

      I think the production of generated CSAM is unethical because it still involves photos of children without their consent.

      No, because that increases demand for child abuse. Those pictures are created by abusing children, and getting access to them encourages more abuse to produce more content.

      There is evidence to suggest that viewing CSAM increases child-seeking behavior, so viewing generated CSAM would most likely have the same or a similar result. That would mean that even just having access to the materials would increase the likelihood of child abuse.

      https://www.theguardian.com/global-development/2022/mar/01/online-sexual-abuse-viewers-contacting-children-directly-study

      The survey was self-reported, so the reality is probably higher than the 42% cited in the study.

      I likewise want a legal avenue for these people who would otherwise participate in child abuse to not abuse children.

      The best legal avenue for non-offending pedophiles to take is to find a psychologist who can help them work through their desires, not to be given something that will make them want to offend even more.

      • @[email protected]
        16 months ago

        That would just make it harder to prosecute people for CSAM

        That’s true, and an unfortunate part of preserving freedoms. That said, if someone is actually abusing children on the regular, police have a way of tracking that individual to catch them: investigations.

        I wish police had to do them more often instead of leaving that job to the prosecution. If that means we need to pull officers away from other important duties like arresting black men for possessing a joint or pulling people over for speeding on an empty highway, I guess that’s what we have to do.

        it still involves photos of children without their consent

        It involves legally acquired images and is protected under “fair use” laws. You don’t need my permission to exercise your fair use rights, even if I think your use is disgusting. It’s not my business. But if you make it my business (i.e., you tell me), I may choose to assault you and hope the courts will side with me that your words constitute “fighting words.”

        Just because something is disgusting doesn’t make it illegal.

        As for that article:

        “This is really significant. We now have a peer-reviewed study to prove that watching [CSAM] can increase the risk of contact.”

        It doesn’t prove anything; what it does is draw a correlation between people who search for CSAM on the dark web and are willing to answer a survey (a pretty niche group) and a self-reported inclination to contact children. Correlation isn’t proof; it’s correlation.

        That said, I don’t know if a better study could or should be conducted. Maybe survey people caught contacting children (sting operations) and those caught just distributing CSAM without child contact. We need to know the difference between those who progress to contact and those who don’t, and I don’t think this survey provides that.

        find a psychologist who can help them work through their desires

        I agree, and I think that should be widely accessible.

        That said, I don’t think giving people a criminal record helps. If they need to be locked up to protect the public (i.e. there are actual victims), then let’s lock them up. But otherwise, we absolutely shouldn’t. Let’s make help available and push people toward getting that help.