• @flossdaily
    32
    1 year ago

    Seems like this could actually be a good thing, though, since it could destroy the market for the genuine article?

    Couldn’t this lead to a REDUCTION in the abuse of children?

    • @OscarRobin
      6
      1 year ago

      One of the reasons that hentai etc of children is illegal in many regions is because abusers apparently use it to coerce children into performing the acts themselves, so it may not be created by abuse but may aid abusers. The same argument could be made against ‘AI’ images.

      • @BradleyUffner
        7
        1 year ago

        That seems a little broad though…I mean, a baseball bat can also be used to coerce someone into things.

        • @OscarRobin
          3
          1 year ago

          I agree it seems a bit of odd reasoning - especially when it’s not hard to find legal depictions that could presumably be used for coercion just as easily as the illegal cartoons etc.

    • @[email protected]
      link
      fedilink
      English
      5
      edit-2
      1 year ago

      Sure, but it could also normalise images of children being raped and dramatically INCREASE the abuse of children. After all, widespread access to pornography didn’t cure the world of rapists.

      Why would it be any different if it’s AI generated images of children and why should we risk our kids so that some internet pedo can jerk off?

    • @jpeps
      -1
      edit-2
      1 year ago

      I’m no expert, but I think this is the wrong take. I see two potential issues here:

      • It can be difficult for people not to escalate behaviour. Say sexual harassment somehow became legal or permissible — how much new sexual assault might follow? CSAM can be just the beginning for some people.
      • This makes CSAM more easily available, and therefore likely means more people are accessing/generating it. See point one.
      • @FMT99
        5
        1 year ago

        I think you’re edging towards thought crime territory here. No one is saying the real thing should be legal or the attraction should be socially acceptable.

        But if this reduces the market for the real thing I see that as a win.

        • @jpeps
          1
          edit-2
          1 year ago

          In my comment I’m stating that I think (again, not an expert) that it would grow the market and ultimately increase child abuse.

          Secondly on the subject of legality, of course it will depend on your country, but I believe in most places there is no distinction, at least currently, between ‘simulated’ CSAM and CSAM that directly involves the exploitation of minors. If you put a thought to paper, it’s not a thought anymore.

          I think the attraction itself should be more or less socially acceptable. No one chooses those feelings, and they can't be blamed for having them. People need to be able to ask for help and receive support for what I imagine is an extremely tough thing to deal with. But I do not think in the slightest that CSAM in any form, including simulated forms (which would include hand-drawn material), should be remotely socially acceptable.

    • @zepheriths
      -1
      1 year ago

      I mean, the issue is whether they'll remain a well-adjusted individual if they have access to CP. It's well known that it's a slippery slope when it comes to CP. Frankly, I think this just makes it easier to start down that path, which is not what we want.

      • @FMT99
        4
        1 year ago

        I don’t think people who are well adjusted are likely to go down this slope. Could you be convinced to download that stuff? If someone has that interest, it was there to begin with, not caused by looking at a picture.

    • FarraigePlaisteach
      -2
      1 year ago

      It could make it harder to find trafficked and abused children. Investigators would first have to separate the fake images from the real ones. And what a sickening job for the poor person taking on that responsibility.