• @[email protected]
    12 points · 2 months ago

    Nobody except pedos will argue that child abuse is acceptable. AI cartoon porn, in my opinion, is fine. It’s a victimless crime; literally nobody gets hurt. There are no studies confirming that someone who watches pedo cartoons will end up committing real-life child abuse; in fact, some studies show the opposite effect.

    I welcome AI porn, cartoon or realistic-looking. Zero real women get taken advantage of, and we get to pick whatever kink we want, knowing that nobody was hurt in the making of the AI porn.

    • HubertManne
      16 points · 2 months ago

      I agree. It’s always a tough stance. Ultimately, I want Nazis to be able to speak freely as long as they don’t actually do the stuff they spout. As far as I’m concerned, when you try to ban things that don’t exist in reality, you are in the realm of trying to ban thought.

    • @bluespin
      -8 points · 2 months ago

      It’s not victimless. It normalizes the sexualization of children.

      • @[email protected]
        27 points · 2 months ago

        I mean, Japan has had CSAM cartoons for decades, and it has a lower CSA rate than the USA. Not saying it’s totally related, but it doesn’t seem like people who have access to cartoon CSAM normalize it and act it out in real life.

      • Lightor
        4 points · 2 months ago

        Sure, the same way video games normalize stealing cars. Or the same way movies normalize killing people. I mean at some point you gotta stop blaming media.

      • @Matriks404
        1 point · 2 months ago

        It is already normalized.

      • @[email protected]
        -8 points · 2 months ago

        So if legalized porn reduces rapes, as studies show, how do we figure out whether the existence of this allows for less abuse of kids, or whether it spawns long-term interest?

        • @[email protected]
          18 points · 2 months ago

          Cartoon CSAM has been legal in Japan for decades, and Japan has a lower CSA rate per capita than the USA.

          There are some brain studies showing that the area of the brain responsible for caring for children sits right next to the part responsible for sexual pleasure. The studies suggest there might be a misfiring of the synapses between these regions that causes someone to be a pedo. These people don’t experience sexual pleasure without thinking about kids. It’s literally a disability.

          My opinion is that we don’t know whether removing AI-generated CSAM would make things worse for real-life kids or not. But flat out banning it without proper research would be irresponsible.

          • BlackLaZoR
            10 points · 2 months ago

            But flat out banning it without proper research would be irresponsible.

            I think the whole argument is moot. AI image generation is available to pretty much everyone. It’s impossible to control what people are doing with it.

            • @[email protected]
              1 point · 2 months ago (edited)

              Maybe if it’s self-hosted, but if the AI is hosted by someone else, I imagine it would be as easy as keywords being reported/flagged.

              • BlackLaZoR
                10 points · 2 months ago

                Self-hosting is trivial these days: any modern NVIDIA card works, and there are hundreds of models available online.

          • @[email protected]
            7 points · 2 months ago (edited)

            Thanks for the thoughts. The way people were only downvoting originally, without providing any actual explanation why, had me thinking it had just been dumb to ask.

    • @Wogi
      -9 points · 2 months ago

      The only people that sexualize children are pedophiles.

      • @[email protected]
        15 points · 2 months ago

        If a pedophile sexualizes fake AI children in his basement but is a productive member of society and never acts on it in real life, do you think this person deserves to be in jail?

        • @Wogi
          -15 points · 2 months ago

          1. All models are trained on something.

          2. You’re building your own straw man here. You’ve set up an extremely narrow condition under which this particular type of pedophilia is acceptable. Prove to me that that’s the norm, that it’s a typical use scenario, and that people looking at that crap are exclusively looking at loli and not images meant to look like real people, and there’s a debate to be had there. But if you think any of that is true, you’re lying to yourself. Sexualization of others is not going to happen in a vacuum under sterile conditions; it’s going to bleed into real life.

          • @[email protected]
            14 points · 2 months ago

            Prove to me that removing this will not make things bleed into real life even more. You can’t either.

            What I can prove is that Japan has had CSAM cartoons for decades and has less CSA per capita than the USA. Is it possible the Japanese know something we don’t? Who knows.

            Can you prove to me that the AI models were trained on real CSAM material? If so, not reporting this to the FBI seems irresponsible.

          • @ILikeTraaaains
            2 points · 2 months ago

            Generative models do not work like that. If they did, how do you explain that I can generate a picture of a purple six-legged cat shooting lasers from its eyes in space?

            In a very, very simplified way: the models are trained to take noise and denoise it until an image is “restored.” One part of the model learns to remove noise until a drawing of a child is restored; another part learns to restore the drawing of a nude woman. When you tell the model to restore such a drawing from noise, it combines the two processes (it is also trained to combine things in a way that makes sense).
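            For what it’s worth, the denoising loop described above can be sketched numerically in a few lines. This is only a toy, with a made-up `predict_noise` function standing in for the trained network (a real diffusion model is a neural net that estimates the noise at each step, usually conditioned on a text prompt like the purple cat one):

```python
import numpy as np

# Hypothetical stand-in for the trained network: a real model
# estimates the noise component of x at step t. Here we pretend the
# "clean" image is all zeros, so the entire sample counts as noise.
def predict_noise(x, t):
    return x

def denoise(x, steps=50, step_size=0.1):
    # Repeatedly remove a fraction of the estimated noise, so the
    # sample drifts from pure noise toward a "restored" image.
    for t in range(steps, 0, -1):
        x = x - step_size * predict_noise(x, t)
    return x

rng = np.random.default_rng(0)
noisy = rng.normal(size=(8, 8))  # start from pure noise
restored = denoise(noisy)        # far closer to the (toy) clean image
```

            Each pass removes part of the estimated noise, which is why the same machinery can compose concepts it never saw together in training.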

      • @[email protected]
        7 points · 2 months ago

        That’s like saying the only people who bake wedding cakes are bakers…

        I mean, yeah. But what of it? Or are you already attaching a connotation of abuse to the mental disorder?

        • @Klear
          0 points · 2 months ago

          Bakers are people who bake for people who don’t bake for themselves…

        • @Wogi
          -7 points · 2 months ago

          The act of baking does indeed make you a baker. Definitionally.

          Just because you aren’t going pro doesn’t mean you aren’t making a cake.

          • Lightor
            1 point · 2 months ago (edited)

            Me making a box-mix cake that tastes like ass doesn’t make me a baker. That’s silly.

      • @[email protected]
        2 points · 2 months ago

        Sure, but like I asked above: if porn reduces rapes, how do we know this (gross as it is) doesn’t reduce children being sexually assaulted? I can’t think of a single safe way it could be tested or monitored to find the lesser long-term evil.

    • @[email protected]
      -23 points · 2 months ago (edited)

      The “kink” you are picking is drawn child porn. I don’t care if nobody was directly hurt by your consumption of drawn child porn; you are consuming child porn. You are a pedophile: somebody attracted to children sexually.

      I don’t care if studies show that pedophiles who watch drawn child porn aren’t likely to offend. They are pedophiles. I know it’s a wild thing to state, but I don’t like pedophiles. The debate on legality due to harm reduction is another thing altogether, but at no point did I bring that up. I only asked that we not support or make AI porn of fictional children.

      Your support of a subset of child porn, particularly AI-generated and drawn, is noted though. Thank you for stating as much.

      • They are pedophiles. I know it’s a wild thing to state but I don’t like pedophiles.

        This makes sense and all, but a pedophile who hasn’t harmed a child hasn’t caused any harm. These people have a disorder that should be treated, but this isn’t always easy. If this can give them some outlet that prevents any actual harm being done to children, then that can easily be argued to be a net positive.

        I prefer these people jack off to AI porn over real child porn or worse, them turning to actual sexual abuse of children. What’s wrong with preventing child abuse?

        • @[email protected]
          -7 points · 2 months ago (edited)

          I would agree if we saw some meta-analysis suggesting this, but the evidence for the effect is thin. The studies you cite in other comments are inconclusive, are not the majority, and only show mild effects. This is not scientific fact yet; all the evidence shows a mild effect at best.

          Even if it did, though, they are still a pedophile. They are masturbating to child porn. We should not accept that as a positive thing, and we should not support people who make child porn. These are the people who need to seek help most. If part of that help is jacking it to drawn child porn, so be it, but let it be under the care of a professional.

          The fact that one doesn’t offend only stops one from being a monster, a child molester, or a child rapist. A pedophile is still immoral.

          My issue is that child porn is inherently wrong. It is a fundamental negative whether drawn or generated. Some things are not about material harm they are about base morality. Sexualizing children is a fundamental wrong.

          If the only thing stopping you from raping, molesting, or otherwise harming a child is drawn child porn, you are not a good person. That is terrifying and disgusting.

          Lastly, our brains are neuroplastic. Anyone can develop a fetish through constant exposure to something in a positive sexual setting. Something may disgust you, say poop, but if you jack off to the thought long enough, you will develop a fetish. This, unlike the claim that drawn child porn is helpful, is well established. Harm to children or not, this creates more pedophiles: people who think of children in a sexual manner.

          • @[email protected]
            13 points · 2 months ago

            No sane person is denying what you’re saying. With children of my own, I want to do anything and everything possible to protect them.

            That said, there is research on people who consume cartoon CSAM and haven’t committed real-life abuse. They have a problem. Taking away something that doesn’t hurt anyone might not improve the protection of our children; it might make things worse.

      • @[email protected]
        12 points · 2 months ago

        If a pedophile sexualizes fake AI children in his basement but is a productive member of society and never acts on it in real life, do you think this person deserves to be in jail?