• @fraval
    18
    2 years ago

    I am not saying that this is a good thing, but I'd rather it be generated by AI than the real thing… Still fucked up, though.

    • @CitizenKong
      7
      2 years ago

      The really sick aspect of this is that someone probably fed the AI thousands of real child porn images to generate the fake ones.

      • @Protegee9850
        -1
        2 years ago

        That’s a hell of a leap and seems to be based in ignorance of the technology.

        • @LufyCZ
          2
          2 years ago

          Your comment is based in ignorance of the technology. To have AI spit out images of a specific type, you also have to first feed it images of said type.

          • @Protegee9850
            7
            edit-2
            2 years ago

            Again, you’re obviously ignorant of how this stuff actually works. That is simply not the case. Otherwise the training set would need to include images of every type you hope to generate - an impossibility, and obviously not the case; a quick look at some of the crazier things people have generated disproves it. Training the model on nude and clothed images of adults and clothed images of children - as others have pointed out - would allow you to generate nude images of children. Could a model have been fine-tuned with CSAM? Yes; but it’s certainly not a given, and probably not necessary.

            The stable diffusion sub has somewhat migrated over to the fediverse. You can find more information about how this stuff actually works beyond your introductory understanding of the concept there.

        • @ToastyWaffle
          1
          2 years ago

          How do you think AI machine learning works? It’s all based on Large Language Models aka a shit ton of real data.

      • @mido
        10
        edit-2
        2 years ago

        Not (necessarily) with real naked children. With kids with clothes, adults with clothes, and naked adults.

        It’s not hard for the AI to transpose from a clothed child to a naked one; it’s basically the same thing as switching your gender or making you look old.

    • Haakon
      1
      1 year ago

      It’s equally illegal, at least in my country.

    • @MercuryUprising
      -1
      2 years ago

      It’s still fucked up, because it starts with shit like this before moving into real-world encounters. Over time the predator’s brain will see it as a new normal and will want to escalate.

      • @Noedel
        6
        2 years ago

        Do you know this or do you think this? I’m relieved I don’t have much insight into the mind of pedos… But couldn’t it be the other way around too?

    • @trachemys
      -2
      2 years ago

      You’ll still go to jail, even if it is fake, even if it is a cartoon character.

      • @phx
        6
        2 years ago

        Depends on the country. Some only criminalize depictions where children were exploited or harmed, so the cartoon stuff might get a pass despite being nasty. With AI images, I’d imagine it might be hard to prove they aren’t real children, and at that point they might be treated like robbery with a fake weapon or selling fake drugs (still chargeable as the real thing in most places).

        • @trachemys
          3
          2 years ago

          The Bart Simpson porn case was in the UK, I think.