• @CitizenKong
    7 points, 1 year ago

    The really sick aspect about this is that someone probably fed the AI thousands of real child porn images to generate the fake ones.

    • @Protegee9850
      -1 point, 1 year ago

      That’s a hell of a leap and seems to be based in ignorance of the technology.

      • @LufyCZ
        2 points, 1 year ago

        Your comment is based in ignorance of the technology. To have AI spit out images of a specific type, you also have to first feed it images of said type.

        • @Protegee9850
          7 points, edited, 1 year ago

          Again, you’re obviously ignorant of how this stuff actually works. That is simply not the case. Otherwise the training set would need to contain images of every type you hope to generate, which is impossible and obviously isn’t the case; a quick look at some of the crazier things people have generated disproves it. Training the model on nude and clothed images of adults and clothed images of children - as others have pointed out - would allow you to generate nude images of children. Could a model have been fine-tuned with CSAM? Yes, but it’s certainly not a given, and probably not necessary.

          The stable diffusion sub has somewhat migrated over to the fediverse. You can find more information about how this stuff actually works beyond your introductory understanding of the concept there.
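
          The compositionality argument above can be illustrated with a deliberately tiny toy, not a real diffusion model: if a model learns each attribute's effect separately, it can produce attribute combinations it never saw together in training. All names here (colors, shapes, the linear "generator") are made-up stand-ins for illustration only.

          ```python
          import numpy as np

          # Toy sketch of compositional generalization. Attributes are
          # one-hot vectors; a "prompt" concatenates one color and one shape.
          colors = {"red": [1, 0], "blue": [0, 1]}
          shapes = {"circle": [1, 0], "square": [0, 1]}

          def encode(color, shape):
              # Stand-in for a text encoder: prompt -> embedding.
              return np.array(colors[color] + shapes[shape], dtype=float)

          def target(color, shape):
              # Stand-in for "image features" the generator should produce.
              return encode(color, shape) * 2.0

          # Training set deliberately omits the ("red", "square") combination.
          train = [("red", "circle"), ("blue", "square"), ("blue", "circle")]
          X = np.stack([encode(c, s) for c, s in train])
          Y = np.stack([target(c, s) for c, s in train])

          # Fit a linear "generator" W mapping embeddings to image features.
          W, *_ = np.linalg.lstsq(X, Y, rcond=None)

          # Query the combination never seen during training.
          unseen = encode("red", "square") @ W
          print(np.round(unseen, 2))  # ≈ target("red", "square") = [2, 0, 0, 2]
          ```

          Because the model factors the two attributes apart, the unseen combination falls in the span of what it learned; real diffusion models are vastly more complex, but the same compositional principle is why they can render combinations absent from the training data.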

      • @ToastyWaffle
        1 point, 1 year ago

        How do you think AI machine learning works? It’s all based on Large Language Models aka a shit ton of real data.