• @mightyfoolish
    15
    7 months ago

    Does this mean the AI was trained on CP material? How else would it know how to do this?

    • @[email protected]
      33
      7 months ago

      It would not need to be trained on CP. It would just need to know what human bodies can look like and what sex is.

      AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

      • @[email protected]
        3
        7 months ago

        AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

        Local model go brrrrrr

        • @fidodo
          43
          7 months ago

          You can ask it to make an image of a man made of pizza. That doesn’t mean it was trained on images of that.

          • @dustyData
            -10
            7 months ago

            But it means that it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors. Even if it wasn’t in a sexual context.

            • @bitwaba
              7
              7 months ago

              Minors are people. It knows what clothed people of all ages look like. It also knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn’t trained on. Naked + child is just a simple equation for it to solve

        • @MeanEYE
          32
          7 months ago

          You can always tell when someone has no clue about AI but has read online about it.

        • @herrvogel
          9
          7 months ago

          The whole point of those generative models is that they are very good at blending different styles and concepts together to create coherent images. They’re also really good at editing images to add or remove entire objects.

        • @mightyfoolish
          6
          edit-2
          7 months ago

          I think what @[email protected] meant was that the AI could be trained on what sex is and on what children are as separate concepts. Then a user request could put those two concepts together.

          But as the replies I got show, there were multiple ways this could have been accomplished. All I know is AI needs to go to jail.

    • @joel_feila
      1
      7 months ago

      Well, some LLMs have been caught with CP in their training data.

    • @ZILtoid1991
      1
      7 months ago

      Likely yes, and even commercial models have an issue with CSAM leaking into their datasets. The scummiest of them likely get an offline model, then add their collection of CSAM to it.