• @[email protected]
    link
    fedilink
    English
    334 months ago

    It would not need to be trained on CP. It would just need to know what human bodies can look like and what sex is.

    AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

    • @[email protected]
      link
      fedilink
      English
      34 months ago

      > AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

      Local model go brrrrrr

      • @fidodo · 43 points · 4 months ago

        You can ask it to make an image of a man made of pizza. That doesn’t mean it was trained on images of that.

        • @dustyData · -10 points · 4 months ago

          But it means that it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors. Even if it wasn’t in a sexual context.

          • @bitwaba · 7 points · 4 months ago

            Minors are people. It knows what clothed people of all ages look like. It also knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn’t trained on. Naked + child is just a simple equation for it to solve.

      • @MeanEYE · 32 points · 4 months ago

        You can always tell when someone has no clue about AI but has read about it online.

      • @herrvogel · 9 points · 4 months ago

        The whole point of those generative models is that they are very good at blending different styles and concepts together to create coherent images. They’re also really good at editing images to add or remove entire objects.

      • @mightyfoolish · 6 points · 4 months ago (edited)

        I think what @[email protected] meant was that the AI could be trained separately on what sex is and what children are. Then a user request could put those two concepts together.

        But as the replies I got show, there are multiple ways this could have been accomplished. All I know is AI needs to go to jail.