• @RGB3x3
    -2
    1 month ago

    Are there any guarantees that harmful images weren’t used in these AI models? Based on how image generation works now, it’s very likely that harmful images were included in the training data.

    And if a person is using a model based on harmful training data, they should be held responsible.

    However, the AI owner/trainer has even more responsibility in perpetuating harm to children and should be prosecuted appropriately.

    • @[email protected]
      12
      1 month ago

      And if a person is using a model based on harmful training data, they should be held responsible.

      I will have to disagree with you for several reasons.

      • You are still making assumptions about a system you know absolutely nothing about.
      • By your logic, if a product is born from something that caused others to suffer (in this example, AI trained on CSAM), then the users of that product should be held responsible for the crime committed to create it.
        • Does that apply to every product/result created from human suffering or just the things you don’t like?
        • Will you apply that logic to the prosperity of Western nations built on the suffering of indigenous and enslaved people? Should everyone who benefits from Western prosperity be held responsible for the crimes committed against those people?
        • What about medicine? Two examples are the Tuskegee Syphilis Study and the cancer cells of Henrietta Lacks. Medicine benefited greatly from both, but crimes were committed against the people involved. Should every patient from a cancer program that benefited from Ms. Lacks’ cancer cells also be required to pay compensation to her family? The doctors who used her cells without permission didn’t.
        • Should we also talk about the advances in medicine made by Nazis who experimented on Jews and others during WW2? We used that data in our manned space program, paving the way to all the benefits we get from space technology.
      • @PotatoKat
        -1
        1 month ago

        The difference between the things you’re listing and CSAM is that those other things have actual utility outside of getting off. Were our phones made with human suffering? Probably, but phones have many more uses than making someone cum. Are all those things wrong? Yeah, but at least good came out of it outside of just giving people sexual gratification directly from the harm of others.

      • @aceshigh
        -13
        1 month ago

        The topic that you’re choosing to focus on is really interesting. What are your values?

        • @[email protected]
          2
          1 month ago

          My values are none of your business. Try attacking my arguments instead of looking for something about me to attack.

          • @aceshigh
            -1
            1 month ago

            At the root of it, beliefs aren’t based on logic; they’re based on your value system. So why dance around the actual topic?

      • @gardylou
        -14
        1 month ago

        LOL, that’s a lot of bullshit misdirection to defend AI child porn. Christ, can there be one social-media-like platform that just has normal fucking people?

        • @Cryophilia
          0
          1 month ago

          If everywhere you go, everyone is abnormal, I have news for you

          • @gardylou
            1
            1 month ago

            If everywhere you go, everyone you know thinks AI-generated child sex stuff is normal, well buddy, I think I’ve got some news for you.

    • @aesthelete
      1
      1 month ago

      Are there any guarantees that harmful images weren’t used in these AI models?

      Lol, highly doubt it. These AI assholes pretend that all the training data randomly fell into the model (off the back of a truck) and that they cannot possibly be held responsible for that or know anything about it because they were too busy innovating.

      There’s no guarantee that most regular porn sites don’t contain CSAM or other exploitative imagery and video (of sex trafficking victims). There’s absolutely zero chance that there’s any kind of guarantee.