cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

    • @FluorideMind · 6 points · 7 months ago

      It isn’t CSAM if there was no abuse.

      • @Jimmyeatsausage · 10 points · 7 months ago

        It’s not child sexual assault if there was no abuse. However, the legal definition of CSAM is any visual depiction, including computer or computer-generated images, of sexually explicit conduct, where […] (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

        You may not agree with that definition, but even simulated images that look like kids engaging in sexual activity meet the threshold for CSAM.

        • JackGreenEarth · 3 points · 7 months ago

          Do you not know that CSAM is an acronym that stands for child sexual abuse material?

          • Possibly linux · 0 points · 7 months ago

            True, but legally CSAM covers any sexually explicit depiction involving minors, real or simulated. It’s largely up to the courts to decide a lot of it, but in the case above I’d imagine the images were quite disturbing.

        • ASeriesOfPoorChoices · 1 point · 7 months ago

          In this instance, no human children or minors of any kind were involved.

          • Possibly linux · 0 points · 7 months ago

            I think the court looked at the psychological aspects of it. When you view that kind of material, you are training your brain and body to be attracted to it in real life.

    • @Reddfugee42 · 2 points · 7 months ago

      We’re discussing the underpinnings and philosophy of the legality, and your comment is simply “it is illegal.”

      I can only draw from this that your morality is based on laws instead of vice versa.

      • Possibly linux · 0 points · 7 months ago

        I’m in the camp that there is no reason you should have that kind of imagery, especially AI-generated imagery. Think about what people often do with pornography. You do not want them doing that with children, regardless of whether it is AI-generated.

        • @Reddfugee42 · 2 points · 7 months ago

          What does want have to do with it? I’d rather trust science and psychologists to determine whether this, which is objectively harmless, helps them control their feelings and gives them a harmless outlet.

          • Possibly linux · 1 point · 7 months ago

            They aren’t banning porn in general; they just don’t want to create any more sexual desire toward children. The CSAM laws came from child protection experts. Admittedly, some of those people also want to “ban” encryption, but that’s irrelevant in this case.