‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @cosmicrookie · 10 points · 6 months ago (edited)

    Hoe How can this be legal though?

    • @DoYouNot · 32 points · 6 months ago

      Missing a ‘w’ or a comma.

    • @PopOfAfrica · 17 points · 6 months ago

      Obviously not defending this, I’m just not sure how it wouldn’t be legal. Unless you use it to make spurious legal claims.

      • @cosmicrookie · 7 points · 6 months ago

        I live in a Scandinavian country, and it is illegal to make and distribute fake (and real) nudes of people without their permission. I expect this is the same in many other developed countries too.

        • @hansl · 4 points · 6 months ago

          I’m curious. If I were to paint you from memory, but naked, would that still be illegal? How realistic can the painting get before I’m breaking the law? I’m fairly sure stick figures are okay.

          And do you mean that even just possessing a photo without consent is illegal? What if it was sent by someone who had consent, but not consent to share it? Is consent transitive according to the law?

          AI pushes the limit of ethics and morality in ways we might not be ready to handle.

          • @cosmicrookie · 1 point · 6 months ago

            I am pretty sure that possession is not illegal, but that distribution without consent is. The idea is that someone can have sent you their nude, but you’d get charged if you share it with others.

            There was a huge case here where over 1,000 teens were charged with distributing child porn because of a video that circulated among them of some other teens having sex. Someone filmed a young couple having sex at a party, I believe, and the video got shared on Facebook Messenger. I believe around 800 of them were either fined or jailed.

            Here’s an article you may be able to run through Google Translate:

            https://jyllands-posten.dk/indland/ECE13439654/naesten-500-doemt-for-boerneporno-i-kaempe-sag-om-unges-deling/

            • Encrypt-Keeper · 3 points · 6 months ago

              In some states, electronically distributing nude content of anyone, yourself included, even with consent, is illegal. Which sounds insane, because it is. It’s one of those weird legacy laws that never, ever gets enforced, for obvious reasons, but I actually know a guy who was arrested for it, because he got on the wrong side of some police and it was the only law they could find that he “broke”.

    • @phoneymouse · 6 points · 6 months ago

      I guess free speech laws protect it? You can draw a picture of someone else nude and it isn’t a violation of the law.

      • @cosmicrookie · 3 points · 6 months ago

        But it’s not. That is not legal.

        I don’t know if it is where you live, but here (a Scandinavian country) and in many other places around the world, it is illegal to create fake nudes of people without their permission.

        • @[email protected] · 5 points · 6 months ago

          Ah, I didn’t know that. AFAIK it’s protected artistic speech in the US. Not to say that it’s right, but that’s probably why it’s still a thing.

          • @[email protected] · 2 points · 6 months ago

            In principle that’s the case in Germany, too, but only if the person is of public interest (otherwise you’re not supposed to publish any pictures of them where they are the focus of the image) and, secondly, it has to serve actually discernible satire, commentary, etc. Merely saying “I’m an artist and that’s art” doesn’t fly; hire a model. It’s similar to how you can dish out a hell of a lot of insults when you’re doing a pointed critique, but if the critique is missing and it’s only abuse, that doesn’t fly.

            Ha. Idea: An AfD politician as a garden gnome peeing into the Bundestag.

        • @TotallynotJessica · 2 points · 6 months ago

          Appreciate how good you have it. In America, child sex abuse material is only illegal when children were abused in making it, or if it’s considered obscene by a community. If someone edits adult actors to look like children as they perform sex acts, it’s not illegal under federal law. If someone generates child nudity using AI models trained on nude adults and only clothed kids, it’s not illegal at the national level.

          Fake porn of real people could be banned for being obscene, usually at a local level, but almost any porn could be banned by lawmakers this way. Harmless stuff like gay or trans porn could be banned by bigoted lawmakers, because obscenity is a fairly subjective mechanism. However, because of our near-absolute freedom of speech, obscenity is basically all we have to regulate malicious porn.

            • @[email protected] · 1 point · 6 months ago

            “child sex abuse material is only illegal when children were abused in making it”

            This is literally why it’s illegal though. Because children are abused, permanently traumatized, or even killed in its making. Not because it disgusts us.

            There are loads of things that make me want to be sick, but unless they actively hurt someone they shouldn’t be illegal.

          • @cosmicrookie · 1 point · 6 months ago (edited)

            The way I believe it is here is that it is illegal to distribute porn or nudes without consent, be it real or fake. I don’t know how it is with AI-generated material of purely imaginary people; I don’t think that is illegal. But if it is made to look like someone in particular, then you can get sued.