‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @cosmicrookie
    1 year ago

    I am pretty sure that possession is not illegal but that distribution without consent is. The idea is that someone may have sent you their nude, but you’d get charged if you share it with others.

    There was a huge case here where over 1000 teens were charged with distributing child porn because of a video of some other teens having sex that circulated among them. Basically, someone filmed a young couple having sex at a party, I believe, and the video got shared on Facebook Messenger. Of the more than 1000 teens charged, I believe around 800 were either fined or jailed.

    Here’s an article you may be able to run through Google Translate:

    https://jyllands-posten.dk/indland/ECE13439654/naesten-500-doemt-for-boerneporno-i-kaempe-sag-om-unges-deling/

    • Encrypt-Keeper
      1 year ago

      In some states, electronically distributing nude content of anyone, including yourself, even with consent, is illegal. Which sounds insane, because it is. It’s one of those weird legacy laws that almost never gets enforced, for obvious reasons, but I actually know a guy who was arrested for it because he got on the wrong side of some police and it was the only law they could find that he “broke”.