THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes: sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress’ upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.) in the Senate, and by Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • @TheEighthDoctor · 2 months ago

    So if I make AI porn of a celebrity but give her a face tattoo saying “AI generated,” then it’s legal?

    • @[email protected]
      link
      fedilink
      English
      72 months ago

      Doubt it; a reasonable person will generally be able to tell if you’re obviously taking the piss with the law. Feel free to try it and let us know how you get on, though.

      • @TheEighthDoctor · 2 months ago

        But that is not what the bill says. The reasonable person is not evaluating my intent; they are evaluating whether the video is “indistinguishable from an authentic visual depiction of the individual.” In this case it would be very distinguishable, since the individual does not have said face tattoo.

        • @AstridWipenaugh · 2 months ago

          How does your legal team compare to Scarlett Johansson’s? There’s your answer as to where the line is.

      • @[email protected]
        link
        fedilink
        English
        42 months ago

        When does parody/fair use come into play? If it’s a caricature of the person, is that okay?

        • Flying Squid · 2 months ago

          Defamation is not parody. Fake porn of someone is absolutely defamation.

          I can’t legally make a “parody” of you where you’re a pedophile.

          Edit: Since there seems to be some confusion, I am not calling them a pedophile; I’m saying I can’t make some sort of fake of them as a pedophile and call it a parody.

            • Flying Squid · 2 months ago

              I’m literally doing the opposite of calling you a pedophile. I’m saying it would be illegal to call you a pedophile and claim it’s a parody. That’s not an excuse for defamation.

              And I said that because I am assuming you are not a pedophile.

              I’m not sure why you didn’t get that.

              • @ticho · 2 months ago

                A bit of unfortunate wording there. :) I had to go back and reread it slowly in order to understand what you meant.

    • @[email protected]
      link
      fedilink
      English
      42 months ago

      Ironically, the face tattoo might convince some people it’s real, since AI image generators have a well-known problem with rendering coherent text.

    • @[email protected]
      link
      fedilink
      12 months ago

      I don’t think that’s what it means.

      “Authentic” might refer to provenance.

      If someone authorises me to make a pornographic depiction of them, surely that’s not illegal. It’s authentic.

      So it’s not a question of whether the depiction appears to be AI-generated; it’s really about whether a reasonable person would conclude that the image is a depiction of a specific person.

      That means tattoos, extra limbs, third boobs, et cetera won’t sidestep this law.