A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

  • @[email protected]
    link
    fedilink
    English
    261 year ago

    The actual scary use case for AI porn is that if you can get 50 or more photos of the same person’s face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make “generic looking” porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush’s face onto the Playboy centerfold, only with automated distribution over the Internet…

        • @[email protected]
          link
          fedilink
          English
          71 year ago

          People will have to learn to stop believing everything they see. This has been possible with Photoshop for more than a decade now. All that’s changed is that it takes less skill and time.

        • @Silinde · 4 points · edited · 1 year ago

          Because that’s called libel, and it’s very much illegal in practically any country on earth. Depending on the country, it’s either easy or trivial to bring and win a libel case, since the onus is on the defendant to prove that what they said was entirely true, and “just trust me and this actress I hired, bro” doesn’t cut it.

            • @Silinde · 3 points · 1 year ago

              The burden of liability will then fall on the media company, which can then be sued for not carrying out due diligence in its reporting.

        • Liz · 4 points · 1 year ago

          We’re going to go back to the old model of trust, before videos and photos existed. Consistent, coherent stories from sources known to be trustworthy will be key. Physical evidence will be helpful as well.