• 𝒍𝒆𝒎𝒂𝒏𝒏
      5 points · 1 year ago

      It might be resistant to screenshots - unless I missed it, the article didn’t clarify whether the obfuscation process is applied to the image on a per-pixel basis, or within the file format itself…

      If it were that easy to bypass, it would be a pretty futile mechanism IMO; one would just need to convert the image to strip out the obfuscation 🫠 or take a screenshot, as you said
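      A plain format conversion is actually a quick way to tell which kind it is (hypothetical file name; BMP is just a stand-in for any lossless re-encode): the container and metadata get dropped, but the decoded pixels come back unchanged, so a file-format-level mark would be stripped while a per-pixel one would survive.

      ```python
      import io

      import numpy as np
      from PIL import Image

      # "protected.png" is a made-up stand-in for an image with the protection applied.
      original = Image.open("protected.png").convert("RGB")

      # Round-trip through a different, lossless container.
      buf = io.BytesIO()
      original.save(buf, format="BMP")
      buf.seek(0)
      converted = Image.open(buf).convert("RGB")

      # Identical pixels mean the conversion only touched the container/metadata,
      # so anything baked into the pixels themselves is still there.
      print("pixels identical after conversion:",
            np.array_equal(np.asarray(original), np.asarray(converted)))
      ```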

      • @SheeEttin
        19 points · 1 year ago

        Sounds like it's tiny changes to the image data designed to trick the model. But it also sounds specific to each algorithm, so while you might trick Stable Diffusion, another model like Midjourney would be unaffected.

        And either way, I'd bet mere JPEG compression would be enough to destroy your tiny changes.
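        A rough way to measure that (file name and noise scale are made up) is to add a tiny perturbation, run it through a JPEG round trip, and project whatever is left onto the original perturbation:

        ```python
        import io

        import numpy as np
        from PIL import Image

        # "clean.png" and the +/-2 noise are made-up stand-ins for a real protected image.
        clean = np.asarray(Image.open("clean.png").convert("RGB"), dtype=np.int16)
        noise = np.random.randint(-2, 3, clean.shape, dtype=np.int16)
        perturbed = np.clip(clean + noise, 0, 255).astype(np.uint8)

        # JPEG round trip in memory.
        buf = io.BytesIO()
        Image.fromarray(perturbed).save(buf, format="JPEG", quality=85)
        buf.seek(0)
        roundtrip = np.asarray(Image.open(buf).convert("RGB"), dtype=np.int16)

        # ~1.0 means the perturbation survived compression, ~0.0 means it's gone.
        residual = (roundtrip - clean).astype(np.float64)
        surviving = (residual * noise).sum() / (noise ** 2).sum()
        print(f"fraction of the perturbation surviving JPEG: {surviving:.2f}")
        ```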

        • @esadatari
          2 points · 1 year ago

          A couple of minutes in Photoshop with a smudge or burn tool would also negate all the effects.
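          Something as blunt as a median filter does the same kind of damage to pixel-level tweaks; a minimal sketch, with made-up file names:

          ```python
          from PIL import Image, ImageFilter

          # "perturbed.png" is hypothetical; a 3x3 median filter is a crude stand-in
          # for a quick smudge pass, wiping out fine per-pixel noise.
          img = Image.open("perturbed.png")
          img.filter(ImageFilter.MedianFilter(size=3)).save("smudged.png")
          ```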

      • @diffuselight
        2 points · 1 year ago

        These things never work in the real world. We've seen this over and over; it's snake oil. Latent-space perturbations may survive compression, but they don't transfer across encoders.
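        The cross-encoder part is easy to poke at even without a diffusion stack. A hedged sketch, using two torchvision classifiers purely as stand-ins for different encoders: a one-step perturbation crafted against one barely moves the other.

        ```python
        import torch
        import torch.nn.functional as F
        from torchvision import models

        # Analogy only: two ImageNet classifiers stand in for two different image
        # encoders. A perturbation is crafted against model A, then measured on both.
        model_a = models.resnet18(weights="IMAGENET1K_V1").eval()
        model_b = models.vgg16(weights="IMAGENET1K_V1").eval()

        x = torch.rand(1, 3, 224, 224, requires_grad=True)  # stand-in image in [0, 1]

        # One FGSM step against model A only.
        label = model_a(x).argmax(dim=1)
        F.cross_entropy(model_a(x), label).backward()
        x_adv = (x + (4 / 255) * x.grad.sign()).clamp(0, 1).detach()

        def output_shift(model, clean, adv):
            # How far the model's output moves because of the perturbation.
            with torch.no_grad():
                return (model(adv) - model(clean)).norm().item()

        print("shift on model A (attacked):   ", output_shift(model_a, x.detach(), x_adv))
        print("shift on model B (not attacked):", output_shift(model_b, x.detach(), x_adv))
        ```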

    • @dan1101
      2 points · 1 year ago

      Yeah, it might work in the original format under some conditions, but it won't survive a screenshot or being saved to another format.

  • @[email protected]
    1 point · 1 year ago

    The linked white paper's title is very pragmatic-sounding: "Raising the Cost of Malicious AI-Powered Image Editing". I'd like to read it more closely later to see what mechanisms are actually deployed. I've considered some form of attestation, embedded both in the image data and in the file format and linked with a cryptographic signature. You know, for important things like politics, diplomacy, and celebrity endorsements. /s
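    A minimal sketch of the non-sarcastic version, assuming a PNG input and an Ed25519 key (file names are made up): sign the decoded pixels rather than the raw file bytes, and stash the signature in a metadata chunk.

    ```python
    from cryptography.hazmat.primitives.asymmetric import ed25519
    from PIL import Image, PngImagePlugin

    key = ed25519.Ed25519PrivateKey.generate()  # stand-in signing key

    # Sign the decoded pixels, not the file, so the attestation is about the
    # image content rather than one particular container.
    img = Image.open("photo.png").convert("RGB")  # hypothetical input
    signature = key.sign(img.tobytes())

    meta = PngImagePlugin.PngInfo()
    meta.add_text("attestation-sig", signature.hex())
    img.save("photo_signed.png", pnginfo=meta)

    # Verification: re-decode the pixels and check them against the stored signature.
    signed = Image.open("photo_signed.png")
    key.public_key().verify(bytes.fromhex(signed.text["attestation-sig"]),
                            signed.convert("RGB").tobytes())
    print("signature verified")
    ```

    Of course that only proves who signed those exact pixels; any edit (or a screenshot) breaks the signature rather than being prevented.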