The White House wants to ‘cryptographically verify’ videos of Joe Biden so viewers don’t mistake them for AI deepfakes

Biden’s AI advisor Ben Buchanan said a method of clearly verifying White House releases is “in the works.”

  • @[email protected] · 28 months ago

    Honestly I’d say that’s on the way for any video or photographic evidence.

    You’d need a device private key to sign with, and probably internet connectivity to get a timestamp from a third party (rough sketch at the end of this comment).

    Could have lidar included as well, so you can verify the camera isn’t just pointed at a screen playing something fake.

    Is there a cryptographically secure version of GPS too? Not sure if that’s even possible, and it’s the weekend so I’m done thinking.
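
    A minimal sketch of what that on-device signing could look like, assuming an Ed25519 device key, a SHA-256 hash of the captured file, and a hypothetical fetch_trusted_timestamp() standing in for the third-party timestamp source (a real deployment would use something like an RFC 3161 timestamping authority):

    ```python
    import hashlib
    from datetime import datetime, timezone

    # Requires the 'cryptography' package; Ed25519 is just one plausible key type.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def fetch_trusted_timestamp() -> str:
        # Hypothetical stand-in for a third-party timestamping service.
        return datetime.now(timezone.utc).isoformat()

    def sign_capture(device_key: Ed25519PrivateKey, media_bytes: bytes) -> dict:
        # Hash the raw capture, bind it to a timestamp, and sign both together.
        digest = hashlib.sha256(media_bytes).hexdigest()
        timestamp = fetch_trusted_timestamp()
        payload = f"{digest}|{timestamp}".encode()
        return {
            "sha256": digest,
            "timestamp": timestamp,
            "signature": device_key.sign(payload).hex(),
        }

    # The private key would live in the camera's secure hardware; only the public key is published.
    device_key = Ed25519PrivateKey.generate()
    record = sign_capture(device_key, b"raw video bytes here")
    device_key.public_key().verify(
        bytes.fromhex(record["signature"]),
        f"{record['sha256']}|{record['timestamp']}".encode(),
    )
    ```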

    • @[email protected] · 28 months ago

      It’s way more feasible to simply require social media sites to do the verification and display something like a blue check on verified videos.

      This is actually a really good idea. Sure there will still be deepfakes out there, but at least a deepfake that claims to be from a trusted source can be removed relatively easily.

      Theoretically a social media site could boost content that was verified over content that isn’t, but that would require social media sites not to be bad actors, which I don’t have much hope for.
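
      As a rough illustration, and only assuming a trusted source like the White House actually published a verification key, the site’s “blue check” decision could boil down to something like this (names such as TRUSTED_PUBLISHERS are made up for the sketch):

      ```python
      import hashlib

      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

      # Hypothetical registry mapping a claimed source to its published verification key.
      TRUSTED_PUBLISHERS: dict[str, Ed25519PublicKey] = {}

      def is_verified_upload(claimed_source: str, media_bytes: bytes, signature: bytes) -> bool:
          """True only if the attached signature checks out against the claimed source's key."""
          public_key = TRUSTED_PUBLISHERS.get(claimed_source)
          if public_key is None:
              return False  # unknown source: no badge, but not necessarily removed
          digest = hashlib.sha256(media_bytes).digest()
          try:
              public_key.verify(signature, digest)
              return True   # show the "verified" badge, maybe boost it
          except InvalidSignature:
              return False  # claims a trusted source but fails the check: easy takedown
      ```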

      • @kautau · 8 months ago

        I agree that it’s a good idea. But the people most swayed by deepfakes of Biden are definitely the least concerned with whether their bogeyman, the “deep state,” has verified them.

    • Natanael · 8 months ago

      Positioning using distance-bounded challenge-response protocols with multiple beacons is possible, but none of the satellite positioning networks support it. And you still can’t prove the photo was taken at that location, only that somebody was there.
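
      A toy simulation of the distance-bounding idea, with made-up numbers: the verifier times a challenge-response round trip and converts the remaining time into an upper bound on distance via the speed of light. A real protocol (e.g. Brands–Chaum style) would run many rapid single-bit rounds over radio, not Python function calls:

      ```python
      import os
      import hmac
      import hashlib

      SPEED_OF_LIGHT = 299_792_458.0  # metres per second

      def prover_response(shared_key: bytes, challenge: bytes) -> bytes:
          # The prover must answer immediately using a key only it holds.
          return hmac.new(shared_key, challenge, hashlib.sha256).digest()

      def upper_bound_distance_m(round_trip_s: float, processing_s: float) -> float:
          # The signal travels out and back, so halve the remaining time.
          return SPEED_OF_LIGHT * max(round_trip_s - processing_s, 0.0) / 2

      # Toy run: assume a measured 2 µs round trip with 0.5 µs of prover processing.
      shared_key = os.urandom(32)
      challenge = os.urandom(16)
      expected = prover_response(shared_key, challenge)  # what the beacon expects back
      if hmac.compare_digest(expected, prover_response(shared_key, challenge)):
          print(f"Prover is at most {upper_bound_distance_m(2e-6, 0.5e-6):.0f} m away")  # ~225 m
      else:
          print("Reject: wrong response")
      ```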