submitted 8 months ago by L4s@lemmy.world to c/technology@lemmy.world

The White House wants to 'cryptographically verify' videos of Joe Biden so viewers don't mistake them for AI deepfakes::Biden's AI advisor Ben Buchanan said a method of clearly verifying White House releases is "in the works."

[-] recapitated@lemmy.world 7 points 8 months ago

It's a good idea. And I hope to see more of this in other types of communications.

[-] FrostKing@lemmy.world 5 points 8 months ago

Can someone try to explain, relatively simply, what cryptographic verification actually entails? I've never really looked into it.

[-] abhibeckert@lemmy.world 4 points 8 months ago* (last edited 8 months ago)

Click the padlock in your browser, and you'll be able to see that this webpage (if you're using lemmy.world) was served over an encrypted connection by a server that Google Trust Services has verified is controlled by lemmy.world. In addition, your browser will remember that... and if it later gets a page from the same site served by a server verified by a different provider, the browser (should) flag that and warn you something might be wrong.

The idea is you'll be able to view metadata on an image and see that it comes from a source that has been verified by a third party such as Google Trust Services.

How it works, mathematically... well, look up "asymmetric cryptography and hashing". It gets pretty complicated and there are a few different mathematical approaches. Basically though, the White House will have a private key that they will not share with anyone, and only that key can be used to sign the metadata. Even Google Trust Services (or whatever cloud provider you use) does not have the key.
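To make the "asymmetric cryptography and hashing" part concrete, here's a toy sketch using textbook RSA with deliberately tiny, insecure numbers (real signatures use vetted libraries and much larger keys; the key values and function names below are just for illustration, not anything from the article):

```python
import hashlib

# Toy RSA key pair (textbook numbers, NOT secure): n = 61 * 53,
# public exponent e, private exponent d with e*d ≡ 1 (mod lcm(60, 52)).
n, e, d = 3233, 17, 2753

def sign(message: bytes) -> int:
    # Hash the message, reduce into range, then apply the PRIVATE key.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the PUBLIC key (n, e) can check the signature.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

msg = b"Official White House release"
sig = sign(msg)          # only the private-key holder can produce this
assert verify(msg, sig)  # anyone can check it against the public key
```

The point is the asymmetry: verifying needs only the public half of the key, so altering even one byte of the signed content makes the check fail, and nobody without the private key can forge a passing signature.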

There's been a lot of effort to detect fake images, but that's really never going to work reliably. Proving an image is valid, however... that can be done with pretty good reliability. An attack would be at home on Mission Impossible. Maybe you'd break into a White House photographer's home at night, put their finger on the fingerprint scanner of their laptop without waking them, then use their laptop to sign the fake photo... delete all traces of evidence and GTFO. Oh, and everyone would know which photographer supposedly took the photo; ask them how they took that photo of Biden acting out of character, and the real photographer will immediately say they didn't take it.

[-] Darkassassin07@lemmy.ca 4 points 8 months ago

I'm more interested in how exactly you'd implement something like this.

It's not like videos viewed on TikTok display a hash for the file you're watching; and users wouldn't look at that data anyway, especially those who would be swayed by a deepfake...
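Presumably the check would have to happen silently in the player or app, with the user only seeing a badge, much like the TLS padlock. The raw step the client would run is just hashing the file and comparing against the signed metadata. A minimal sketch of that hashing step (function name is my own invention, not any platform's API):

```python
import hashlib

def file_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-256 of a file, read in chunks so large videos fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

The app would compute this digest, verify it against the signature embedded in the video's metadata, and render a "verified" indicator; the user never sees the hex string.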

this post was submitted on 11 Feb 2024
643 points (97.9% liked)
