29 points · submitted on 10 Oct 2025 by silence7@slrpnk.net to c/technology@lemmy.world
[-] lightsblinken@lemmy.world 0 points 3 months ago

videos need to be cryptographically signed and able to be verified. all news outlets should do this.
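
A minimal sketch of what that could look like, using Ed25519 detached signatures via Python's `cryptography` package. The key and the video bytes below are placeholders, not any outlet's actual setup:

```python
# Minimal sketch, not any outlet's real pipeline: sign the raw bytes of a video
# with an Ed25519 key and verify them later. Requires the `cryptography` package.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()   # the outlet keeps this secret
public_key = private_key.public_key()        # the outlet publishes this

video_bytes = b"stand-in for the raw bytes of a video file"
signature = private_key.sign(video_bytes)    # distributed alongside the video

try:
    public_key.verify(signature, video_bytes)   # raises if the bytes were altered
    print("valid: this file is exactly what was signed")
except InvalidSignature:
    print("invalid: this file differs from the signed original")
```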

[-] captain_aggravated@sh.itjust.works 2 points 3 months ago

Cryptographic signatures are something we should have been normalizing for a while now.

I remember during the LTT Linux challenge, at one point they were assigned the task "sign a PDF." Linus interpreted this as PGP-signing the document, which apparently Okular can do, but he didn't have any credentials set up. Luke used some online tool to photoshop an image of his handwriting into the document.

[-] freedom@lemy.lol 1 points 3 months ago

Such a simple idea, now you just designate authorized vendors and keep the signing keys secure forever.

While you’re at it, make sure the chip itself can’t be reverse engineered and run in a VM.

It's been done before with printer fingerprinting and its unique yellow dot patterns. I think the key there is purchase dates that can be tied to locations and identities. But now you're into Big Brother territory.

What you need instead is a public ephemeral Bloom filter with plausible deniability built in.

We allow everything to sign; the black box only overlays Bloom filter bits. Keep some longer-lived biotokens around to extra-harden it between manufacturers.

Now you're in the real world of journalism. Camera equipment is disposable, so just rotate the key on each unit build and load it in a FIPS wipe cipher to be extra sure.

Now you need that central Bloom filter database with some "stewards" as trusted hosts. No CA shenanigans anymore.

Picture output gets steganographically encoded for authenticity and verified against the real lens by IR laser. It can be traced if need be, but that's probably unnecessary.
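
For reference, the Bloom filter itself is a simple structure. The toy below only illustrates the set-membership idea the comment invokes, with made-up fingerprint values, not the full ephemeral/plausible-deniability scheme:

```python
# Toy Bloom filter, only to show the set-membership structure mentioned above;
# the fingerprint values are hypothetical.
import hashlib

class BloomFilter:
    def __init__(self, size_bits: int = 1024, num_hashes: int = 3):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: bytes):
        # derive k bit positions from k salted hashes of the item
        for i in range(self.num_hashes):
            digest = hashlib.sha256(i.to_bytes(4, "big") + item).digest()
            yield int.from_bytes(digest, "big") % self.size

    def add(self, item: bytes) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def probably_contains(self, item: bytes) -> bool:
        # false positives are possible, false negatives are not --
        # which is where the "plausible deniability" framing comes from
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter()
bf.add(b"fingerprint-of-a-signed-video")                       # hypothetical value
print(bf.probably_contains(b"fingerprint-of-a-signed-video"))  # True
print(bf.probably_contains(b"some-unrelated-video"))           # almost certainly False
```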

[-] danhab99@programming.dev 0 points 3 months ago

NFTs already tried to solve this problem, and it didn't work. You can change the hash/sig of a video file just by changing one pixel on one frame, meaning you've only tricked the computer, not the people who use it.

[-] lightsblinken@lemmy.world 2 points 3 months ago

so try again? also: if a pixel changes then it isn't the original source video, by definition. being able to determine that it has been altered is entirely the point.

[-] Kissaki@feddit.org 0 points 3 months ago

By changing one pixel it's no longer signed by the original author. What are you trying to say?

[-] danhab99@programming.dev 1 points 3 months ago

Exactly that, if I change a pixel then the cryptographic signature breaks
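
That property is easy to demonstrate with a quick hash comparison (the bytes below are just a stand-in for video data):

```python
# Flip one byte (one "pixel") and the digest -- and therefore any signature
# computed over those bytes -- no longer matches.
import hashlib

original = bytearray(b"\x00" * 1024)   # stand-in for video data
tampered = bytearray(original)
tampered[500] ^= 0x01                  # change a single bit

print(hashlib.sha256(bytes(original)).hexdigest())
print(hashlib.sha256(bytes(tampered)).hexdigest())  # completely different digest
```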

[-] panda_abyss@lemmy.ca 0 points 3 months ago

That’s not really feasible without phones doing this automatically.

Even then, didn't the first Trump admin already argue that iPhone video can't be trusted because it's modified with AI filters?

[-] lightsblinken@lemmy.world 1 points 3 months ago

... so make the phones do it?

i mean, it's not rocket surgery.

[-] TheBlackLounge@lemmy.zip 0 points 3 months ago

Sign every video automatically? Sounds like Chat Control all over again.

Also, I could just generate a video on my computer and film it with my phone. Now it's signed, even has phone artifacts for added realism.

[-] lightsblinken@lemmy.world 2 points 3 months ago

i think the point is to be able to say "this video was released by X, and X signed it, so they must have released it, and you can verify that yourself". it means if you see a CNN logo and it's signed by CNN, then you know for sure that CNN released it. as a news organisation they should have their own due diligence about sources etc, but they can at least be held to account at that point. versus a random AI-generated video with a fake logo and fake attribution going viral and not being discredited in time before it becomes "truth".
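
A rough sketch of that verification step, assuming outlets publish their public keys. The key, outlet name, and clip bytes here are illustrative only:

```python
# Sketch of the "did CNN actually sign this?" check described above.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

cnn_private = Ed25519PrivateKey.generate()           # held by the outlet
publisher_keys = {"CNN": cnn_private.public_key()}   # in reality, fetched from the outlet

video_bytes = b"stand-in for a news clip"
signature = cnn_private.sign(video_bytes)             # shipped alongside the clip

def attributed_publisher(video: bytes, sig: bytes):
    """Return the name of the outlet whose key verifies this signature, if any."""
    for name, key in publisher_keys.items():
        try:
            key.verify(sig, video)
            return name        # this outlet really did sign these exact bytes
        except InvalidSignature:
            continue
    return None                # the logo/attribution on the video can't be trusted

print(attributed_publisher(video_bytes, signature))   # CNN
```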
