Anon predicts the future (sh.itjust.works)
[-] racketlauncher831@lemmy.ml 5 points 3 hours ago

Photorealistic porn? What's your problem, man? You have realistic AI and this is all you'll use it for? Just order a silicone doll and put an AI chip in it! Free ~~sex-sla~~ wife!

[-] nebulaone@lemmy.world 12 points 5 hours ago* (last edited 5 hours ago)

Reputation and PGP signatures could be used to verify real, human-made content. That is, of course, if people actually care, which I think will be rare.

There might be no-AI communities that require this and are closed off to avoid being scraped for AI training.

Edit: Also, AI is already enshittifying itself, which might get worse if it becomes more widespread than it already is.

[-] prole@lemmy.blahaj.zone 5 points 2 hours ago

Why couldn't AI use PGP signatures that suggest they're human?

[-] PieMePlenty@lemmy.world 2 points 16 minutes ago* (last edited 14 minutes ago)

Trust is the most important part. You trust that someone made something themselves. They digitally sign their work with a key pair whose public key is known to be theirs, and you can then verify that they (the person you trust) made it.
Once a trusted creator's key is leaked, they can no longer be trusted for future works.
AI-made content can be freely signed as well, but if you don't trust the origin, the signature doesn't matter anyway; it will just verify that the content is coming from the AI's operator.
The key thing is trust; the signature is just there to verify.
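
To make the mechanics concrete, here's a minimal sketch of that sign-then-verify flow using Python's `cryptography` package (the key, the work bytes, and the messages are made up for illustration). Note what the check actually proves: only that the holder of the private key signed these exact bytes, not that a human made them.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The creator generates a key pair once and publishes the public half.
creator_key = Ed25519PrivateKey.generate()
creator_public_key = creator_key.public_key()

work = b"bytes of the creator's hand-made work"
signature = creator_key.sign(work)

# Anyone holding the public key can check the signature.
try:
    creator_public_key.verify(signature, work)
    print("Valid: whoever holds the private key signed exactly these bytes.")
except InvalidSignature:
    print("Invalid: the work was altered or signed with a different key.")
```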

[-] groet@feddit.org 4 points 2 hours ago

It's not about just having a signature. It's about a web of trust. It only works if you verify that the key belongs to a creator who is actually a person.

Basically, creators go to a convention, hand out their public keys in person, and have other creators sign their keys. If you trust that creator A is real and they signed the key of creator B, you can have some trust that B is also real. And if your buddy went to the convention, met A and B, got their public keys, and tells you they are real, you can also trust they are real. The more steps/signatures you are away from a creator, the less trustworthy they are, and nothing really ensures a (human) creator doesn't use AI secretly. If somebody is found to be a fraud, everyone has to distrust their key.
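
To make the "more hops, less trust" idea concrete, here's a toy Python sketch (this is not how GnuPG actually scores trust; the graph, names, and decay factor are invented for illustration) where confidence halves with every signature hop away from a key you verified yourself:

```python
from collections import deque

# Toy web of trust: who has signed whose key (illustrative creators A-E).
signed_by = {
    "A": ["B", "C"],  # A met B and C in person and signed their keys
    "B": ["D"],
    "C": ["D", "E"],
}

def trust_scores(start: str, decay: float = 0.5) -> dict[str, float]:
    """Walk outward from a key you verified yourself; each hop multiplies trust by `decay`."""
    scores = {start: 1.0}
    queue = deque([start])
    while queue:
        holder = queue.popleft()
        for signee in signed_by.get(holder, []):
            candidate = scores[holder] * decay
            if candidate > scores.get(signee, 0.0):
                scores[signee] = candidate
                queue.append(signee)
    return scores

print(trust_scores("A"))  # {'A': 1.0, 'B': 0.5, 'C': 0.5, 'D': 0.25, 'E': 0.25}
```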

[-] wowwoweowza@lemmy.world 5 points 5 hours ago

Read classics:

"Pride and Prejudice" by Jane Austen, "Moby Dick" by Herman Melville, "The Great Gatsby" by F. Scott Fitzgerald, and "1984" by George Orwell.

Start here. There are thousands.

[-] silasmariner@programming.dev 4 points 3 hours ago

Why start there, with British and US authors? Why not One Hundred Years of Solitude, Disgrace, and A Dream of Red Mansions?

[-] phoenixz@lemmy.ca 1 points 26 minutes ago

Because that's what he came up with. It's fine as a starting point. Your selection is a fine starting point too.

[-] wowwoweowza@lemmy.world 1 points 28 minutes ago

For the same reason your user name is not buendiablo?

I guess we can all suffer a little eurocentrism from time to time? But yes — enrich the list with international voices! One of my favorite novels is THE PONDS OF WAGABA by Elichi Amadi… a little known gem any fan of George Eliot would love.

[-] alligalli@feddit.org 6 points 6 hours ago

Time to get up and go outside :)

in the future we will be fighting the terminators, shotgun John Connor

[-] Dogyote@slrpnk.net 7 points 11 hours ago

Sometimes I think the future will resemble the pre-internet era. AI content will be so easy to create that the zone will be flooded with shit, and only a few reputable sources will be trusted, like when there were only a few TV news channels.

[-] StarlightDust@lemmy.blahaj.zone 5 points 11 hours ago

That porn had to be trained on real people's bodies, and those people will never see a penny of it. It's laundered revenge porn.

[-] Rossphorus@lemmy.world 14 points 17 hours ago

Video evidence is relatively easy to fix: you just need camera ICs to cryptographically sign their outputs. If the image/video is tampered with (or even re-encoded), the signature won't match. As the private key is (hopefully!) stored securely in the hardware IC taking the photo/video, generated images or videos can't be signed with it.
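
As a rough illustration of why any re-encode breaks the check, here's a hedged Python sketch (the key, the frame bytes, and the "re-encode" are all stand-ins; a real camera would do the signing inside a secure element and a real design would sign per-frame hashes or a whole container):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-in for a key provisioned inside the camera's signing IC.
camera_key = Ed25519PrivateKey.generate()
camera_public_key = camera_key.public_key()  # what verifiers would use

original = b"raw encoded frame bytes straight off the sensor pipeline"
signature = camera_key.sign(original)  # would happen inside the camera hardware

reencoded = original + b" (recompressed by an editor)"  # any edit or transcode changes the bytes

for label, data in (("original", original), ("re-encoded", reencoded)):
    try:
        camera_public_key.verify(signature, data)
        print(f"{label}: signature valid, bytes are exactly what the camera signed")
    except InvalidSignature:
        print(f"{label}: signature invalid, bytes differ from what was signed")
```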

[-] IlovePizza@lemmy.world 1 points 2 hours ago

Wouldn't this be as easy to break as pointing a camera at a screen playing whatever you want?

Perhaps not with light field cameras. But then you could probably tamper with the hardware somehow.

[-] topherclay@lemmy.world 10 points 17 hours ago

So however the camera output is being signed, what's stopping you from signing an altered video with your own private key and then saying "you can all trust that my video is real because I have the private key for it"?

The doubters will have to concede that the video did indeed come from you because it pairs with your key, but why would anyone trust that the key came from the camera step instead of coming from the editing step?

[-] Taleya@aussie.zone 6 points 13 hours ago

Mate, digital cinema uses this encryption/decryption method for KDMs.

The keys are tied to multiple physical hardware IDs, many of which (such as the player/projector) are also married cryptographically. Any deviation along a massive chain and you get no content.

Those playback keys are produced from DKDMs that are insanely tightly controlled. The DKDM production itself even more so.

And that's just to play a movie. This is proven tech, decades old. You're not gonna break it with Premiere.
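
This is not how real DCI KDMs are structured, but here is a toy Python sketch of the "key tied to a chain of hardware identities" idea (all device IDs and labels are invented): derive the content key from every device identity in the chain, so swapping any one device yields a key that simply fails to decrypt.

```python
import os

from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def chain_key(hardware_ids: list[bytes]) -> bytes:
    """Derive a 256-bit content key from the full chain of hardware identities."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"toy-hardware-chain-demo").derive(b"|".join(hardware_ids))

trusted_chain = [b"player-serial-0001", b"projector-serial-0002"]
nonce = os.urandom(12)
ciphertext = AESGCM(chain_key(trusted_chain)).encrypt(nonce, b"feature presentation", None)

# Same chain of devices: content decrypts.
print(AESGCM(chain_key(trusted_chain)).decrypt(nonce, ciphertext, None))

# One device swapped out: the derived key differs and decryption fails outright.
try:
    rogue_chain = [b"player-serial-0001", b"projector-serial-9999"]
    AESGCM(chain_key(rogue_chain)).decrypt(nonce, ciphertext, None)
except InvalidTag:
    print("Different hardware chain -> no content.")
```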

[-] tweeks@feddit.nl 3 points 7 hours ago

But how would one ordinary member of the audience easily determine whether this whole chain of trust is valid, when they don't even understand how it works or what to look out for?

You'd need a set of public keys from trusted sources that browsers check automatically, but all the steps in between need to be trusted too. I can imagine it's too much of a hassle for most.

But then again, that has always been the case for most.

[-] Taleya@aussie.zone 2 points 7 hours ago

...what audience?

[-] Valmond@lemmy.world 1 points 5 hours ago

This is for restricting use, not proving the authenticity of a video recording. Anyone can spin up keys and sign videos, so in a legal battle it would be worthless.

[-] Taleya@aussie.zone 2 points 5 hours ago* (last edited 1 hour ago)

The technology would be extremely easy to adapt, with the certs being tied to the original recording equipment's hardware. Given that I don't see a $60 IP cam having a Dolphin board, it would probably be relegated to much higher-end equipment, but any modification with a new key would break the chain of veracity.

[-] Valmond@lemmy.world 1 points 4 hours ago

This is blatantly not true; it would be extremely simple to circumvent. How do you "tie" the cert to specific hardware without trusting the manufacturer? You just can't. It's like putting a padlock on a pizza box.

[-] Taleya@aussie.zone 1 points 1 hour ago

I literally explained earlier how this exact technology is used in digital cinema, dude, c'mon.

[-] Valmond@lemmy.world 1 points 37 minutes ago* (last edited 36 minutes ago)

That doesn't mean it's useful for forensics, IMO.

Edit: not saying it won't be, though, just that it's not as bulletproof as you'd think, IMO.

[-] sugar_in_your_tea@sh.itjust.works 7 points 16 hours ago

You can enter the camera as evidence and prove that it has been used for other footage. Each camera should have a unique key to be effective.

So if you create a new key, it won't match the one on an existing camera. If you steal the key, then once that's discovered, the camera should generate a new one.

[-] tweeks@feddit.nl 1 points 7 hours ago

But if you don't actually check the physical camera and verify that key for yourself, then it can easily be faked: just generate a key that doesn't come from any camera and use it to sign both the "proof" video and the fake video.

[-] sugar_in_your_tea@sh.itjust.works 1 points 6 hours ago

Any self-respecting judge would check, and hopefully most journalists would keep records of these things to prove where the footage came from.

[-] Rossphorus@lemmy.world 4 points 16 hours ago

You, the end user, don't have access to your camera's private key. Only the camera IC does. When your phone / SD card first receives the image/video it's already been signed by the hardware.

[-] SethTaylor@lemmy.world 20 points 23 hours ago

Actually, polls show that most people are not fond of AI-generated content and want it to be labelled or don't want it at all.

As for generating your own entertainment at home, see interactive movies. They did not take off because people don't want to be "working" for their entertainment. That's their time to relax and not make decisions.

All in all, we're not as careless as it may seem.

[-] Taleya@aussie.zone 6 points 13 hours ago

A FB group I moderate recently had an AI jammed into it. I ran a poll on whether to keep it or disable it. "Get rid of it" got more votes than the option "Put a gimp mask on it and whore it out for grapefruit".

[-] ICastFist@programming.dev 6 points 21 hours ago

Not to mention those interactive-movie games from the early 90s that also didn't take off, because they were sorely lacking in the game department.

[-] nailbar@sopuli.xyz 48 points 1 day ago* (last edited 1 day ago)

I wonder if personal websites with links to each other, like in the olden days, will start growing in popularity again, given how trust is slowly eroding for anything not in your direct control and search engines are becoming more and more useless 🤔

[-] laranis@lemmy.zip 10 points 21 hours ago

But, but, how will we monetize it? How!?

/s

I long for the early 2k internet. So much potential positivity for humanity.

[-] IndiBrony@lemmy.world 45 points 1 day ago* (last edited 1 day ago)

I think this vastly overestimates the average person's ability to recognise or even care to recognise what is AI and what is not.

You've got all those videos on Facebook which are BLATANTLY AI and the comment section is split between "wow, amazing!" and "it's AI you fucking morons"

The latter will eventually leave the platform and the former will be all that's left.

[-] codexarcanum@lemmy.dbzer0.com 1 points 46 minutes ago

You never know how many of those comments aren't bots and AI too now. The malleable human mind sees "people" expressing opinions and wants to take a side, have an opinion. How convenient that all the options, all the feelings and responses you should have, are already laid out for you. Just "Like and Subscribe" to the person whose opinions most align with your own.

[-] Valmond@lemmy.world 3 points 5 hours ago

Then again, people jack off to manga (and more) knowing it's not real, and see through the computer-generated images in Star Trek but still love it. I think you underestimate how malleable the human mind is when we want it to be!

[-] BigBananaDealer@lemm.ee 137 points 1 day ago

in fact, this greentext was made purely by asking ChatGPT what AI will look like in 10 years

[-] elvis_depresley@sh.itjust.works 23 points 1 day ago

Part of the fun of watching stuff isn't that it's "customised to me"; it's sharing an experience with the creator(s), friends, family, etc.

I see genAI being used as a tool for creators but not as an automation of content creation.

[-] rtxn@lemmy.world 3 points 14 hours ago

AI would be chronically incapable of implementing actually surprising plot twists that are both unexpected and consistent with the rest of the plot (and don't somehow bring someone back into existence). If it hadn't been written before, an AI would never make Darth Vader be Luke's father unless specifically prompted, at which point, why even.

(I've just finished a hexalogy marathon, my head is full of jedi.)
