submitted 6 hours ago by kokesh@lemmy.world to c/funny@sh.itjust.works
[-] Elaine@lemmy.world 8 points 3 hours ago
[-] sp3ctr4l@lemmy.dbzer0.com 8 points 3 hours ago* (last edited 3 hours ago)

I updated my comment with a lot more info.

For the record, I was a stupid kid who primarily wanted to talk about video games ... I noped the fuck out pretty hard after I realized how truly deranged most of the people/content on 4chan was.

... and obviously after the uh, FBI cooperation message.

[-] sp3ctr4l@lemmy.dbzer0.com 4 points 2 hours ago* (last edited 2 hours ago)

Sorry to double post but this is potentially relevant:

So, yeah, the FBI maintains a large database of CSAM, for their... CSAM content ID style system.

People who know that that exists have occasionally pointed out that... that is kind of weird, let's just put it that way.

Well, I've just had a horrifying realization.

We currently have AI/LLM corpos on the record going to places like AnnasArchive to acquire their vast troves of data to use in training an LLM.

... If somebody hooks up this FBI CSAM database... into an LLM... well, you now basically have a machine that produces CSAM, and likely also gore/torture videos.

... hooray ...

Like uh, Pete Hegseth just ... you know, hooked up Grok/XAI into... apparently I guess all of the US's military systems.

The scenario I am describing is unfortunately plausible.

[-] JPAKx4@piefed.blahaj.zone 2 points 1 hour ago

Good news: that's not how the CSAM database works. What happens is the image is hashed (for non-technical people, hashing is a way of always generating the same "ID" from the file's data) and the image itself is discarded. Now, the article I found to verify my claim also talks about how the FBI used to distribute CSAM to catch people, so they probably do have some... but not a database-wide amount.
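To illustrate the idea (this is my own sketch, not how the FBI's actual system works internally): you keep only a digest of each file, throw the file away, and later check new uploads against the stored digests. Here I'm using SHA-256 for exact-match hashing; real content-ID systems like PhotoDNA use perceptual hashes so slightly altered copies still match.

```python
# Sketch of hash-based content matching: store only the digest,
# discard the file, flag future files whose digest matches.
import hashlib

def file_id(data: bytes) -> str:
    """Return a stable hex 'ID' derived from the file's bytes."""
    return hashlib.sha256(data).hexdigest()

# The database holds only IDs, never the original images.
# (The file contents here are obviously made-up placeholders.)
known_ids = {file_id(b"example-known-file")}

def is_known(data: bytes) -> bool:
    """Check a new file against the stored IDs."""
    return file_id(data) in known_ids

print(is_known(b"example-known-file"))  # exact same bytes -> True
print(is_known(b"slightly-edited-file"))  # any change -> False
```

Note the limitation: with a plain cryptographic hash, changing a single byte produces a completely different ID, which is exactly why production systems lean on perceptual hashing instead.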

https://cybernews.com/editorial/war-on-child-exploitation/

[-] sp3ctr4l@lemmy.dbzer0.com 1 points 1 hour ago

Well... I'm going to hope you are more right than wrong and that they do not actually have a massive CSAM database.

[-] ZombiFrancis@sh.itjust.works 2 points 2 hours ago

You may have missed how there's a massive CSAM problem with AI because the models are all trained on CSAM. "Plausible" is a bit behind the times.

this post was submitted on 06 Feb 2026
183 points (98.4% liked)
