
There's another round of CSAM attacks and it's really disturbing to see those images. They weren't taken down immediately, and there was even a disgusting shithead in the comments who thought it was funny?? the fuck

It's gone now, but it was up for like an hour?? This really ruined my day and now I'm figuring out how to download tetris to take my mind off it. It's really sickening.

[-] fubo@lemmy.world 6 points 1 year ago

Even without the issue of new AI-generated images, those hash-based scanning tools aren't available to hobbyist projects like the typical Lemmy instance. If they were given to hobbyist projects, it would be really easy for an abuser to just tweak their image collection until it didn't set off the filter.
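Hash-based scanning works by comparing a perceptual hash of each uploaded image against a database of hashes of known material, matching within a small Hamming distance so that re-encodes and minor crops still hit. A minimal sketch of one such scheme (dHash, in pure Python on made-up pixel grids; names and data are illustrative, and real pipelines use an image library for resizing) shows why handing out the hash database would be dangerous: an abuser could keep perturbing an image and re-hashing until the distance exceeds the match threshold.

```python
def dhash(pixels, hash_size=8):
    """Compute a difference hash from a (hash_size+1) x hash_size
    grayscale grid. `pixels` stands in for an image already resized
    to hash_size rows of hash_size+1 columns."""
    bits = 0
    for row in pixels:
        # Each bit records whether a pixel is brighter than its right neighbor.
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two nearly identical 9x8 "images": the second differs in one pixel.
img_a = [[(r * 9 + c) % 17 for c in range(9)] for r in range(8)]
img_b = [row[:] for row in img_a]
img_b[0][0] += 5

h_a, h_b = dhash(img_a), dhash(img_b)
# The tweaked image still lands within a tiny Hamming distance of the
# original, so a matcher with a threshold of, say, 10 bits still flags it.
# With access to the hash list, an attacker could iterate perturbations
# until hamming_distance(...) crosses that threshold.
```

This is why services like Cloudflare expose only a "matched / not matched" signal rather than the hashes themselves.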

[-] snowe@programming.dev 7 points 1 year ago

You can use Cloudflare’s CSAM scanning tool completely for free. You can’t get access to the hashes, which is what would allow what you’re talking about.

[-] fubo@lemmy.world 5 points 1 year ago

Sure, for Lemmy instances that are Cloudflare customers. But I don't think it can be integrated into the Lemmy code by default.

[-] snowe@programming.dev 3 points 1 year ago

No it can’t, and it shouldn’t be. It’s better to stop the CSAM before it ever reaches any server you control than to wait and then have to deal with it.

this post was submitted on 19 Sep 2023
72 points (88.3% liked)

Lemmy

11948 readers
10 users here now

Everything about Lemmy; bugs, gripes, praises, and advocacy.

For discussion about the lemmy.ml instance, go to !meta@lemmy.ml.

founded 4 years ago
MODERATORS