Discord Popcorn Picture
From what other people have said, and from the occasional video that's popped up on YouTube, Discord has a library of CSAM content that its automated systems match against. There are certain individuals who try to bait people into posting seemingly innocent pictures that are actually frames from said videos; Discord's systems detect that the image is a frame from such material and auto-ban the account.
This is fascinating and I have a bunch of questions, basically all centered around the fact that possession of such content is outlawed. I don't expect OP to know, but maybe someone else does:
Isn't it illegal to have a library of such content? Is there a legal carveout for that, like Coca Cola importing cocaine?
How is the library compiled, maintained, and added to?
Is the library specific to Discord or is it a shared library maintained by some centralized "authority" or developer? If it's specific to Discord then can we assume there are many different libraries of illegally produced and possessed content compiled and maintained by various social media companies? Who's got that job? Do they get therapy in their benefits package?
As far as I understand, they use a tool called PhotoDNA (developed by Microsoft and licensed out to platforms like Discord) to scan pictures.
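For anyone curious about the mechanics: PhotoDNA's actual algorithm is proprietary, but the general technique it belongs to is perceptual hashing. A minimal sketch, using a simple "average hash" over an 8x8 grayscale image (a stand-in for PhotoDNA's real signature, which is far more robust) and Hamming-distance matching against a list of known hashes:

```python
# Hedged sketch: PhotoDNA is proprietary. This shows the general idea of
# perceptual hashing -- an image is reduced to a compact fingerprint that
# survives recompression or small edits, then compared against a database
# of fingerprints of known material. No actual content needs to be stored,
# only the hashes.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255). Returns a 64-bit int.
    Each bit records whether that pixel is above the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(image_hash, known_hashes, threshold=5):
    """Flag the image if its hash is within `threshold` bits of any known hash."""
    return any(hamming(image_hash, h) <= threshold for h in known_hashes)

# Toy demo: a slightly altered copy of an image still hashes close to the original.
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
img_recompressed = [[min(255, v + 3) for v in row] for row in img]

h1 = average_hash(img)
h2 = average_hash(img_recompressed)
print(hamming(h1, h2))          # small distance despite the pixel changes
print(matches_known(h2, {h1}))  # True -> would trigger a moderation action
```

The key design point: because matching is done on hashes, a platform can check uploads against a shared hash database without ever hosting the original material itself.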