Hello everyone,
We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they just post from another instance now that we have changed our registration policy.
We keep working on a solution; we have a few things in the works, but they won't help us right now.
Thank you for your understanding, and apologies to our users, moderators, and the admins of other instances who had to deal with this.
Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. If it hadn't been his community, it would have been another one. And it is clear this could happen on any instance.
But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what's next very soon.
Edit 2: removed the bit about the moderator tools. That came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we've felt helpless.
Anyway, I hope we can announce something more positive soon.
I feel like you didn't actually read their comment before posting, !dipshit@lemmy.world
It has nothing to do with Lemmyshitpost being their "favorite community" and they never mentioned "investing" or "value". That's all from you. Stop strawmanning their position. They were criticizing the ease with which entire communities can be taken down by single individuals. Additionally, it seems you are contradicting your own post from 20 minutes prior to your current comment. Perhaps you responded to the wrong comment?
I read it. Let’s read it together again.
The developers who build lemmy aren’t able to put in CSAM blocking code. That’s not how this works. I assume the commenter meant to say “admins” here, since developers write the code; they don’t admin the sites. If a developer has a lemmy instance they admin, then they are both a dev and an admin. Lemmy wasn’t built for CSAM sharing specifically; it is a site that allows sharing of CSAM as much as reddit or facebook do. The devs can’t do much about this. The admins and mods can.
Neat. Irrelevant, but cool.
This is what I take issue with, and it is what I mostly responded to.
“If taking the community down is the only option here” well no, it’s not. We could just get 100’s of mods to specifically address this one user’s posting of CSAM. Hey, anyone want to moderate the site? Oh right, they’ll need to be vetted, and they’ll need to keep doing this on the side, for free, as volunteers, since lemmy is volunteer-run.
“that's extremely insufficient” hard disagree. A community is liable for the content on it. If a CSAM post goes up on a site and is taken down within a few minutes, that’s one thing. If it’s left up for days and weeks, that’s quite another problem entirely. The minute an admin or mod saw CSAM, they did the right thing by shutting that down. Even if it means downtime for users. Oh no! Users can’t read lemmyshitpost and now the world is ending.
“and bodes death for the platform at the hands of uncontrolled spam.” Welcome to the internet, where all platforms are at risk of uncontrolled spam. At first it was just email, then bulletin boards, then message boards, then forums, and then community-moderated forums like reddit and lemmy. This has been and will continue to be a problem. This isn’t a new concern for lemmy devs, admins, or mods; they all know this can happen, and it’s why they do what they do. Turning off the community is a viable option, and it’s what has happened at larger companies too while they cleaned up the mess.
I’ve been very consistent in my arguments. Show me the contradiction and I’ll address it.
TL;DR: users cannot expect to be allowed to post CSAM on lemmy instances. Allowing CSAM to stay up on lemmy instances constitutes a legal risk for the admins who own them, and thus we cannot leave it up. Blocking a community (even if it’s like the bestest and most favorited and most subscribed and everyone loves it and wow just super-duper community) is a viable means of blocking all CSAM in that community while it is cleaned up. To suggest that the community should have stayed online is asinine. To suggest that the admins should not have blocked a community to combat CSAM is asinine. Trust admins to do their jobs.
They aren't asking devs to be admins or for admins to be devs. They specifically called out the developers because code already exists to filter child sexual abuse material, built around hash lists disseminated by organizations such as the FBI and other law enforcement bodies, and it can be wired into image uploading.
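In sketch form, such an upload check is nothing exotic. Here's a minimal sketch, assuming a hypothetical blocklist of known-bad digests; the exact-match SHA-256 is purely illustrative, since real deployments use perceptual hashes (e.g. PhotoDNA) through vetted APIs:

```python
import hashlib

def is_known_csam(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Check an upload's digest against a blocklist.

    known_hashes stands in for a hash list obtained from a clearinghouse.
    Exact-match SHA-256 is a simplification for illustration; production
    systems use perceptual hashes served through vetted APIs.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# Hypothetical use inside an upload handler:
# if is_known_csam(upload.read(), known_hashes):
#     reject_upload_and_report()
```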
NOBODY in this comment section is advocating for uploading fucking child sexual abuse material. That is a strawman you are setting up. Nobody is advocating for allowing the uploading of child sexual abuse material, or for the "material to be up on lemmy instances". NOBODY is suggesting that a single instance going down is "the world is ending". NOBODY is asking for "100's of mods to specifically address this one user's posting of CSAM".
You're setting up a strawman argument nobody is proposing. The criticism is that, at this moment, the developers of Lemmy have not implemented a method for automatically vetting uploaded images for CSAM without requiring "100's of mods", and that is what created the situation where "taking the community down is the only option here".
Perhaps the wording of the original post was not precise and accurate enough for your full and complete understanding of the intent and meaning behind it. In this post, I have attempted to elucidate that intent and meaning to a degree which I hope is understandable to you.
Yeah? I doubt this is true, but I could be wrong. You make it sound like preventing CSAM is as simple as importing a library, something I find dubious. Companies have been trying to filter out this material in an automated fashion for decades, and yet they still have to employ humans to do it manually because automated means don’t really work. This is why companies like Reddit and Facebook have trust and safety teams to do this work.
Edit: I googled and could not find this database. I’m thinking it’s a myth.
ahem there were users who uploaded CSAM. Those are the users who were advocating for uploading CSAM, because they uploaded CSAM.
I’m literally arguing with people who are saying that they shouldn’t have shut down the community because it’s big, and that shutting down the community (not the CSAM) poses a threat to the fediverse. Maybe, but CSAM poses a legal threat, which is much greater than the threat of low engagement.
Yeah, that doesn’t exist, as I’ve mentioned previously. You make it sound like getting CSAM off lemmy is as simple as writing some code. If it were, why don’t facebook and reddit do this?
You’re not understanding how CSAM detection works or is handled.
The grim reality is this: cameras exist, children exist, adults exist, the internet exists, and the second a crime is committed, it is not added to an FBI database. IF such an FBI database existed, and IF it were useful (and not just a database of hashes for bit-perfect copies of CSAM), and IF it were updated when evidence of the crime surfaces... IF all of those things were true, THEN there would still likely be a huge swath of CSAM out there that could be posted at any time and would NOT be detected.
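To make the “bit-perfect copies” problem concrete, here’s a toy sketch (placeholder bytes standing in for a real image file): a single-byte change, i.e. any crop, resize, or re-encode, completely defeats an exact-hash match.

```python
import hashlib

original = b"\x89PNG...image bytes..."  # placeholder for a real image file
modified = original + b"\x00"           # any one-byte change: crop, resize, re-encode

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(modified).hexdigest())  # a completely different digest
```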
Again, ask yourself: IF such a database existed, then WHY don’t reddit, twitter, facebook, hell, why doesn’t every or any site use it?
Pedophiles, instead of downvoting me, why not explain yourselves?
As another commenter posted below:
As far as I am aware, every major site does use it in addition to manual vetting for any flagged "borderline" or "uncertain" results caught up in the filter.
Db0 even created a tool for Lemmy:
https://lemmy.dbzer0.com/post/2896209
I think this is where you could be wrong. I appreciate the links; I’ll look into those in more detail. My best understanding is that these tools generate so many false positives and false negatives that they can’t be trusted on their own. They may be a first line of defense until real humans get to see the images, but my point is that humans are still needed. Because the system isn’t 100%, humans still do the labor, and with limited time they have to decide when they can do it. Sometimes shutting down a community is the best way to stop the flood while they clean up the mess.
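For what it’s worth, the usual compromise is perceptual hashing with a distance threshold, which is exactly where the false-positive/false-negative tradeoff lives. A toy sketch using the imagehash library (file names hypothetical):

```python
import imagehash  # pip install imagehash
from PIL import Image

# Perceptual hashes survive re-encoding and resizing, unlike SHA-256,
# but "similar" is a judgment call expressed as a Hamming distance.
known_bad = imagehash.phash(Image.open("known_bad.png"))   # hypothetical file
upload = imagehash.phash(Image.open("new_upload.jpg"))     # hypothetical file

distance = known_bad - upload  # 0 = identical; larger = less similar
THRESHOLD = 8  # tighter: more false negatives; looser: more false positives

if distance <= THRESHOLD:
    # Flag for human review rather than auto-removing.
    print(f"possible match (distance={distance}); queue for a moderator")
```

Pick the threshold too tight and edited copies slip through; pick it too loose and you bury the volunteer reviewers. Either way, a human ends up in the loop, which is my point.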
This is just confirmation bias on your side now. You stubbornly refuse to accept factual information helpfully delivered to you by users who have many better things to do than respond to your inquiries, and you dogmatically refuse to acknowledge that there are advanced and reliable automation tools available for this use case. And while you do all that, you belittle the other users in the community by referring to your supposedly superior knowledge and experience, while somehow failing to provide any data or secondary sources to back up your claims.
They absolutely can, and every forum under the sun has tools and extensions to help with this. Fucking 4chan has code specifically dedicated to dealing with CSAM. You have no clue what you're talking about.
Replace this with !technology@lemmy.world, or !selfhosted@lemmy.world, or !announcements@lemmy.world. "Oh no, users can't read the entire site." Yes, that is the definition of the end of the site.
You're not seeing that this isn't a lemmyshitpost issue, it's an "any popular community on lemmy" issue. Snarkily taking potshots at lemmyshitpost as a community doesn't change it.
It's not "not an option", it's the last resort. It's like saying that your only option to seeing a roach in your apartment is to burn the whole building down. Because doing it means you don't have a community anymore, and without communities the site has no purpose.
Cool, name them and give me links then. I could not find any such tools on the internet. There is software that tries to detect this, but even youtube’s algorithm is incorrectly flagging fully clothed 30-year-old women as children.
Links or you’re talking out your ass.
Replace any site’s content with CSAM and you’ll find it’s not a site you’ll want to go to in the first place. READ the original post, where it was mentioned that this is a stop-gap measure.
You’re... offended that I had snark on the topic of lemmySHITPOST? Surely, you are joking.
My point is not that this community is shit and that’s why this happened.
My point is that this is a community on a lemmy instance that was flooded with CSAM, and it was shut down because of that flood of CSAM.
You do see how turning a community off and then on again isn’t the same thing as burning down a house (and unburning it again)?
You do realize that we’re talking about a literal crime against children vs your ability to see memes? Fuck off with your self-importance.
Are you being intentionally dense or do you not understand that it's my point? If someone can flood lemmy with CSAM so easily that the only way to stop it is a site shutdown, then there are not sufficient mitigation measures in place.
Yes, this is the Internet. Take your statement and replace “lemmy” with “reddit”, “facebook”, “9gag”, “imgur”, etc. No site has “sufficient mitigation measures in place”, as CSAM continues to flood the internet.
“Flooding” a site with CSAM is a matter of opinion. If one person posted one image of CSAM on my instance, that would be flooding; that’s one image too many. It’s not like there’s some magic threshold of CSAM allowed on a site. All sites use human moderators to detect CSAM, and all sites that do this have teams that are far too small and, for the most part, far too underpaid.
Underpaid being the keyword here, as lemmy admins are volunteers. I would think that the threshold for “flooding” a lemmy instance with CSAM would be far lower than that of a major for-profit site.
It's true; if I remember correctly, tumblr was removed from the App Store because of CSAM issues. I could be remembering wrong, and maybe it was the Google Play Store.
This is likely going to be an issue for any site that hosts NSFW content. It’s easier to have mods just ban all NSFW content than to try to figure out whether the people in the content are consenting adults.
If the admins of nsfw instances aren’t already on high alert, they should be now.
Yeah, I fully agree with that. If people want porn, they should go to porn sites.
Have you ever talked to the janitors and mods on 4chan? Good luck getting any info out of them.
Do you realize that 4chan isn’t the full internet? That these programs that you already know of can exist outside of 4chan? I’m asking you - the person who knows of these apps - to provide links to back up your claims.
I'm not the other person you responded to, and I never claimed to have any apps or links.
I'm just telling you how this works on 4chan. I'm aware that's not the entire internet, obviously; your sarcasm needs work, considering we are both here on Lemmy, i.e., not 4chan.
If anyone on there is using these programs/apps/whatever, they're not just gonna tell other people about them.
And as far as I know (I haven't been on 4chan in like 3 years now), they region-ban for CSAM.
I don’t really keep track of who I’m responding to. I just respond to comments. Sorry if I got mixed up here.
My comments still stand, though the snark isn’t directed towards you specifically.
I just want people to understand that the solution we all think exists in other places doesn’t actually exist anywhere. The solution is largely people: people who get PTSD from viewing and moderating these images. It’s not a good solution, but it’s the best solution we have so far.
The other truth of the matter is that if the Internet itself were to hypothetically shut down, this content would just be distributed via other means. The one nice thing about the internet is that lots of stuff is traceable back to the person who posted the infringing material.
I agree with you on this. I agree with a lot of things you're saying.
Part of responding means knowing who you are talking to. I know it can be easy to forget that someone is behind the screen here, but there are in fact other human beings responding to you.
Just try to be more considerate, if anything.