Mastodon's decentralized social network has a major CSAM problem | Engadget
(www.engadget.com)
I know enough about internet porn to know that online-porn communities will love something like the Fediverse, and unfortunately, child-exploitation groups will love it too.
But what's surprising to me about this study is that they focused on the top 25 Mastodon servers. They listed the specific keywords they searched for (y'all know which keywords I mean), and used a practical methodology: hashing files and matching them against known CSAM databases, rather than forcing a human to go through this crap and judge what is or isn't CSAM.
It seems like a solid study from Stanford. I think you should at least read the paper before discounting it. We all know that even here on the Lemmy side of the Fediverse, we need to be careful about who we federate with (or defederate from). It's no surprise to me that there are creepos out there on the Internet.
112 hits is pretty small in the grand scheme of things, but it's also an automated approach that likely didn't catch all the CSAM out there. The automated hits seem to have uncovered specific communities and keywords that can help moderators search for this stuff, and the study includes some useful methodology (e.g., hashed files compared against a known database) that could automate the process for a server like Lemmy.world.
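The hash-matching idea is simple enough to sketch. A minimal toy version in Python, assuming exact-match cryptographic hashing (real deployments use perceptual hashes like PhotoDNA that survive re-encoding and resizing; the function and variable names here are illustrative, not from the study):

```python
import hashlib

def file_digest(data: bytes) -> str:
    # Exact-match digest of the raw file bytes. Production systems
    # would use a perceptual hash instead, so that re-compressed or
    # resized copies of known material still match.
    return hashlib.sha256(data).hexdigest()

def flag_known_media(uploads: dict[str, bytes], known_hashes: set[str]) -> list[str]:
    # Return the names of uploads whose digest appears in the
    # known-hash set, for moderator review.
    return [name for name, data in uploads.items()
            if file_digest(data) in known_hashes]

# Toy demo: placeholder byte strings stand in for media files.
known = {hashlib.sha256(b"known-bad-file").hexdigest()}
uploads = {"a.png": b"known-bad-file", "b.png": b"harmless"}
print(flag_known_media(uploads, known))  # ['a.png']
```

The point is that the matching step is cheap set membership, so a server could run it on every upload; the hard part is access to the hash databases themselves, which are restricted to vetted organizations.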
I see this as a net-positive study. There's actually a lot of good, important work that was done here.
112 out of 325,000 posts is incredibly small: roughly 0.03% of posts.