submitted 1 year ago* (last edited 1 year ago) by corb3t@lemmy.world to c/technology@lemmy.ml

Not a good look for Mastodon - what can be done to automate the removal of CSAM?

[-] Spiracle@kbin.social 41 points 1 year ago* (last edited 1 year ago)

Direct link to the (short) report this article refers to:

https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf

https://purl.stanford.edu/vb515nd6874


After reading it, I’m still unsure what exactly they consider to be CSAM and how much of each category they found. Here are the categories they appear to count as CSAM, as far as I can tell. No idea how much the categories overlap, and therefore no idea how many images beyond the 112 PhotoDNA matches involve actual children.

  1. 112 instances of known CSAM of actual children, identified by PhotoDNA hash matching (a rough sketch of how that kind of matching works is below the list).
  2. 713 instances of assumed CSAM, flagged based on hashtags.
  3. 1,217 text posts about topics related to grooming/trading. These include no actual CSAM or CSAM trading/selling on Mastodon itself, but some link out to other sites?
  4. Drawn and Computer-Generated images. (No quantity given, possibly not counted? Part of the 713 posts above?)
  5. Self-Generated CSAM. (The example given is someone literally selling pics of their dick for Robux; no quantity given here either.)
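To give an idea of what "automating removal" can look like in practice: the workflow behind PhotoDNA-style matching is basically "hash every uploaded image, compare against a shared list of known-bad hashes, flag matches for removal and reporting". PhotoDNA itself is proprietary and only available to vetted organisations, so the sketch below substitutes a naive 8x8 average hash and a placeholder hash list purely for illustration.

```python
# Minimal sketch of hash-based matching, the general idea behind tools like
# PhotoDNA. The real algorithm is proprietary; this uses a simple 8x8 average
# hash and a placeholder hash list just to show the moderation workflow.
from PIL import Image  # pip install Pillow


def average_hash(path: str) -> int:
    """Shrink to 8x8 grayscale and set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# In reality this set would come from a hash-sharing programme (NCMEC, IWF,
# etc.); the value here is a made-up placeholder.
KNOWN_HASHES = {0x0000000000000000}


def should_flag(path: str, max_distance: int = 5) -> bool:
    """Flag an upload if its hash is close to any known-bad hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in KNOWN_HASHES)
```

As far as I understand, the hard part for Fediverse instances isn’t code like this, it’s getting access to the vetted hash lists and handling the reporting obligations that come with a match.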

Personally, I’m not sure what the takeaway is supposed to be here. It’s impossible to moderate all user-generated content quickly, and that’s not a Fediverse-specific issue: the same is true for Twitter, Reddit, and every other big content-generating site. It’s a hard problem to solve. Known CSAM being deleted within hours is already pretty good, imho.

Meta-discussion in particular is hard to police. Based on the report, most of this material, by volume, seems to be traded through other services (chat rooms).
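If the hashtag-based counts in the report work anything like the toy sketch below, that would also explain why the 713 and 1,217 numbers are hard to interpret: a keyword or hashtag match can’t tell trading apart from discussion or reporting. (The tag list and post handling here are entirely made up, just to illustrate how coarse this kind of flagging is.)

```python
# Hypothetical sketch of hashtag/keyword flagging. The flagged terms are
# placeholders; the point is that a match says nothing about whether a post
# is trading material, discussing it, or reporting on it.
FLAGGED_TAGS = {"exampletag1", "exampletag2"}  # placeholder terms only


def extract_hashtags(text: str) -> set[str]:
    """Collect lowercased hashtags from a post's text."""
    return {word.lstrip("#").lower() for word in text.split() if word.startswith("#")}


def flag_post(text: str) -> bool:
    """True if the post uses any flagged hashtag, regardless of intent."""
    return bool(extract_hashtags(text) & FLAGGED_TAGS)
```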

For me, there’s a huge difference between actual children being directly exploited and virtual depictions of fictional children. Personally, I consider the latter the same as any other fetish imagery depicting acts that would be illegal with actual humans (guro/vore/bestiality/rape, etc.).

If we took this to its logical conclusion, most popular games would be banned. How many JRPGs have underage protagonists? How many of those have some kind of love story going on in the background? What about FPS games where you're depicted killing other people? What about fantasy RPGs where you can kill and control animals?

Things should always be legal unless there's a clear victim. And communities should absolutely be allowed to filter out anything they want, even if it's 100% legal. That's why the report's failure to clearly articulate the legal issues is worrisome: it implies a moral obligation to remove content that is legal but taboo.
