An in-depth report reveals an ugly truth about isolated, unmoderated parts of the Fediverse. It's a solvable problem, with challenges.

xilliah@beehaw.org 2 points 1 year ago

Thanks for the thought you put into your answer.

I've been thinking: CSAM is just one of the many problems communities face. YouTube, for example, is unable to moderate transphobia properly, which has significant consequences as well.

Let's say we had an ideal federated copy of the existing system. It would still fail to detect many other types of antisocial behavior. All I'm saying is that the existing approach by M$ feels a bit like it's based on moral tunnel vision, trying to solve complex human social issues with some kind of silver bullet. It lacks nuance, whereas this is really a community management issue.

Honestly, I feel it's really a matter of having manageable communities with strong moderation, plus the ability to report anonymously in case one becomes involved in something bad and wants out.

Thoughts?
