I'm sorry, but I have difficulty being polite to someone who has actively ignored the safety concerns that were raised months ago. FOSS or not.
Stop misconstruing it as safety. It's about legality. Nobody's safety is in jeopardy because they accidentally saw an illegal image. This is about following the law, not protecting the safety of users.
You know, except for those abuse victims whose pictures are being spread around Lemmy. Just sayin'
The theory behind why CSAM is illegal is that willingness to pay for CSAM incentivizes the production of even more of it to capture that payment, and that incentivized additional production means even more abuse. A perfectly reasonable take, and something I think can be demonstrated.
But why would accidentally seeing CSAM prompt you to pay for it and create that incentive? Are you worried that you're a closeted pedophile who will be ready to shower money on those who record such content, wanting more and more as soon as you get your first taste?
I thought it was pretty apparent we were talking about Lemmy, but okay.
The statements were about what the Lemmy devs can and/or should be doing for safety. They simply do not have the power to stop child abuse by developing a social media platform. So the safety in question must be the safety of people using Lemmy, because the Lemmy devs do have some direct power over that.
I'm sure you feel very morally aloof with your righteous retort, though.
It ties into safety as well; this is the sort of thing websites' "trust and safety" teams handle. Sorry for not being more precise.
No need to apologize; I just think "safety" is a misnomer here.
"CSAM laws aren't for the safety of real people" is one of the hottest takes I've ever seen in my life
Straight outta reddit with that one.
I'm just going to copy-paste my other comment:
I thought it was pretty apparent we were talking about Lemmy, but okay.
The statements were about what the Lemmy devs can and/or should be doing for safety. They simply do not have the power to stop child abuse by developing a social media platform. So the safety in question must be the safety of people using Lemmy, because the Lemmy devs do have some direct power over that.
I'm sure you feel very morally aloof with your righteous retort, though.
Yes. Obviously we're talking about Lemmy. We just still fundamentally disagree on the forms of harm, psychic and physical, that can be experienced through the rapid propagation of CSAM. Lemmy's lack of mod tools has been a major topic of discussion for a while now. I don't care to carry on this conversation because it's clear our starting points are too far apart to meet in the middle.
I think the other guy's comment is well suited as a response to this, so again I'll copy-paste:
How could reason possibly prevail when the subject matter is so sensitive?