[-] sxan@midwest.social 5 points 1 year ago

We completely agree that it's a difficult question, and a slippery slope. And also on the point of government's role.

Do you then believe that privately run platforms shouldn't have the right to choose what gets put on them? Or is it a matter of scale: Sxan's GoToSocial server can do what it wants, but The-Platform-Formerly-Known-As-Twitter shouldn't?

I always think of the brigading that happens on "open" platforms. The Masses will effectively censor any real debate, especially if they know there are no rules. How are we to deal with that?

[-] jet@hackertalks.com 4 points 1 year ago* (last edited 1 year ago)

You bring up some excellent points. Right now there are private organizations acting as de facto public squares. I think when they're the only option it gets muddy: if they're going to be so essential to society, they have to operate like utilities and hold no opinions beyond legal versus illegal.

For all the platforms that are just options, and not de facto public squares, I'm perfectly happy for them to have opinions about what can or cannot be said.

Let's take the fediverse as an example: any individual server can have its own opinions and enforce them through moderation, perhaps very heavy moderation. And that's totally fine, because any group of people can run their own instance with their own moderation policies, and it's all an equitable playing field.

The brigading we're seeing on these open platforms is an early-adopter phenomenon: groups tend to move together. The only real solution is heavy moderation on individual instances. So if you have a community that's talking about fishing, the moderator should prevent brigading by removing discussion that doesn't relate to fishing. "Person I hate was caught fishing, we should ban them from fish, etc etc oh you support fish killing...." - the moderator should stop that.


The litmus test I would use to determine whether a social media company is a public space, and should act by utility rules rather than private club rules, is this: does the government use that platform to communicate with citizens?

X-Twitter and Facebook both have governments using them to communicate directly with their citizens, sometimes as the only means of communication. So they are de facto public squares and utilities.

[-] Facebones@reddthat.com 2 points 1 year ago

I think a publicly funded platform would be beneficial in today's world. Nobody could be banned, but you could still block people individually. (Criminal stuff would still be criminal, and you could potentially be muted by govt entities, though.) All govt communication would go through this platform, so nobody could be "walled off" from govt comms. It would still function as social media as well, and people would be free to twit/fb/whatevs - there just would no longer be govt entities there.

It would also lay the framework to potentially move our voting systems into the 21st century IMO.

[-] sxan@midwest.social 2 points 1 year ago

So, I've left this on "unread" for so long only because until now I only used Lemmy on my phone, and I really hate typing long replies on my phone. I wanted to give your reply due consideration, though. Anyway, I'm embarrassed to have taken this long to respond.

I agree with you about the public square, and I think you bring up an excellent point about these systems becoming "essential to society." I think it's a thing that is obvious to younger people, and almost completely invisible to older people. Even those of us who grew up during the IT boom decades and lived through the change may find it difficult to grok just how much of an impact this is having. I do think that people are generally well aware of how slow legislation is in adapting to rapid changes in society, but the impact you talk about has happened at such an accelerated rate that useful precedents are lacking. So we see legislators thrashing about more than usual, over- or under-reacting, and mostly in extreme ignorance.

I see brigading in the fediverse as a worse problem than you do. It's mob rule, and it goes largely unchecked -- I feel -- as a result of moderators' hesitance to be accused of censorship. I haven't yet seen much of what Reddit suffers from -- moderator affinity, where mods have a heavier hand with posters they disagree with -- but the result is unchecked herd mentality cowing dissenters.

But, maybe mob rule is good? I vacillate on this one. A well-functioning, healthy society has laws controlling gross topics, and social censure is used to moderate destructive elements. We don't want a society where we have laws for every little infraction; in that society, every citizen is a criminal by default, and the government always has a legal justification to persecute everyone they want to (and let slip those they don't). OTOH, we have what happened in the US in the '50s, with mobs of white people harassing black integration students. I don't know what the right answer is for this, honestly, but it is an issue in meatspace, and it's as much or more of an issue online.

Your litmus is good, I think, but risks being based largely on our current clueless government. As the generations age out, and younger generations take control, the government will become increasingly social-media savvy. I can easily see a future government having a communications department that is competent enough to hit nearly every social media platform, regardless of popularity. What about cross-posting? If we use that litmus, then if I were a government that wanted to control a platform, all I would need to do is start posting to it, and it would then qualify as subject to regulation?

I think I've said this before, but I'll repeat it: I don't have answers to any of these issues. I wish we could have a censorship-free internet; there was a time early in its history when most users were well-behaved and followed established etiquette. I think a lot of that may have been due to the lack of anonymity, but whatever the reason, we've been past that for decades, and we haven't yet adapted.

[-] jet@hackertalks.com 2 points 1 year ago

Thank you for the very thoughtful reply.

Brigading is a huge problem and discourages people from joining Lemmy; we need highly opinionated, moderated communities to create "safe spaces" for niche communities and viewpoints. The inclusion of "user participation requirements" - account age, interaction with a community, karma scores within a community - is necessary to help Lemmy grow.
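Lemmy doesn't expose such gates out of the box today; as a sketch of what a "user participation requirement" check could look like, here is a minimal example (the `Account` fields and the threshold values are hypothetical, not part of Lemmy's actual API):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Account:
    created_at: datetime
    community_posts: int   # prior posts/comments in this community
    community_karma: int   # net votes earned in this community

def may_post(account: Account,
             min_age: timedelta = timedelta(days=30),
             min_posts: int = 5,
             min_karma: int = 0,
             now: Optional[datetime] = None) -> bool:
    """True only if the account meets every participation requirement."""
    now = now or datetime.utcnow()
    return (now - account.created_at >= min_age
            and account.community_posts >= min_posts
            and account.community_karma >= min_karma)
```

A brand-new account fails the age gate immediately, while an established account that has interacted with the community passes, which is exactly the drive-by-brigading filter described above.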

From a long-term stability-of-society perspective, absolute free speech is the only path forward. Yes, people we hate will have voices, and people who are criminals will have voices, but that is the price of giving everyone a voice. We only have to look at the diversity of "governments" globally to realize that having a community-focused, respectful government is a temporary thing. Governments change with time, as do those enforcing the rules. Just as a thought experiment, imagine you lived your entire life in every country, and imagine you wanted to advocate for 1. human rights, 2. a political opposition party. In many countries, that is aggressively stamped out: "don't rock the boat". In many global communities, doing 1 and 2 are great ways to embarrass powerful people and have a short life.

I know many people will think, "yes, but... what about thing-I-don't-like X"... If we create the digital tooling to ban X, whatever X is, then those in power will use that tooling to target everything else. Tools in the toolbox get used. It's a difficult stance to be a free speech absolutist, it's unpopular, but I think it's necessary. I'm not saying communities have to suffer outsider speech intruding on their spaces, but platforms cannot be opinionated as a whole.

You brought up very thoughtful points, and I agree censorship is necessary to grow communities, but censorship should never get larger than the community level. Platform-level censorship is bad for society in the long term.

[-] sxan@midwest.social 1 points 1 year ago

Okay; you're making a distinction between "moderation" and "censorship" that I don't understand. Does it go back to your litmus of an "important public space"?

[-] jet@hackertalks.com 1 points 1 year ago

Moderation: not deplatforming, but putting rails on a specific discussion

Censorship: deplatforming, total limits on a topic in all places.

I.e., anyone can send mail through the post office, while a newsletter editor moderates the received letters for inclusion in their publication.

So in a Lemmy context, it's not censorship to have rules on an instance, but it would be censorship to deny people the ability to run an instance. Lemmy is very censorship-resistant.

[-] sxan@midwest.social 2 points 1 year ago

Are you suggesting that there are no topics, no content, that should be censored? I'm not trying to walk you into Godwin's law; I just don't see how you address issues like CP, snuff porn, or hateful/inciting speech. I personally would rather err on the conservative side of the Paradox of Tolerance than allow intolerance to take hold and take over. With total and complete freedom of expression, how do you prevent the emergence of populist oppressive movements like the Khmer Rouge, or the Nazi party? Or do you think the Paradox of Tolerance is flawed?

[-] jet@hackertalks.com 1 points 1 year ago

First, let me take the opposite position: with restricted and curated freedom, how do you prevent people from being oppressed?

The speech itself should not be censored, and that includes the objectionable things you mentioned. If a country or a government wishes to make some speech illegal, then it should be up to the courts to remove somebody's speech, through due process and public discourse.

I take a different position on the paradox of tolerance: the issue is sitting idly by while groups are being excluded. Open debate and rational thinking are required of all the countries and citizens of the world to prevent terrible abuses from happening again. My takeaway is that everyone should fight tooth and nail to prevent any group from being excluded - including groups we don't like.

I've seen the paradox of tolerance used as rhetorical ammunition to silence opponents online, and that just turns into another form of tyranny of the current ingroup.

To prevent another oppressive government from taking hold (like your examples), we have to trust in people's engagement and wisdom, and in the open, healthy debate of ideas. We can, of course, help people through economic stability, critical-thinking education, etc.

If we say some thoughts are too dangerous to be spoken, for fear people are too easily led astray... then we are trusting that those who choose which voices are worth hearing will always be benevolent dictators. The one lesson I take away from history is that power rarely stays in the hands of the benevolent. Open communication, organization, and free thought are the most effective ways to protect a population.

TL;DR: thoughtcrime and wrongthink shouldn't ever exist, legally at least.

[-] partizan@lemm.ee 2 points 1 year ago

But governments prefer the current situation: they have channels to request removal while bearing zero liability, and the company is covered because it can do as it pleases on its own private platform. So I don't see why the government would declare social media platforms public squares...

this post was submitted on 24 Sep 2023 to the Privacy community - "A place to discuss privacy and freedom in the digital world."