‘Harmful’ content should not be promoted via social media algorithms, peers say
(www.nationalworld.com)
Who is in charge of what's deemed "harmful"? How do they decide? Who put them in charge?
The only practical answer is that users should be able to decide for themselves.
Anything else just devolves into government or corporate censorship.
There’s no such thing as corporate censorship. That idea has largely been manufactured by people with a persecution complex. The only way a corporation can “censor” someone is if that person is using property or platforms the corporation owns, and at that point the corporation is free to decide whom it will host. That would be like saying your neighbor is censoring you because they won’t let you onto their property or use their things. They can’t legally do anything more than remove you or deny you access to what they operate. Government censorship, on the other hand, is real in a meaningful sense: the government has the power to create laws that affect you regardless of who owns the property or platform.
This is a semantic argument made to ignore the issue. The reality is that social media platforms effectively have become the "town square" where ideas are shared. Stifling legal speech in that environment is very effective censorship of ideas.
You can argue that corporations have that right because they own the network. I disagree. Curation of what can be said on their platform turns them into a publisher, not a communications provider. Any lawyer active in that space could tell you how insanely detrimental it would be for that distinction to be made, at least in the U.S.
Imagine your phone company deciding you can't say certain words to other people using their service without facing dropped calls, suspensions of service, or being banned. All because your legal speech goes against the morality of the majority.
That's essentially what social media does at the moment. They are legally defined as, and receive the benefits of, a communications service. But they are acting like a publisher, deciding what is and is not allowed to be said. It's a serious problem.
I’m not being semantic or trying to ignore the issue. I think corporations should be able to decide what content or ideas they want to promote or host on their platforms. And it has nothing to do with the morality of the majority: as much as I personally despise Fox News, and Rupert Murdoch specifically, they’ve created an identity that caters to people with a certain viewpoint. They’ve gone further and actually shaped it, but anything less than creative control over what appears on their platform effectively forces corporations to carry viewpoints they might not agree with.

Legal speech clauses and the like have nothing to do with corporations. Free speech protections only constrain the government; they apply to interactions between citizens and the government, not between citizens and corporations, or between citizens and citizens. And I’m not familiar with the town square concept as it relates to the law.

If I set up a social media company meant to be a safe space for black people, an actual black Twitter for example, I should be able to remove people who want to share views that are antithetical to that safe space. I shouldn’t be forced to platform anti-black ideas simply because my platform has a large user base. Ideas as simple as “black people don’t know science or math” are legal ideas to hold, but that doesn’t mean they should be welcome if I deem them harmful. Town square or not.