219
submitted 1 year ago by alyaza@beehaw.org to c/technology@beehaw.org

65% of Americans support tech companies moderating false information online and 55% support the U.S. government taking these steps. These shares have increased since 2018. Americans are even more supportive of tech companies (71%) and the U.S. government (60%) restricting extremely violent content online.

[-] cygnus@lemmy.ca 38 points 1 year ago

Most Americans agree that false information should be moderated, but they disagree wildly on what's false or not.

[-] HairHeel@programming.dev 34 points 1 year ago

Y’all gonna regret this when Ron DeSantis gets put in charge of deciding which information is false enough to be deleted.

[-] NightAuthor@beehaw.org 33 points 1 year ago

What a slippery slippery slope….

[-] Shikadi@lemmy.sdf.org 45 points 1 year ago* (last edited 1 year ago)

Slippery slope to what? We have those restrictions for news already. The only reason you still see Fox and such lie on the air and get away with it is that they're classified as entertainment instead of news. Freedom of speech and press are still intact.

Edit: I wasn't referring to the Tucker Carlson case, but I did learn that's not true anyway. Nobody accredits news channels in the first place, and as it turns out, the FCC doesn't even have any authority over cable.

[-] alyaza@beehaw.org 30 points 1 year ago

Freedom of speech and press are still intact.

there's also the detail that most countries do not have unabridged freedom of speech and, shockingly, are actually quite fine for not having it, so...

[-] argv_minus_one@beehaw.org 4 points 1 year ago* (last edited 1 year ago)

Neo-Nazi parties are getting ready to take over half of Europe and all of North America as we speak. No, they are not fine. They are very, very far from fine.

And when they do, laws like this will be used to stop anyone from dethroning the dictatorship and restoring democracy.

[-] alyaza@beehaw.org 14 points 1 year ago* (last edited 1 year ago)

And when they do, laws like this will be used to stop anyone from dethroning the dictatorship and restoring democracy.

this might be the most obvious non-sequitur i've ever seen—laws like "don't advocate for a second Holocaust" or "don't spread COVID misinformation" have literally no relation to what far-right authoritarians believe or will do if they take power. the idea that this is what will empower them to smother democracy is on its face absurd.

[-] Shikadi@lemmy.sdf.org 7 points 1 year ago

If they take over, it doesn't matter what laws we have. Currently the Republican frontrunner plans to expand the power of the president, and previously he packed the court with garbage. That's how they win, not by the government or companies working to fight misinformation.

[-] navigatron@beehaw.org 15 points 1 year ago

Who is the arbiter of truth? What prevents the power to censor from being abused?

The power to censor inherently includes the ability to silence its own opposition. Centralizing this power is therefore dangerous, as it is nigh impossible to regulate.

Currently, we can choose our forums - beehaw does a good job, /pol/ silences all but one worldview, and therefore I am here and not there. What happens when that choice is taken away, and one “truth” is applied universally, with no recourse for opposition?

Perhaps you believe you hold the correct opinions, and will not be affected. Only those who disagree with you will be silenced. Or perhaps you change your opinions to whatever you are told is correct, and therefore you do hold the correct opinions, though only by definition.

Consider that 50% of the country disagrees with you politically. If you follow a third party, it’s 98%. A forced shared truth is only “good” if it goes your way - but the odds of that are incredibly small, and they get even smaller when you consider infighting within the parties.

[-] alyaza@beehaw.org 23 points 1 year ago* (last edited 1 year ago)

Who is the arbiter of truth? What prevents the power to censor from being abused?

you're making an argument for absolutist freedom of speech here, because if you believe nobody can responsibly wield this power, the obvious answer is nobody should—but you yourself admit that, by your own choice, you don't use absolutist-free-speech places like /pol/ because of how they are and what they invariably turn into in the absence of censors. does that not tell you something about how self-defeating this position is?

[-] navigatron@beehaw.org 8 points 1 year ago

No single body can wield this power, and therefore multiple should.

/pol/ self-censors through slides and sages, and even maintains at least some level of toxicity just to dissuade outsiders from browsing or posting - you could call it preventative censorship.

Fortunately, we don’t have to go there. We have the choice to coexist on Beehaw instead.

Even on reddit, different subs could have different moderation policies, and so if you didn’t like ex. Cyberpunk, you could go to lowsodium_cyberpunk.

Freedom to choose communities allows multiple diverse communities to form, and I think that’s the key - that there are many communities.

When the scope of truth arbitration moves from lemmy instances to the us gov, the only alternative choice for any who disagree would be to go to another country.

The beauty of the internet is that there are no countries. Any website could be anywhere - there are hundreds of thousands of choices, from twitter hashtags to irc rooms.

I do not want one hegemony of information. I do not want 5, or one for each nato member. I want as many as possible, so I may find one (or more!) that I like.

[-] alyaza@beehaw.org 12 points 1 year ago* (last edited 1 year ago)

No single body can wield this power, and therefore multiple should.

then you already exist in that world and for most countries a far more punitive model works better than the US's, so...

[-] navigatron@beehaw.org 9 points 1 year ago

So… what? Are you arguing for an expansion of “punitive models”?

Iraq has exceptional consistency in thought leadership. There are no drug addicts in Singapore.

Moxie Marlinspike has an excellent blog post on “perfect enforcement” - if the law were applied perfectly, we would not have the LGBTQ marriage rights we have today. If America had perfect consistency of thought, we would all still be Protestant or Catholic.

Consistency is not a world I strive for, and therefore, to return to the start of this thread, I do not believe the us gov should apply censorship to our communications, and I do believe that doing so would be a slippery slope, precisely and purely because censorship may prevent its own regulation.

[-] alyaza@beehaw.org 13 points 1 year ago* (last edited 1 year ago)

So… what? Are you arguing for an expansion of “punitive models”?

i mean yeah i very much am fine with the government saying "you can't say this" because i'm not a free speech absolutist and there are inarguable harms caused by certain forms of content being allowed to fester online. i'd personally quite like it if my country didn't make it legal to explicitly call for, plan for, and encourage people to exterminate all queer people—and i'd quite like it if corporations took that line as well. many countries have a line of this sort with no such problems, even though it is explicitly more punitive than the US model of "say whatever you want".

[-] dingus@lemmy.ml 14 points 1 year ago* (last edited 1 year ago)

I'm honestly shocked at the pushback for "Maybe we shouldn't let people preach things like 'X group of people needs to die because my God said so!' because it leads to unmitigated violence against the X group 99% of the time."

[-] Veraticus@lib.lgbt 8 points 1 year ago

right? It's pretty obvious for whom this argument is about theoretical free speech philosophizing, and for whom it is about actual survival.

[-] Veraticus@lib.lgbt 10 points 1 year ago

I don't understand how these questions are germane. We can and have already decided some speech is wrong to spread online and should lead to both deletion and arrest -- specifically child porn and terrorism. We can and have successfully defined what those are. What's wrong with adding misinformation and hate speech to this list? Do you really believe we'd have trouble defining those?

[-] DonnieDarkmode@lemm.ee 6 points 1 year ago

they’re classified as entertainment instead of news

If you’re referencing the Tucker Carlson defamation suit, that’s not a correct reading of the decision.

[-] NightAuthor@beehaw.org 6 points 1 year ago

Right, and it sounds like people want more restrictions. So it started with some reasonable restrictions baked into the Bill of Rights, and we’ve been losing rights at an alarming rate, so if people are already on board then I imagine we’ll get more restrictive speech legislation.

[-] alyaza@beehaw.org 18 points 1 year ago

So it started with some reasonable restrictions baked into the Bill of Rights, and we’ve been losing rights at an alarming rate,

...what? American freedom of speech has, if anything, gotten less restrictive over time and it never was restrictive to begin with. you quite literally have to go out of your way to utter something which isn't protected speech at this point (and the First Amendment has never covered private corporations so nobody is losing a "right" when Twitter tells you that you can't wish for a second Holocaust)

[-] knokelmaat@beehaw.org 26 points 1 year ago

I personally like transparent enforcement of false information moderation. What I mean by that is something similar to beehaw where you have public mod logs. A quick check is enough to get a vibe of what is being filtered, and in Beehaw's case they're doing an amazing job.

Mod logs also allow for a clear record of what happened, useful in case a person does not agree with the action a moderator took.

In that case it doesn't really matter if the moderators work directly for big tech: misuse would be very clearly visible, and discontented people could raise awareness or just leave the platform.

[-] sub_@beehaw.org 24 points 1 year ago

65% of Americans support tech companies moderating false information online

aren't those tech companies the ones who kept boosting false information in the first place to get ad revenue? FB did it, YouTube did it, Twitter did it, Google did it.

How about breaking them up into smaller companies first?

I thought the labels on potential COVID or election disinformation were pretty good, until companies stopped doing so.

Why not do that again? Those who are going to claim that it's censorship will always do so. But what needs to be done is to prevent those who are not well informed from falling into the antivax / far-right rabbit hole.

Also, force content creators / websites to prominently show who is funding / paying them.

[-] 001100010010@lemmy.dbzer0.com 19 points 1 year ago

Government censorship? No, I don't support it (except the censoring of direct calls to violence; those should not be allowed)

Tech companies de-platforming you? Hell yea!

If you're having trouble finding a company that is willing to host your content, maybe your content is the issue.

[-] dingus@lemmy.ml 16 points 1 year ago

If the FCC can regulate content on television, they can regulate content on the internet.

The only reason the FCC doesn't is that the Republican-dominated FCC under Ajit Pai argued that broadband is an "information service" and not a "telecommunications service", which is like the hair-splittingest of splitting fucking hairs. It's fucking both.

Anyway, once it was classified as "information service" it became something the FCC (claimed it) didn't have authority to regulate in the same way, allowing them to gut net neutrality.

If the FCC changed the definition back to telecommunications, it wouldn't be able to regulate foreign websites, but it could easily regulate US sites and entities that want to do business in the US using an internet presence.

[-] invno1@lemmy.one 15 points 1 year ago* (last edited 1 year ago)

I don't think this is really about censorship. You can say and advertise whatever you want, but under this, if it can be proven false, you have to pay the price. All it does is make people double-check their facts and figures before they go shooting off random falsehoods.

[-] ConsciousCode@beehaw.org 11 points 1 year ago

What worries me is who defines what the truth is? Reality itself became political decades ago, probably starting with the existence of global warming and now such basic foundational facts as who won an election. If the government can punish "falsehood", what do you do if the GOP is in charge and they determine that "Biden won 2020" is such a falsehood?

[-] Nuuskis9@feddit.nl 8 points 1 year ago

This is just another fake poll used to justify a biometrics requirement for internet connections.

[-] dingus@lemmy.ml 17 points 1 year ago* (last edited 1 year ago)

This statement reads like "I'm angry people don't want me using botnets to push my agenda."

Biometrics? Comcast didn't even ask me for a driver's license. They asked me for a credit card to make a payment.

Also, frankly, last I checked, Pew Research is pretty much unrivaled in social science polling data, so not sure why you're calling it "fake."

[-] alyaza@beehaw.org 14 points 1 year ago

This is just another fake poll used to justify a biometrics requirement for internet connections.

quelle surprise that your comment history is full of right-wing, crank, Great Reset (((globalist))) stuff:

Haha it is funny that you oppose only with the arguments you read from fact checkers instead of going to Youtube and see what those globalists says by themselves. In every country the top politicians goes regularly to their meetings who talks harsh things directly and all you know is that all they talked is a conspiracy theory. You always repeat the sentences who nobody actually said, expect fact checkers. Lol

Year 2030 is a global target for renovations in every aspects of societies and countries.

Let’s hope the successor of Rutte isn’t a WEF muppet and will stop the closure of farms.

Wow I didn’t realize you hollandaises love WEF-puppets, 15 min cities and Rutte’s lies even after he got nailed by Gideon van Meijeren in the parliament. Well, have fun with your 11 200 closed farmlands then. Luckily the puppets haven’t planned that in here, but most likely it’ll happen here too.

15 min cities as a consept was invented in the Soviet Union by Stalin.

[-] manpacket@lemmyrs.org 14 points 1 year ago

Do they agree on the definition of false information?

[-] Veraticus@lib.lgbt 13 points 1 year ago* (last edited 1 year ago)

I certainly favor this and I hope online platforms will continue to remove misinformation and hate speech extremely vigorously.

I am also definitely okay with the government passing a law that requires online platforms to moderate themselves.

I don’t believe this impacts anyone’s freedom of speech. If you must make a Nazi website online, you are welcome to do so (assuming you can find a platform that doesn’t immediately remove it). But Facebook and Twitter should take down links and advertisements to your site, and even ban your account if you continue to talk about it.

Your freedom of speech is intact. No one is arresting you for what you are saying. But you aren’t guaranteed a platform on which to say it.

[-] PostmodernPythia@beehaw.org 12 points 1 year ago

The key is defining terms like “false” and “violent.”

[-] mrmanager@lemmy.today 12 points 1 year ago* (last edited 1 year ago)

Americans are generally quite stupid. Imagine asking private big tech to moderate what you can see online. :)

But it's the same in every country. The large masses are clueless. If you ask Europeans, you would get the same response even though it's Americans moderating it, which is even worse.

You know Microsoft is planning to put the next Windows online and let people just access it? Same pattern here: people trusting big tech with their own privacy and integrity. So weird.

[-] beejjorgensen@lemmy.sdf.org 11 points 1 year ago

I have no problem with Twitter moderating content. The First Amendment says they can.

But the government moderating it--the First Amendment says they can't.

[-] pglpm@lemmy.ca 9 points 1 year ago* (last edited 1 year ago)

There are surely pros and cons, possibly good and possibly bad outcomes with such restrictions, and the whole matter is very complicated.

From my point of view part of the problem is the decline of education and of teaching rational and critical thinking. Science started when we realized and made clear that truth – at least scientific truth – is not about some "authority" (like Aristotle) saying that things are so-and-so, or a majority saying that things are so-and-so. Galilei said this very clearly:

But in the natural sciences, whose conclusions are true and necessary and have nothing to do with human will, one must take care not to place oneself in the defense of error; for here a thousand Demostheneses and a thousand Aristotles would be left in the lurch by every mediocre wit who happened to hit upon the truth for himself.

The problem is that today we're relegating everything to "experts", or more generally, we're expecting someone else to apply critical thinking in our place. Of course this is unavoidable to some degree, but I think the situation could be much improved from this point of view.

[-] rodbiren@midwest.social 6 points 1 year ago

It's all fun and games till well-intentioned laws get abused by a new administration. Be careful what you wish for. My personal take is that any organization that is even reasonably similar to a news site must conform to fairness-in-reporting standards, much like broadcast TV once had. If you don't, but an argument could be made that you present as a news site, you just slap a sizeable banner on every page stating that you are an entertainment site. Drawing distinctions between what is news and what is entertainment would theoretically work better than an outright ban on misleading content.

At the end of the day it won't matter what is written unless the regulations have actual teeth. "Fines" mean so little given that the billion-dollar backers couldn't care less, and retractions are too little, too late. I want these wannabe-Nazi "News Infotainment" people to go to jail for speech that causes harm to people and the nation. Destroying democracy should be painful for the agitators.

[-] Veraticus@lib.lgbt 5 points 1 year ago

Eh, people are trying to do this already by claiming that queer content in real life and on the Internet is "grooming" kids. We can push back against misapplication of laws without saying the laws themselves shouldn't exist.

[-] argv_minus_one@beehaw.org 6 points 1 year ago

But how do you implement such a thing without horrible side effects?

[-] wxboss@lemmy.sdf.org 6 points 1 year ago

Most Americans don't want to think for themselves. They would rather someone else do that heavy lifting for them.

However, it's important that people have the freedom to reason for themselves and make choices accordingly without some governmental entity mandating a certain thought trajectory. People shouldn't surrender such fundamental human freedoms to their government.

“If liberty means anything at all it means the right to tell people what they do not want to hear.” ― George Orwell, “The Freedom of the Press” (proposed preface to Animal Farm)

[-] OneRedFox@beehaw.org 5 points 1 year ago

Checks out. I wouldn't want the US government doing it, but deplatforming bullshit is the correct approach. It takes more effort to reject a belief than to accept it and if the topic is unimportant to the person reading about it, then they're more apt to fall victim to misinformation.

Although suspension of belief is possible (Hasson, Simmons, & Todorov, 2005; Schul, Mayo, & Burnstein, 2008), it seems to require a high degree of attention, considerable implausibility of the message, or high levels of distrust at the time the message is received. So, in most situations, the deck is stacked in favor of accepting information rather than rejecting it, provided there are no salient markers that call the speaker’s intention of cooperative conversation into question. Going beyond this default of acceptance requires additional motivation and cognitive resources: If the topic is not very important to you, or you have other things on your mind, misinformation will likely slip in.

Additionally, repeated exposure to a statement increases the likelihood that it will be accepted as true.

Repeated exposure to a statement is known to increase its acceptance as true (e.g., Begg, Anas, & Farinacci, 1992; Hasher, Goldstein, & Toppino, 1977). In a classic study of rumor transmission, Allport and Lepkin (1945) observed that the strongest predictor of belief in wartime rumors was simple repetition. Repetition effects may create a perceived social consensus even when no consensus exists. Festinger (1954) referred to social consensus as a “secondary reality test”: If many people believe a piece of information, there’s probably something to it. Because people are more frequently exposed to widely shared beliefs than to highly idiosyncratic ones, the familiarity of a belief is often a valid indicator of social consensus.

Even providing corrections next to misinformation leads to the misinformation spreading.

A common format for such campaigns is a “myth versus fact” approach that juxtaposes a given piece of false information with a pertinent fact. For example, the U.S. Centers for Disease Control and Prevention offer patient handouts that counter an erroneous health-related belief (e.g., “The side effects of flu vaccination are worse than the flu”) with relevant facts (e.g., “Side effects of flu vaccination are rare and mild”). When recipients are tested immediately after reading such hand-outs, they correctly distinguish between myths and facts, and report behavioral intentions that are consistent with the information provided (e.g., an intention to get vaccinated). However, a short delay is sufficient to reverse this effect: After a mere 30 minutes, readers of the handouts identify more “myths” as “facts” than do people who never received a hand-out to begin with (Schwarz et al., 2007). Moreover, people’s behavioral intentions are consistent with this confusion: They report fewer vaccination intentions than people who were not exposed to the handout.

The ideal solution is to cut off the flow of misinformation and reinforce the facts instead.

this post was submitted on 23 Jul 2023
219 points (100.0% liked)
