Right, but since YouTube and Facebook are two of the most popular sites in the world, they aren't really magnets for alt-right crazies specifically, since they appeal to almost everybody.
right, but “everybody” aren’t the ones committing mass shootings all the time. that’s an alt-right crazies problem.
Ok so isn't the issue at hand whether the sites are to blame?
let’s break this down so I can answer you in what I think is an honest way:
1 - this is for a court to decide. I’m not familiar enough with the specifics of case law or with the suits being brought to know exactly what is being alleged, etc. I can’t opine on this other than to say that, from what I do know, it’s unlikely that a court would hold these sites legally responsible.
2 - I fully believe that, yes, massive, general-use public sites like these have a social and moral responsibility to keep their platforms safe. How that should work and what it means in practice is a matter for much debate, and I’m sure people here will do just that.
3 - is there overlap? again, legally, I’m not sure, but there might be, and in the near future, there might be much more. also, should there be more? another subject for debate.
I didn't say they were. Facebook and YouTube didn't commit the shootings, and there isn't anything particularly special about them that would disproportionately attract the alt-right crazies. They're not hate sites.
lmao… that’s a good one
YouTube’s algorithm seems to be funneling people to alt-right videos
Feeding Hate With Video: A Former Alt-Right YouTuber Explains His Methods
‘Carol’s Journey’: What Facebook knew about how it radicalized users
'It let white supremacists organize': the toxic legacy of Facebook's Groups
this is just scratching the surface…
a great video essay on the subject:
The Alt-Right Playbook: How to Radicalize a Normie (CW/TW)
The Trump supporters like to bitch that Facebook has been censoring their opinions, especially during 2020 and 2021. They felt the same way about Twitter until Elon turned it into a hell hole.
Trump and his supporters complained that the 2016 election was rigged even after he won.
They’ll say anything to claim victimhood, most often when the opposite is true.
Yeah, I remember that. Unfortunately, I know Trump supporters who still think the elections were rigged.
They aren't being sued for being "magnets."