this post was submitted on 26 Mar 2026
Technology
This feels like an awful argument to make. It's not the presence of those things that makes Meta and co so shit; it's the fact that they provably understood the risks and the effects their design was having, knew it was harming people, and continued anyway. I don't care if we're talking about a little forum run by a grandma and grandpa sharing their jam recipes; if they know they're causing harm and don't change their behavior, they should be liable.
"We designed, marketed, and sold the gun, but we didn't think anyone would use it."
It's like if someone had a forum where insurrectionists were discussing how to build bombs and where they were going to use them, and the owners had an internal meeting where they said, "Hey, we're hosting some pretty awful people, should we maybe report them or shut this down?" and the answer was, "Nah, they're paying users, and we want their money."
Pretty sure Section 230 wouldn't protect them, either.
https://youtu.be/ekg45ub8bsk?t=52
Entire clip: https://youtu.be/ekg45ub8bsk
Yeah this feels very much like, "censor content, but don't change Meta's practices"
Which raises the question: does the author know what they're cheering for?
The harm doesn't come from infinite scroll, autoplay, or algorithmic feeds in a vacuum.
But we have good evidence that when you gamify the system and the content is harmful to consume in excess, those two factors together are what make it dangerous.
Tricking the brain into doing something harmful to itself through gamification is the problem; the algorithm, autoplay, and infinite scroll are just mechanisms that facilitate it. Novelty plays only a small part. The algorithm by itself doesn't provide a dopamine hit. Infinite scroll by itself doesn't. Autoplay by itself doesn't.
Even combining all three, the dopamine hit won't come if the content being pushed isn't enough to trigger it. That rush often comes from upvotes and downvotes, badges, achievements, and follower counts: the metrics individual users chase for dopamine are being weaponized against them to make money. And it was intentional on the part of Meta execs.
It's like he's describing a slot machine with unpainted wheels, leaving out the context that it's in a casino with a big "paint me and enjoy a share of the profit" sign above it.
The social media machine was designed to be a self-serve addiction generator. It intentionally used every trick it could legally get away with.
I don't know. Seems like a self-control issue. People can get addicted to anything: shopping, sex, internet use, work, gaming, exercise. I also disagree with prohibitions on gambling, drug use, prostitution: it's their money, their body, etc.
Penalizing systems of communication and information delivery seems like overreach. The harm seems phony and averted by basic self-control.
Addictive personality is a proposed set of traits that makes sufferers more vulnerable to developing addictive behaviors, including things like gambling or social media use. Does it help to frame it in a different light if you think of it as those companies exploiting vulnerable people's disorders to extract money from them?
Telling those people to just have self control is like telling someone with depression to just stop being sad.
Or telling someone stupid to be more clever, as the case may be.