[-] KoboldCoterie@pawb.social 22 points 1 week ago

> Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?
>
> Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.

This feels like an awful argument to make. It's not the presence of those things that makes Meta and co. so shit; it's the fact that they provably understood the risks and the effects their design was having, knew that it was harming people, and continued to do it anyway. I don't care if we're talking about a little forum run by a grandma and grandpa sharing their jam recipes; if they know they're causing harm and don't change their behavior, they should be liable.

[-] HeartyOfGlass@piefed.social 12 points 1 week ago

"We designed, marketed, and sold the gun, but we didn't think anyone would use it."

[-] KoboldCoterie@pawb.social 6 points 1 week ago

It's like if someone had a forum where insurrectionists were discussing how to build bombs and where they were going to use them, and the owners had an internal meeting where they said, "Hey, we're hosting some pretty awful people, should we maybe report them or shut this down?" and the answer was, "Nah, they're paying users, and we want their money."

Pretty sure Section 230 wouldn't protect them, either.

[-] Chulk@lemmy.ml 5 points 1 week ago

Yeah, this feels very much like, "censor content, but don't change Meta's practices."

Which raises the question: does the author know what they're cheering for?

[-] atrielienz@lemmy.world 2 points 1 week ago

The harm doesn't come from infinite scroll, autoplay, or algorithmic recommendations in a vacuum.

But we have statistically proven that when you gamify the system and the content is harmful to consume in excess, those two factors together are what make it dangerous.

Tricking the brain into doing something harmful to itself through gamification is the problem. The algorithm, autoplay, and infinite scroll are just mechanisms that facilitate it. Novelty plays only a small part. The algorithm by itself doesn't provide a dopamine hit. Infinite scroll by itself doesn't provide a dopamine hit. Autoplay by itself doesn't provide a dopamine hit.

Even when you combine all three, the dopamine hit won't come if the content being pushed isn't sufficient to trigger it. That rush often comes from things like upvotes and downvotes, badges, achievements, and follower counts. The very metrics individual users rely on for dopamine are being weaponized against them to make money, and that was intentional on the part of Meta execs.

[-] XLE@piefed.social 1 points 1 week ago

It's like he's describing a slot machine with unpainted wheels, leaving out the context that it's in a casino with a big "paint me and enjoy a share of the profit" sign above it.

The social media machine was designed to be a self-serve addiction generator. It intentionally used every trick it could legally get away with.

[-] lmmarsano@group.lt -2 points 1 week ago* (last edited 1 week ago)

I don't know. Seems like self-control issues. People can get addicted to anything: shopping, sex, internet use, work, gaming, exercise. I also disagree with prohibitions on gambling, drug use, prostitution: it's their money, their body, etc.

Penalizing systems of communication & information delivery seems like overreach. The harm seems phony & easily averted by basic self-control.

[-] KoboldCoterie@pawb.social 2 points 1 week ago* (last edited 1 week ago)

Addictive Personality is a proposed set of traits that makes sufferers more vulnerable to developing addictive behaviors, including things like gambling or social media use. Does it help to frame it in a different light if you think of it as those companies exploiting vulnerable people's disorders to extract money from them?

Telling those people to just have self control is like telling someone with depression to just stop being sad.

[-] hitmyspot@aussie.zone 3 points 1 week ago

Or telling someone stupid to be more clever, as the case may be.

this post was submitted on 26 Mar 2026
15 points (89.5% liked)

Technology
