submitted 1 year ago* (last edited 1 year ago) by activistPnk@slrpnk.net to c/lemmy_support@lemmy.ml

The problem:

The web has obviously reached a high level of #enshittification. Paywalls, exclusive walled gardens, #Cloudflare, popups, CAPTCHAs, Tor blockades, dark patterns (especially with cookies), JavaScript that turns the website into an app (not a document), etc.

Status quo solution (failure):

#Lemmy & the #threadiverse were designed on the inherent trust that humans will only post links to non-shit websites, and only upvote content that either has no links or links only to non-shit venues.

It’s not working. The social approach is a systemic failure.

The fix:

  • stage 1 (metrics collection): There need to be enshittification metrics for every link. Readers should be able to click a “this link is shit” button on a per-link basis & there should be tick boxes to indicate the particular variety of shit that it is (see the sketches after this list).

  • stage 2 (metrics usage): If many links with the same hostname show a pattern of matching enshittification factors, the Lemmy server should automatically tag all those links with a warning of some kind (e.g. ⚠, 💩, 🌩).

  • stage 3 (inclusive alternative): A replacement link to a mirror is offered. E.g. youtube → (non-CF’d invidious instance), cloudflare → archive.org, medium.com → (random scribe.rip instance), etc.

  • stage 4 (onsite archive): Good samaritans and over-achievers should have the option to provide the full text for a given link so others can read the article without even fighting the site.

  • stage 5 (search reranking): Whenever a human posts a link and talks about it, search crawlers notice and give that site a high ranking. This is partly why search results have gotten lousy: the social approach has failed, and humans will keep posting bad links. So links with a high enshittification score need to be obfuscated in some way (e.g. dots become asterisks) so search crawlers don’t overrate them going forward.
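
For concreteness, here is a minimal sketch of what stages 1 and 2 could look like on the server side. Python for brevity; everything here (the ANTI_FEATURES set, the report_link/tag_for names, the threshold of 10) is made up for illustration, not a proposed API:

```python
from collections import Counter, defaultdict
from urllib.parse import urlparse

# Stage 1: the tick boxes. Hypothetical anti-feature categories.
ANTI_FEATURES = {"paywall", "cloudflare", "popup", "captcha", "tor_block", "js_app"}

# reports[hostname][anti_feature] -> number of "this link is shit" clicks
reports = defaultdict(Counter)

def report_link(url: str, anti_feature: str) -> None:
    """Record one reader's per-link report of a specific anti-feature."""
    if anti_feature not in ANTI_FEATURES:
        raise ValueError(f"unknown anti-feature: {anti_feature}")
    host = urlparse(url).hostname or ""
    reports[host][anti_feature] += 1

# Stage 2: once a hostname shows a pattern, tag every link to it.
THRESHOLD = 10  # arbitrary cutoff for this sketch
WARNING_TAGS = {"cloudflare": "🌩"}  # everything else falls back to ⚠

def tag_for(url: str):
    """Return a warning emoji for the link, or None if the host looks clean."""
    host = urlparse(url).hostname or ""
    for feature, count in reports[host].most_common():
        if count >= THRESHOLD:
            return WARNING_TAGS.get(feature, "⚠")
    return None
```

A real implementation would persist this in the instance’s database and weigh reports per user to resist ballot stuffing, but the shape of the data is the point.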
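
Stages 3 and 5 are mostly string surgery on the stored URL. Again a rough sketch; the mirror mappings are illustrative (inv.example stands in for whichever non-Cloudflared Invidious instance an admin configures):

```python
from urllib.parse import urlparse, urlunparse

# Stage 3: substitute known-hostile hostnames with friendlier mirrors.
MIRRORS = {
    "youtube.com": "inv.example",      # hypothetical non-CF'd Invidious instance
    "www.youtube.com": "inv.example",
    "medium.com": "scribe.rip",
}

def mirror_url(url: str) -> str:
    """Offer a mirror link when one is known, otherwise pass through."""
    parts = urlparse(url)
    mirror = MIRRORS.get(parts.hostname or "")
    return urlunparse(parts._replace(netloc=mirror)) if mirror else url

# Stage 5: obfuscate high-scoring links so crawlers don't count them
# as endorsements (dots become asterisks, per the proposal above).
def obfuscate(url: str) -> str:
    return url.replace(".", "*")
```

E.g. mirror_url("https://medium.com/@foo/bar") yields "https://scribe.rip/@foo/bar", while obfuscate leaves the link human-readable but useless to a crawler.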

This needs to be recognized as a #LemmyBug.

[-] activistPnk@slrpnk.net -5 points 1 year ago* (last edited 1 year ago)

The browser (more appropriately named: client) indeed needs some of the logic here, but it cannot do the full job I’ve outlined. The metrics need to be centralized. And when you say browser specifically, that imposes an unreasonable amount of effort & expertise on the end-user. A dedicated client can make it easy on the user, but it’s an incomplete solution nonetheless.

[-] rglullis@communick.news 5 points 1 year ago

> The metrics need to be centralized.

Why? And how would you guarantee the integrity of the ones holding the metrics?

> that imposes an unreasonable amount of effort & expertise on the end-user.

A lot less effort than having to deal with the different "features" that each website admin decides to run on their own.

[-] activistPnk@slrpnk.net -4 points 1 year ago* (last edited 1 year ago)

> Why?

  1. It’s a big database. It would be a poor design to replicate a DB of all links in every single client.
  2. Synchronization of the DB would not be cheap. When Bob says link X has anti-feature Y, that information must then be shared with tens of thousands of other users (see the back-of-envelope sketch below).
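
To put rough numbers on point 2 (every figure below is invented purely for illustration):

```python
clients = 50_000         # hypothetical active clients across instances
reports_per_day = 5_000  # hypothetical "this link is shit" clicks per day

# Client-side DB: every report must eventually reach every client.
fanout = reports_per_day * clients   # 250,000,000 messages/day

# Server-side DB: every report is one write to one shared table.
writes = reports_per_day             # 5,000 writes/day
```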

Perhaps you have a more absolute idea of centralized. Mastodon votes, for example, are centralized on each node, but overall that’s actually decentralized. My bad; I probably shouldn’t have said centralized. I meant more centralized than a client-by-client basis. It’s too early to pin those details down at this point, other than to say it’s crazy for each client to maintain a separate copy of that DB.

> And how would you guarantee the integrity of the ones holding the metrics?

The server is much better equipped than the user for that. The guarantee would be the same guarantee that you have with Mastodon votes: good enough to be fit for purpose. For any given Mastodon poll, everyone sees only a subset of the votes, but that’s fine. Perfection is not critical here. You wouldn’t want it to decide a general election, but you don’t need that level of integrity.

> A lot less effort than having to deal with the different “features” that each website admin decides to run on their own.

That doesn’t make sense. Either one person upgrades their Lemmy server, or thousands of people have to install, configure, and maintain a dozen different browser plugins ported to a variety of different browsers (close enough to impossible to just call it impossible). Then every Lemmy client also has to replicate that complexity.
