Honestly we should probably regulate these algorithms in general. People like Andrew Tate are a problem, but not the only problem.
My mother went down a conspiracy rabbit hole and never came back out again. You’d be surprised how short the pipeline from gardening, to arts and crafts, to crunchiness, to antisemitism, homophobia, misogyny, new world orders, and all that bs is.
The big question is how? The algorithms aren't the root cause of the problem, they are just amplifying natural human behaviour.
People have always fallen down these rabbit holes and any algorithm based on predicting what a person will be interested in will suffer a similar problem. How can you regulate what topics a person is interested in?
My theory is that society has a suppressing effect on these things. It's not nice to be a Nazi, or to mistreat people you don't like, so these things get hidden.
Algorithms do the opposite. Now someone with Nazi tendencies is surrounded by them and encouraged. Posts hating trans people get pushed by algorithms because they drive engagement (even if all the initial responses are negative, it's still engagement to the algorithm, which will then boost the 'popular' post).
Things like Lemmy and Mastodon don't do that, and end up nicer places as a result.
You're mostly right about society, but the problem isn't algorithms, it's echo chambers. The KKK wasn't driven by an algorithm but still radicalised people in the same way: once you're able to find a bubble within society that accepts your views, it's very easy for your views to grow more extreme. It doesn't matter whether that's fascism, racism, communism, no-fap or hydrohomies - the mechanisms work the same way.
Reddit was arguably no more algorithm-led than Lemmy or Mastodon, but that hasn't prevented the rise of a whole list of hate-fuelled subs over there. The root problem is that people with Nazi tendencies find pro-Nazi content engaging. The algorithm isn't pushing it upon them; it's just delivering what they want.