
https://nonesense.substack.com/p/lesswrong-house-style

Given that they are imbeciles given, occasionally, to dangerous ideas, I think it’s worth taking a moment now and then to beat them up. This is another such moment.

top 9 comments
[-] swlabr@awful.systems 5 points 9 hours ago

Such a good post. LWers are either incapable of critical thought and self-scrutiny, or are unwilling and think verbal diarrhea is a better choice.

[-] self@awful.systems 10 points 13 hours ago

This is obviously insane, the correct conclusion is that learning models cannot in fact be trained so hard that they will always get the next token correct. This is provable, and it’s not even hard to prove. It’s intuitively obvious, and a burly argument that backs the intuition is easy to build.

You do, however, have to approach it through analogies, through toy models. When you insist on thinking about the whole thing at once, you wind up essentially just saying things that feel right, things that are appealing. You can’t actually reason about the damned thing at all.

this goes a long way towards explaining why computer pseudoscience — like a fundamental ignorance of algorithmic efficiency and the implications of the halting problem — is so common and even celebrated among lesswrongers and other TESCREALs who should theoretically know better

[-] SubArcticTundra@lemmy.ml 2 points 13 hours ago

I'm out of the loop: what is lesswrong and why is it cringe?

[-] Soyweiser@awful.systems 4 points 3 hours ago

RationalWiki (not affiliated with LW rationalists; the opposite, actually. OP is a mod there) has a page on it. https://rationalwiki.org/wiki/Less_wrong

[-] captainlezbian@lemmy.world 2 points 50 minutes ago

That sounds like a religion insisting it isn’t one

[-] Architeuthis@awful.systems 7 points 5 hours ago* (last edited 5 hours ago)

It's complicated.

It's basically a forum created to venerate the works and ideas of that guy who, in the first wave of LLM hype, had an editorial published in TIME calling for a worldwide moratorium on AI research and GPU sales, to be enforced with unilateral airstrikes. Its core audience got there by being groomed by one of the most obnoxious Harry Potter fanfictions ever written, by said guy.

Their function these days tends to be providing an ideological backbone of bad sci-fi justifications for deregulation and the billionaire takeover of the state, which among other things has made them hugely influential in the AI space.

They are also communicating vessels with Effective Altruism.

If this piques your interest, check the links in the sidebar.

[-] SubArcticTundra@lemmy.ml 1 point 2 hours ago

They are also communicating vessels with Effective Altruism.

I have a basic understanding of what EA is but what do you mean by communicating vessels?

[-] zbyte64@awful.systems 7 points 11 hours ago

They're basically fanboys of whatever cult is currently coming out of Silicon Valley.

this post was submitted on 30 Nov 2024
15 points (100.0% liked)

SneerClub

983 readers

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago