also, sincerely, can anyone explain to me what’s good about Yud’s writing? this shit is structured exactly like a goosebumps short, except instead of being written by a likeable author targeting grade schoolers, it’s written by some asshole who loves using concepts he doesn’t understand, targeting other assholes who don’t understand fucking anything because all their knowledge got filtered through Yud
I don't think there's anything good about the writing, but there are a few things that stand out in terms of the mechanics employed and the effect they appear to be aiming for
even this horrible essay pulled the infomercial "but wait, there's more!" move at least 5 times, a terrible Plot Twist because he can't figure out how to layer his story devices any better
feel free!
edit: I read through the sequences three times: once on the site, once as an epub and once reading every post on LW main from 2007-2011 in order of posting. I can state that I have Done The Fucking Reading. The sequences finished in 2009, then you can see the site get weirder as people riff off them, up to the basilisk post in mid-2010. At that point everyone noticeably cools it on the weirdness and the site's haunted by a post nobody will talk out loud about. Then HPMOR takes off and the site has a new recruiting point.
Why would you do this
do you think i'd be here if i had good judgement
oh absolutely, and check out the ridiculous amount of ideological priming Yud does in this post. one example:
(and it’s very funny to me that a number of comments are “oh I had no idea this was about AI until the end…!”, how young are these kids you’re programming, Yud?)
in general, the ridiculous amount of slog going in, combined with the regular priming, reminds me a lot of another sci-fi flavored cult I know, if you get my meaning
oh yeah the complexity and effort are almost certainly part of the point - people don't like to admit they got swindled or wasted their time, and ostensibly-clever people are just as capable of falling victim to this as anyone else