[-] blakestacey@awful.systems 15 points 4 months ago

Oh, but they're just very concerned that American schools are forcing all the children into the same mold, don't you see

gag me with a fucking spoon

[-] blakestacey@awful.systems 15 points 6 months ago

The collapse of FTX also caused a reduction in traffic and activity of practically everything Effective Altruism-adjacent

Uh-huh.

[-] blakestacey@awful.systems 15 points 10 months ago

... "Coming of Age" also, oddly, describes another form of novel cognitive dissonance; encountering people who did not think Eliezer was the most intelligent person they had ever met, and then, more shocking yet, personally encountering people who seemed possibly more intelligent than himself.

The latter link is to "Competent Elites", a.k.a. "Yud fails to recognize that cocaine is a helluva drug".

I've met Jurvetson a few times. After the first I texted a friend: “Every other time I’ve met a VC I walked away thinking ‘Wow, I and all my friends are smarter than you.’ This time it was ‘Wow, you are smarter than me and all my friends.’”

Uh-huh.

Quick, to the Bat-Wikipedia:

On November 13, 2017, Jurvetson stepped down from his role at DFJ Venture Capital in addition to taking leave from the boards of SpaceX and Tesla following an internal DFJ investigation into allegations of sexual harassment.

Not smart enough to keep his dick in his pants, apparently.

Then, from 2006 to 2009, in what can be interpreted as an attempt to discover how his younger self made such a terrible mistake, and to avoid doing so again, Eliezer writes the 600,000 words of his Sequences, by blogging “almost daily, on the subjects of epistemology, language, cognitive biases, decision-making, quantum mechanics, metaethics, and artificial intelligence”

Or, in short, cult shit.

Between his Sequences and his Harry Potter fanfic, come 2015, Eliezer had promulgated his personal framework of rational thought — which was, as he put it, “about forming true beliefs and making decisions that help you win” — with extraordinary success. All the pieces seemed in place to foster a cohort of bright people who would overcome their unconscious biases, adjust their mindsets to consistently distinguish truth from falseness, and become effective thinkers who could build a better world ... and maybe save it from the scourge of runaway AI.

Which is why what happened next, explored in tomorrow’s chapter — the demons, the cults, the hells, the suicides — was, and is, so shocking.

Or not. See above, RE: cult shit.

[-] blakestacey@awful.systems 15 points 1 year ago

I love the "uh, you know we can see what you've done" reply.

[-] blakestacey@awful.systems 15 points 1 year ago

First, learn the difference between scorn or disdain and hate.

Second, read the comments in the thread already made about those "'sort of' correct" predictions.

[-] blakestacey@awful.systems 15 points 1 year ago* (last edited 1 year ago)

I can barely get past the image caption. "An AI made this". OK, and what did you ask it for, "random shit"?

And then there's the section that seems implicitly to be arguing that we should take the risk estimates made on "internet rationality forums" seriously because they totally called the COVID crisis, you guys... Well, they did a better job than an economist, anyway.

[-] blakestacey@awful.systems 15 points 2 years ago* (last edited 2 years ago)

Suppose there are five true heresies, but anyone who's on the record as believing more than one gets burned as a witch.

Two heresies leave Chicago traveling at 90 km/h and 100 km/h

Jessica asked if Yudkowsky denouncing neoreaction and the alt-right would still seem harmful, if he were also to acknowledge, e.g., racial IQ differences?

uh

I agreed that that would be better, but realistically, I didn't see why Yudkowsky should want to poke that hornet's nest.

uhhhhhhhhh

[-] blakestacey@awful.systems 15 points 2 years ago

"If you take only the statements where I was vague instead of the ones where I was explicitly wrong and interpret my words in the way that I am now telling you to, you will see that I am right."

[-] blakestacey@awful.systems 15 points 2 years ago

my "not a cult" T-shirt has raised many questions, etc.

[-] blakestacey@awful.systems 15 points 2 years ago

“covalently bonded” bacteria

what an amazing theoretical possibility

[-] blakestacey@awful.systems 15 points 2 years ago

tmy;dr

(too much Yud; didn't read)

[-] blakestacey@awful.systems 15 points 2 years ago

I didn't expect that the repetition of a banal yet occasionally useful saying like "the map is not the territory" could make a person deserve being shoved into a locker, but life will surprise us all.

Mixed in with the rank, fetid ego are amusing indications that Yud gave very little thought to what Bayesian probability actually means. I find that entertaining.
