[-] TinyTimmyTokyo@awful.systems 7 points 2 months ago

Lots of discussion on the orange site post about this today.

(I mentioned this in the other sneerclub thread on the topic but reposted it here since this seems to be the more active discussion zone for the topic.)

[-] TinyTimmyTokyo@awful.systems 5 points 2 months ago

The story has now hopped to the orange site. I was expecting a shit-show, but there have been a few insightful comments from critics of the rationalists. This one from "rachofsunshine" for instance:

[Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]

The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.

As relevant here:

  1. While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending toward overly formalist approaches, lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to...

  2. Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.

  3. Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.

  4. The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:

  5. It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There's some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)

TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
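The chaining point in (1) is easy to quantify: even if each implication is individually near-certain, confidence decays multiplicatively across a chain of independent steps. A minimal sketch (plain Python, illustrative numbers only, not from the quoted comment):

```python
# If each step A -> B holds with probability 0.9, and the steps are
# independent, a ten-step chain only supports its conclusion with ~0.35 --
# far from the "almost certainly" each individual step advertised.
def chain_confidence(per_step: float, steps: int) -> float:
    """End-to-end confidence for a chain of independent implication steps."""
    return per_step ** steps

print(chain_confidence(0.9, 1))   # 0.9
print(chain_confidence(0.9, 10))  # ~0.349
```

Treating each "almost certainly" as a plain "certainly" is exactly the rounding error the quoted comment describes: the formal syllogism is valid, but the probability mass leaks at every step.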

[-] TinyTimmyTokyo@awful.systems 6 points 8 months ago

But will my insurance cover a visit to Dr. Spicy Autocomplete?

[-] TinyTimmyTokyo@awful.systems 8 points 1 year ago

Glowfic feels like a writing format designed in a lab to be the perfect channel for Eliezer's literary diarrhea.

[-] TinyTimmyTokyo@awful.systems 8 points 1 year ago

All factual and counter-factual statements are evidence for my position. Heads I win, tails you lose.

[-] TinyTimmyTokyo@awful.systems 7 points 1 year ago

"Fucking probabilities, how do they work?"

[-] TinyTimmyTokyo@awful.systems 6 points 1 year ago

I like the way he thinks the lack of punctuation in his "joke" is the tell that it's a joke.

He's also apparently never heard the aphorism that if you have to explain the joke, it's probably not that funny.

[-] TinyTimmyTokyo@awful.systems 6 points 2 years ago

Here's a link to the original.

[-] TinyTimmyTokyo@awful.systems 7 points 2 years ago

My attention span is not what it used to be, and I couldn't force myself to get to the end of this. A summary or TLDR (on the part of the original author) would have been helpful.

What is it with rationalists and their inability to write with concision? Is there a gene for bloviation that also predisposes them to the cult? Or are they all just mimicking Yud's irritating style?

[-] TinyTimmyTokyo@awful.systems 8 points 2 years ago

Is it wrong to hope they manage to realize one of these libertarian paradise fantasies? I'd really love to see how quickly it devolves into a Mad Max Thunderdome situation.

[-] TinyTimmyTokyo@awful.systems 7 points 2 years ago

What's it like to be so good at PR?

[-] TinyTimmyTokyo@awful.systems 6 points 2 years ago* (last edited 2 years ago)

This is good:

Take the sequence {1,2,3,4,x}. What should x be? Only someone who is clueless about induction would answer 5 as if it were the only answer (see Goodman’s problem in a philosophy textbook or ask your closest Fat Tony) [Note: We can also apply here Wittgenstein’s rule-following problem, which states that any of an infinite number of functions is compatible with any finite sequence. Source: Paul Boghossian]. Not only clueless, but obedient enough to want to think in a certain way.
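The rule-following point is easy to demonstrate: polynomial interpolation will happily produce a rule that agrees with 1, 2, 3, 4 and then "continues" with any value you like. A throwaway sketch (pure-Python Lagrange interpolation; the choice of 42 is arbitrary):

```python
def lagrange_value(points, x):
    """Evaluate the unique polynomial through `points` at x (Lagrange form)."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# A degree-4 polynomial that matches 1, 2, 3, 4 exactly, then outputs 42:
pts = [(1, 1), (2, 2), (3, 3), (4, 4), (5, 42)]
print([round(lagrange_value(pts, k)) for k in range(1, 6)])  # [1, 2, 3, 4, 42]
```

Swap the 42 for anything else and you get a different, equally "lawful" rule that fits the visible data perfectly, which is the whole point.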

Also this:

If, as psychologists show, MDs and academics tend to have a higher “IQ” that is slightly informative (higher, but on a noisy average), it is largely because to get into schools you need to score on a test similar to “IQ”. The mere presence of such a filter increases the visible mean and lowers the visible variance. Probability and statistics confuse fools.
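The filter effect is straightforward to simulate: truncating a noisy distribution at an admission cutoff raises the observed mean and shrinks the observed spread, with no change to the underlying population. A rough sketch using Python's stdlib `random` (the cutoff and parameters are arbitrary, chosen only for illustration):

```python
import random
import statistics

random.seed(0)

# Noisy population scores, then an admissions filter at a fixed cutoff.
population = [random.gauss(100, 15) for _ in range(100_000)]
admitted = [score for score in population if score >= 115]

# The admitted group shows a higher mean and a smaller standard deviation
# than the population it was drawn from -- purely an artifact of the filter.
print(statistics.mean(population), statistics.stdev(population))
print(statistics.mean(admitted), statistics.stdev(admitted))
```

For a cutoff one standard deviation above the mean, the truncated group's standard deviation drops to well under half the population's, which is why comparing variances across a selection filter is misleading.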

And:

If someone came up w/ a numerical “Well-Being Quotient” WBQ or “Sleep Quotient” SQ, trying to mimic temperature or a physical quantity, you’d find it absurd. But put enough academics w/ physics envy and race hatred on it and it will become an official measure.

