[-] TinyTimmyTokyo@awful.systems 3 points 2 months ago

I should probably mention that this person went on to write other comments in the same thread, revealing that they're still heavily influenced by Bay Area rationalism (or what one other commenter brilliantly called "ritual multiplication").

[-] TinyTimmyTokyo@awful.systems 5 points 2 months ago

The story has now hopped to the orange site. I was expecting a shit-show, but there have been a few insightful comments from critics of the rationalists. This one from "rachofsunshine" for instance:

[Former member of that world, roommates with one of Ziz's friends for a while, so I feel reasonably qualified to speak on this.]

The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.

As relevant here:

  1. While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to...

  2. Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is "justified" to prevent a speck of dust in the eye of eternity. When the thing you're trying to create is infinitely good or the thing you're trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.

  3. Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of "anyone who would criticize us for any reason is a bad person who is lying to cause us harm". That kind of framing can't help but get culty.

  4. The nature of being a "freethinker" is that you're at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you'll get stuck in it, because there's no external "drag" or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you've got a culty environment that is particularly susceptible to internally-consistent madness, and finally:

  5. It's a bunch of very weird people who have nowhere else they feel at home. I totally get this. I'd never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There's some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)

TLDR: isolation, very strong in-group defenses, logical "doctrine" that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz's group is only one of several.
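Points 1 and 2 in that quote are easy to make concrete with a few lines of Python. The numbers below are mine, purely for illustration: treat each "almost certainly" as an independent step you're 95% confident in, and the confidence in the end of the chain decays fast; and once an infinite payoff enters an expected-value calculation, the probability term stops mattering at all.

```python
import math

# Point 1: chained "almost certainly" steps decay multiplicatively
# (assuming, for illustration, independent steps at 95% confidence each).
def chain_confidence(step_prob: float, num_steps: int) -> float:
    return step_prob ** num_steps

for steps in (1, 5, 10, 20):
    print(f"{steps:2d} steps at 95% each -> {chain_confidence(0.95, steps):.0%} overall")
#  1 steps at 95% each -> 95% overall
#  5 steps at 95% each -> 77% overall
# 10 steps at 95% each -> 60% overall
# 20 steps at 95% each -> 36% overall

# Point 2: an infinite payoff swamps any finite probability, so the
# "expected value" no longer distinguishes a dust speck from a murder.
print(1e-12 * math.inf)  # inf
print(0.99 * 1.0)        # 0.99 -- finite, and therefore "smaller"
```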

[-] TinyTimmyTokyo@awful.systems 4 points 1 year ago

Sorry for the off-topic rant, but WTF is Emile Torres doing on twitter? Anytime I see someone creating content for that Nazi hellsite, I start looking at them differently.

[-] TinyTimmyTokyo@awful.systems 5 points 1 year ago

I'd really like to know the back story on this interview too. I realize weirdness isn't exactly distinctive when it comes to rationalists, but Zack is in a league of his own.

[-] TinyTimmyTokyo@awful.systems 3 points 2 years ago* (last edited 2 years ago)

Lots of fascinating links in this article. This one in particular caught my eye:

If you're searching for Scott Siskind... I am Scott Siskind from Ann Arbor, Michigan. There used to be more things on this webpage. Right now I'm using it to spread the message that there are multiple statements being falsely attributed to me on the Internet. Somebody who doesn't like me - I am not sure who, but I work in mental health and guess this is sort of a professional hazard - has been trying to systematically discredit me by posting racist and profanity-laden things under my name. Some of the comments make some effort to convince, like linking back to my website. The end result is that if you Google me to try to find out what I am like, you will probably end up seeing angry racist profanity-laden comments made under my name. These are not mine.

Does anyone know the backstory here? This reads to me like a "hackers ate my password" story -- the kind of ass-covering someone might concoct after their racist writings accidentally leaked onto the internet.

EDIT: This seems to be related to the stuff Topher Brennan revealed? Except it was written many years before Topher's revelations. It's confusing...

[-] TinyTimmyTokyo@awful.systems 6 points 2 years ago

Here's a link to the original.

[-] TinyTimmyTokyo@awful.systems 4 points 2 years ago

That reminds me. If the world is about to FOOM into a kill-all-humans doomscape, why is he wasting time worrying about seed oils?

[-] TinyTimmyTokyo@awful.systems 3 points 2 years ago

Stephen Jay Gould's The Mismeasure of Man is always a good place to start.

[-] TinyTimmyTokyo@awful.systems 3 points 2 years ago

Random blue check spouts disinformation about "seed oils" on the internet. Same random blue check runs a company selling "safe" alternatives to seed oils. Yud spreads this huckster's disinformation further. In the process he reveals his autodidactically obtained expertise in biology:

Are you eating animals, especially non-cows? Pigs and chickens inherit linoleic acid from their feed. (Cows reprocess it more.)

Yes, Yud, because that's how it works. People directly "inherit" organic molecules totally unmetabolized from the animals they eat.

I don't know why Yud is fat, but armchair sciencing probably isn't going to fix it.

[-] TinyTimmyTokyo@awful.systems 2 points 2 years ago* (last edited 2 years ago)

In theory, a prediction market can work. The idea is that even though there are a lot of uninformed people making bets, their bad predictions tend to cancel each other out, while the subgroup of experts within that crowd converges on a good prediction. The problem is that this only holds under idealized conditions. As soon as the bettor pool is skewed by a subpopulation that shares the same bias, the mechanism breaks down. And that's exactly what happens with the rationalist crowd. The main benefit rationalists get from prediction markets and wagers is an unfounded confidence that their ideas have merit. Prediction markets also have a long history in libertarian circles, which probably helps explain why rationalists are so keen on them.
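Here's a quick back-of-the-envelope simulation of that failure mode. It's a toy model with made-up numbers (the "price" is just the average of participants' probability estimates, which is not how any real market clears), but it shows the point: unbiased noise washes out, a shared bias doesn't.

```python
import random

random.seed(1)
TRUE_PROB = 0.5  # hypothetical true probability of the event

def market_estimate(n_experts, n_noise, n_biased, biased_belief=0.8):
    beliefs = []
    beliefs += [random.gauss(TRUE_PROB, 0.02) for _ in range(n_experts)]     # well-calibrated experts
    beliefs += [random.uniform(0.0, 1.0) for _ in range(n_noise)]            # uninformed guesses, which average out to 0.5 (the truth here)
    beliefs += [random.gauss(biased_belief, 0.05) for _ in range(n_biased)]  # a bloc sharing the same systematic bias
    return sum(beliefs) / len(beliefs)

print("experts + unbiased noise:", round(market_estimate(50, 1000, 0), 2))      # stays near 0.5
print("add a biased subpopulation:", round(market_estimate(50, 1000, 500), 2))  # drifts toward 0.8
```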

[-] TinyTimmyTokyo@awful.systems 1 points 2 years ago

If Books Could Kill is great. I believe the first episode was about Freakonomics, another of those incredibly popular books based on behavioral economics. They took it apart.
