The MIRI/CFAR/EA triumvirate promised not just that you could be the hero of your own story but that your heroism could be deployed in the service of saving humanity itself from certain destruction. Is it so surprising that this promise attracted people who were not prepared to be bit players in group housing dramas and abstract technical papers?
Good point.
Logic. Rationality. Intelligence. Somewhere in all these attempts to harness them for our shared humanity, they’d been warped and twisted to destroy it.
Oh, the warping and twisting started long before Ziz. (The Sequences are cult shit.)
That is interesting to think about. (Something feels almost defiant about imagining a future that has history books and PhD theses.) My own feeling is that Yudkowsky brought something much more overtly and directly culty. Kurzweil's vibe in The Age of Spiritual Machines and such was, as I recall, "This is what the scientists say, and this is why that implies the Singularity." By contrast, Yudkowsky was saying, "The scientists are insufficiently Rational to accept the truth, so listen to me instead. Academia bad, blog posts good." He brought a more toxic variation, something that emotionally resonated with burnout-trending Gifted Kids in a way that Kurzweil's silly little graphs did not. There was no Rationality-as-self-help angle in Kurzweil, no mass of text whose sheer bulk helped to establish an elect group of the saved.