I'm pretty sure that nobody has ever lain on their deathbed wishing they'd spent more time going into online communities and disagreeing with everything being said.
There was just recently a dust-up wherein authors quit a romance con because it was to feature someone who published a Harry Potter fic with the serial numbers filed off, the argument being that supporting anything that keeps the brand going puts money in Rowling's pocket and thus actively makes trans people's lives worse. People care about this kind of thing; at least, some of 'em do. There are reactions. Some of those are talk about "reclaiming the fandom", while others regard that as untenable self-justification... But any way you slice it, the subject is very clearly coming up.
In the year since the Neil Gaiman unpleasantness dropped, I've lost count of all the threads where people have said that they can't enjoy his work again, that they are painfully re-evaluating their relationship with Sandman or Coraline or American Gods. They can't help but engage with the subject. And, hey, I get it! I generally liked his stuff and saw him live at a few events over the years, where he was an enjoyable public speaker. I don't have a Death tattoo that now needs covering up, but I can still register a loss. Discovering Sandman while visiting a friend on vacation when they were checking it out of the public library... that was an uncomplicatedly happy memory!
This kind of thing grips a person and compels a response. Even if that's only a self-justifying rationalization of the status quo! But Yudkowsky (to my knowledge) has said nothing, none of the lesswrongs commenting on that interview said anything... I expected something, like a "Rational!Harry is the only canon now", or a "Methods of Rationality is the greatest fic to be based on the works of Hatsune Miku". Anything, you know? But I haven't even seen the step that elsewhere would be the bare minimum.
So, the chatter over on Reddit sneerclub is that the author is a transphobic sex pest:
https://www.reddit.com/r/SneerClub/comments/1maslci/sam_krissagainst_truth_also_against_rationalism/
I only know about the latter
You might think that this review of Yud's glowfic is an occasion for a "read a second book" response:
Yudkowsky is good at writing intelligent characters in a specific way that I haven't seen anyone else do as well.
But actually, the word intelligent is being used here in a specialized sense to mean "insufferable".
Take a stereotypical fantasy novel, a textbook on mathematical logic, and Fifty Shades of Grey.
Ah, the book that isn't actually about kink, but rather an abusive relationship disguised as kink — which would be a great premise for an erotic thriller, except that the author wasn't sufficiently self-aware to know that's what she was writing.
Abstract: This paper presents some of the initial empirical findings from a larger forthcoming study about Effective Altruism (EA). The purpose of presenting these findings disarticulated from the main study is to address a common misunderstanding in the public and academic consciousness about EA, recently pushed to the fore with the publication of EA movement co-founder Will MacAskill’s latest book, What We Owe the Future (WWOTF). Most people in the general public, media, and academia believe EA focuses on reducing global poverty through effective giving, and are struggling to understand EA’s seemingly sudden embrace of ‘longtermism’, futurism, artificial intelligence (AI), biotechnology, and ‘x-risk’ reduction. However, this agenda has been present in EA since its inception, where it was hidden in plain sight. From the very beginning, EA discourse operated on two levels, one for the general public and new recruits (focused on global poverty) and one for the core EA community (focused on the transhumanist agenda articulated by Nick Bostrom, Eliezer Yudkowsky, and others, centered on AI-safety/x-risk, now lumped under the banner of ‘longtermism’). The article’s aim is narrowly focused on presenting rich qualitative data to make legible the distinction between public-facing EA and core EA.
They take a theory that is supposed to be about updating one's beliefs in the face of new evidence, and they use it as an excuse to never change what they think.
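For reference, the actual mechanics of the theory in question are a one-line application of Bayes' rule, and they really do force your belief to move when the evidence disagrees with you. A minimal sketch, with made-up numbers purely for illustration:

```python
# Bayes' rule: P(H|E) = P(E|H) P(H) / P(E).
# Hypothetical setup: you start out 90% confident in H, then you see
# evidence E that is unlikely if H is true (P(E|H) = 0.1) but likely
# if it isn't (P(E|~H) = 0.8). Honest updating drops the belief hard.

def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Return P(H|E) given P(H), P(E|H), and P(E|~H)."""
    numerator = prior * likelihood_h
    evidence = numerator + (1 - prior) * likelihood_not_h
    return numerator / evidence

posterior = bayes_update(0.9, 0.1, 0.8)
print(posterior)  # 0.09 / (0.09 + 0.08), a bit over one half
```

The point of the snark above is that the rule itself is not the problem; declining to ever feed it evidence that cuts against you is.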
This is why my crimes.txt file just contains the recipes that I really should not try making, like Jake Morgendorffer's chile con cheesepuffs with fresh mint, and my actual crime plans are in... oh ho, I see what you did there, you clever jack-a-napes!
Yud writing about math is the worst. You get your autodidact problems, because he's never been tested on actually doing calculations. He's always graded his own homework, as it were; all his experience is in rhetorically weaseling out of his mistakes, instead of learning from the red pen. Then you get all the problems that come from "splurging a first draft" out upon his fandom. They miss the mistakes among his meanderings. Quite likely, they lack the experience to detect them, but the beigeness of his prose helps to obscure them anyway. The fan will interpret any confusion as being their own fault, not Yud's, or just dismiss any lack of clarity because the feeling of being special feels so good. So, even if Yud were inclined to learn from meaningful criticism, he's not getting any.
Struggling through Yud's attempt at explaining a basic calculation in quantum mechanics is like reading algebra problems from before algebraic notation was invented.
When the cube with the cose beside it
Equates itself to some other whole number,
Find two others, of which it is the difference.
Hereafter you will consider this customarily
That their product always will be equal
To the third of the cube of the cose net.
Its general remainder then
Of their cube sides, well subtracted,
Will be the value of your principal unknown.
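For the curious: on one standard reading, the verse encodes the del Ferro–Tartaglia rule for the depressed cubic. In modern notation (my rendering, not Tartaglia's; "cose" is the unknown):

```latex
% "The cube with the cose beside it equates itself to some other whole
% number": the depressed cubic. "Find two others, of which it is the
% difference" and whose "product always will be equal to the third of
% the cube of the cose" (read as the cube of one third of p):
\[
x^3 + px = q, \qquad u - v = q, \qquad uv = \left(\frac{p}{3}\right)^3
\]
% "Of their cube sides, well subtracted, will be the value of your
% principal unknown":
\[
x = \sqrt[3]{u} - \sqrt[3]{v}
\]
```

Expanding $(\sqrt[3]{u} - \sqrt[3]{v})^3 = (u - v) - 3\sqrt[3]{uv}\,(\sqrt[3]{u} - \sqrt[3]{v}) = q - px$ confirms that this $x$ satisfies $x^3 + px = q$, which is why the poem works as an algorithm.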
lakes and seas of people
clearly the AI is going to hug us all and then we turn into TANG
Enclosed please find one (1) Internet, awarded in recognition of the best/worst mental image I've had all week
I've more than once been tempted to write Everything the Sequences Get Wrong about Quantum Mechanics, but the challenge is doing so in a way that doesn't just amount to teaching a whole course in quantum mechanics. The short-short version is that it's lazy, superficial takes on top of cult shit — Yud trying to convince the reader that the physics profession is broken and his way is superior.
The sewer-deep Islamophobia from "luminaries" like Richard Dawkins didn't help, either. One thing that is perhaps easy to miss now, looking back at "New Atheism", is how much it inhabited a distinctly post-9/11 cultural space.
And regarding the point above that the analysis needs "Explicit acknowledgement of the role of capitalism and colonialist tendencies in corrupting subcultures", the term New Atheism itself was a branding gimmick imposed from outside (codified by and perhaps first used in Wired magazine, of all places, AFAIK). The people who were already "in" it looked around and asked, "OK, what exactly is new about it?". As far as actual arguments went, there was little if anything that Paul Dirac had not already said in 1927.
Shermer is a "sociopath" in the GMS taxonomy. But he rose to prominence in the '90s, co-founding the Skeptics Society in 1991 and publishing Why People Believe Weird Things in 1997. He was considered the old guard by those who came to skepticism/atheism via the '00s blogosphere, who were some combination of "geeks" and "mops". So, there's not really the linear order to it that the neat and tidy GMS story calls for.