[-] fullsquare@awful.systems 6 points 23 hours ago

(I will appreciate if you NEVER TELL ANYONE I SAID THIS, not even in confidence. And by "appreciate", I mean that if you ever do, I'll probably either leave the Internet forever or seek some sort of horrible revenge.)

Taken literally, this seems like kind of a fucked up thing to say to a friend. Or a stranger. Anyone really. Why would you say this? Why would you write this in an email and then send it, on purpose, under any circumstance?

scott clearly thought that it was important to get that message out. idk what precisely happened there, but i'll risk a guess that perhaps scott thought that he found a partner in crime, so to speak, and secrecy would help them both. an adversary would just use the info as is. maybe the biggest thing scott could get in terms of blackmail was a flimsy "okay, but you are into this thing too", which won't be effective in all cases, or maybe he didn't even have that

if one person came out and spilled the beans, it'd suggest that there might be more people who didn't

[-] CinnasVerses@awful.systems 1 points 2 hours ago* (last edited 2 hours ago)

I think Hallquist had a short-lived blog with criticisms of LessWrong and EA between his time on FreethoughtBlogs/Patheos and his run for office and Medium blog. Possibly https://topherhallquist.wordpress.com/2015/08/17/reply-to-scott-alexander/ In the original Twitter post, Hallquist described Alexander as "a vague internet acquaintance at the time (when he sent the emails)", and it sounds like after 2014 Hallquist explored LW and EA and decided they were messed up. (Hallquist's post also suggests that if you want to sound like a bold contrarian, you should see what a median expert at a university thinks about a topic rather than reading blogs.)

I like to remind myself that I see what happens online, but the offers of money and the sexual propositions probably mostly happen in person or between people who have met in person. So I don't know as much about the LessWrong or EA communities as I think.

[-] corbin@awful.systems 4 points 4 hours ago

[omitted a paragraph psychoanalyzing Scott]

I don't think that he was trying to make a threat. I think that he was trying to explain the difficulties of being a cryptofascist! Scott's entire grey-tribe persona collapses if he ever draws a solid conclusion; he would lose his audience if he shifted from cryptofascism to outright ethnonationalism because there are about twice as many moderates as fascists. Scott's grift only continues if he is skeptical and nuanced about HBD; being an open believer would turn off folks who are willing to read words but not to be hateful. His "appreciat[ion]" is wholly for his brand and revenue streams.

This also contextualizes the "revenge". If another content creator publishes these emails as part of their content then Scott has to decide how to fight the allegations. If the content is well-sourced mass-media journalism then Scott "leave[s] the Internet" by deleting and renaming his blog. If the content is another alt-right crab in the bucket then Scott "seek[s] some sort of horrible revenge" by attacking the rest of the alt-right as illiterate, lacking nuance, and unable to cite studies. No wonder he doesn't talk about us or to us; we're not part of his media strategy, so he doesn't know what to do about us.

In this sense, we're moderates too; none of us are hunting down Scott IRL. But that moderation is necessary in order to have the discussion in the first place.

[-] Architeuthis@awful.systems 4 points 14 hours ago* (last edited 13 hours ago)

if one person came out and spilled the beans, it’d suggest that there might be more people who didn’t

I mean, after his full-throated defense of Lynn's IQ map (featuring disgraced nazi college dropout Cremieux/TP0 as a subject matter expert), what other beans might be interesting enough to spill? Did he lie about becoming a kidney donor?

I think the emails are important because a) they make a case that, for all his performative high-mindedness and deference to science and whinging about polygenic selection, he came to his current views through the same white supremacist/great replacement milieu as every other pretentious gutter racist out there, and b) he is so consistently disingenuous that the previous statement might not even matter much... he might honestly believe that priming impressionable well-off techies towards blood-and-soil fascism precursors was worth it if we end up allowing unchecked human genetic experimentation to come up with 260-IQ babies that might have a fighting chance against shAItan.

I guess it could come out that despite his habit of including conflict of interest disclosures, his public views may be way more for sale than is generally perceived.

[-] dgerard@awful.systems 2 points 12 hours ago

and it was 2014

[-] froztbyte@awful.systems 2 points 16 hours ago

if one person came out and spilled the beans, it’d suggest that there might be more people who didn’t

keep in mind these mails were disclosed years ago - at least a few of them will have learned a lesson and gotten more careful

(I'm constantly glad that so many of them do not)

[-] Evinceo@awful.systems 3 points 18 hours ago

I'm reasonably certain worse emails exist but they're with sympathetic individuals.

[-] Evinceo@awful.systems 6 points 2 days ago

Even just transcribing these and hosting them is extremely helpful, thanks.

[-] Architeuthis@awful.systems 9 points 2 days ago* (last edited 2 days ago)

This was an excellent read if you're aware of the emails but never bothered to read his citations or to dig into what the blather about object-level and meta-level problems was specifically about, which is presumably most people.

So, a deeper examination of the email paints 2014 Siskind as a pretty run-of-the-mill race realist who's really into "black genes are dumber, you guys" studies, and who thinks that higher education institutions not taking them seriously means they are deeply broken and untrustworthy, especially with anything to do with pushing back against racism and sexism. Oh, and he is also very worried that immigration may destroy the West, or at least he gently urges you to get up to speed with articles coincidentally pushing that angle, and draw your own conclusions based on pure reason.

Also it seems that in private he takes seriously stuff he has already debunked in public, which makes it basically impossible to ever take anything he writes in good faith.

[-] excerpta@zirk.us 7 points 2 days ago

@Architeuthis @dgerard "...impossible to ever take anything he writes in good faith."

See also this unguarded moment from Tumblr. All the alpha is in bad faith social engineering!
https://www.reddit.com/r/SneerClub/comments/9lj3g7

[-] Architeuthis@awful.systems 10 points 2 days ago* (last edited 2 days ago)

I wonder if this is just a really clumsy attempt to reinvent stretching the Overton window from first principles, or if he really is so terminally rationalist that he thinks a political ideology is a sliding scale of fungible points and being 23.17% ancap can be a meaningful statement.

That the exchange of ideas between friends is supposed to work a bit like the principle of communicating vessels is a pretty weird assumption, too. Also, if he thinks it's ok to admit that he straight up tries to manipulate friends in this way, imagine how he approaches non-friends.

Between this and him casually admitting that he keeps "culture war" topics alive on his Substack because they get a ton of clicks, it's a safe bet that he can't be thinking too highly of his readership, although I suspect there is an esoteric/exoteric teachings divide that is mostly non-obvious from the online perspective.

[-] AllNewTypeFace@leminal.space 8 points 2 days ago* (last edited 2 days ago)

So, by that token, if hypothetically you think that the Nazis got a few things right (not the war, racism or genocide, of course, or even the degenerate art, but maybe, say, the smoking bans and well-paved roads and perhaps the odd Wagnerian opera), the way to convince people is to start ranting about blood and soil and the need to exterminate the Untermenschen, and wait for the nice normie liberals to politely meet you part of the way?

[-] CinnasVerses@awful.systems 6 points 2 days ago

In his early blog posts, Scott Alexander talked about how he was not leaping through higher education in a single bound (he went overseas for medical school and failed to get a medical residency on his first try, ending up in a small Midwestern city). So I wonder why he is sure that in a world with fewer university degrees, he would have gotten as far as he did (medical schools in the USA used to limit admissions from people of his ethnicity).

Likewise with immigration restrictions: he knows that they often blocked Jews, many Europeans, and East Asians, not just brown people, right?

[-] Architeuthis@awful.systems 8 points 2 days ago* (last edited 2 days ago)

In his early blog posts, Scott Alexander talked about how he was not leaping through higher education in a single bound

He starts his recent article on AI psychosis by mixing up psychosis with schizophrenia (he calls psychosis a biological disease), so that tracks.

Other than that, I think it's ok in principle to be ideologically opposed to something even if you and yours happened to benefit from it. Of course, it immediately becomes iffy if it's a mechanism for social mobility that you don't plan on replacing, since in that case you are basically advocating for pulling up the ladder behind you.

[-] fullsquare@awful.systems 6 points 2 days ago

He starts his recent article on AI psychosis by mixing up psychosis with schizophrenia (he calls psychosis a biological disease), so that tracks.

wait, this man is a psychiatrist? or is that another scott

[-] CinnasVerses@awful.systems 4 points 1 day ago* (last edited 1 day ago)

Yes, Scott Alexander is an unusual rationalist blogger who had a credentialed professional career as a psychiatrist. After Substack became his patron, he opened his own medical practice, but the website has said "not accepting new patients at this time" since 2022. So he seems to live off gifts from fellow travelers with a side hustle in psychiatry.

[-] fullsquare@awful.systems 7 points 1 day ago

i'll risk a guess that running a ritalin-dispenser-as-a-service type business catering to overly confident rationalists might get him a pretty penny

[-] bigfondue@lemmy.world 4 points 1 day ago

Reading his adderall article, I couldn't help but think that this guy is handing scripts to everyone in the Bay Area

[-] CinnasVerses@awful.systems 5 points 1 day ago

That is very possible although I would guess that was earlier in his career given that he does not advertise as treating ADHD or similar. He has two small children, a writing job, and side projects like writing end-of-the-world stories for AI 2027. His practice has a name drawn from Lord of the Rings like other things in the Thielsphere.

[-] dgerard@awful.systems 5 points 1 day ago

His practice has a name drawn from Lord of the Rings like other things in the Thielsphere.

fuckin lol, I had not spotted this, what a tell

[-] swlabr@awful.systems 3 points 1 day ago

Fatty Lumpkin's Headshrinking and Sundry?

[-] fullsquare@awful.systems 3 points 1 day ago

he had a blogpost about how amphetamine risks are overstated and it's actually fine for more people than are usually prescribed it https://slatestarcodex.com/2017/12/28/adderall-risks-much-more-than-you-wanted-to-know/

[-] dgerard@awful.systems 4 points 1 day ago* (last edited 1 day ago)

this post was the starting pistol for rationalists taking as much adderall as they could get down their necks; it's what made it a rationalist commonplace that adderall makes anyone a super-effective financial genius. Scott is not telling you that adderall will make you a financial genius and super effective, you understand. Except Kelsey Piper, about whom he literally says exactly that, by name.

it's what got TPOT losers "microdosing" street meth. Of course the same TPOT types confused meth with MDMA.

Somehow, Scott still has a license.

[-] fullsquare@awful.systems 4 points 1 day ago

ye, who are we to doubt superpredictors like them

[-] CinnasVerses@awful.systems 2 points 1 day ago* (last edited 1 day ago)

Making general statements about the risks and benefits of medication is different from prescribing them. The George K. Lerner, MD who was FTX's resident pill-pusher seems to be based in San Francisco and wants potential patients to know, inter alia, that "Dr. Lerner specializes in the treatment of Attention Deficit Disorder (ADD/ADHD) in adults. He has extensive experience in treating adults who have been successful in their professional endeavors but have found attention deficit symptoms to be an impediment to achieving their full potential." (nudge nudge)

His website does not mention a connection with the hospital in Michigan, which is the only place where I know Alexander worked. I would like to know more about possible connections beyond their mutual ties to the FTX gang. I have not done shoe-leather reporting in SoCal, and almost all of the things we know about Alexander are things he posted voluntarily under his main handle.

Lerner's site shows what Alexander's site might look like if he were focused on psychiatry rather than writing and peddling racist lies.

[-] fullsquare@awful.systems 4 points 1 day ago

scott also has an explainer article on stimulants for ADHD where he says:

[...] This matches my experience. I’ve worked with a few hundred Adderall patients

so maybe he doesn't have to advertise a lot, or at all https://lorienpsych.com/2020/10/30/adderall/

[-] dgerard@awful.systems 4 points 2 days ago

but look, i liked this article,

[-] swlabr@awful.systems 19 points 4 days ago* (last edited 4 days ago)

I’m midway through this and this part stood out to me. This is part of the email that was written by S.Al, edited for length:

Compare RationalWiki and the neoreactionaries. […] Almost nothing they say is outrageously wrong, but almost nothing they say is especially educational to someone who is smart enough to have already figured out that homeopathy doesn't work […] they fit exactly into my existing worldview without teaching me anything new

The Neoreactionaries provide a vast stream of garbage with occasional nuggets of absolute gold in them. Despite considering myself pretty smart and clueful, I constantly learn new and important things […] from the Reactionaries. Anything that gives you a constant stream of very important new insights is something you grab as tight as you can and never let go of.

The garbage doesn't matter because I can tune it out.

“Rational Wiki presents an understanding of reality that I mostly agree with and is thus boring. I want some spicy takes so I’m going to go suck on the firehose of reactionary diarrhoea, but it’s ok because my throat game is too good to let any of it through” type shit

[-] Evinceo@awful.systems 6 points 18 hours ago

“Rational Wiki presents an understanding of reality that I mostly agree with and is thus boring. I want some spicy takes so I’m going to go suck on the firehose of reactionary diarrhoea, but it’s ok because my throat game is too good to let any of it through” type shit

All time banger
