It's the Guardian, but it's still a good read. All of Sneerclub's favorite people were involved.

Last weekend, Lighthaven was the venue for the Manifest 2024 conference, which, according to the website, is “hosted by Manifold and Manifund”. Manifold is a startup that runs Manifund, a prediction market – a forecasting method that was the ostensible topic of the conference.

Prediction markets are a long-held enthusiasm in the EA and rationalism subcultures, and billed guests included personalities like Scott Siskind, AKA Scott Alexander, founder of Slate Star Codex; misogynistic George Mason University economist Robin Hanson; and Eliezer Yudkowsky, founder of the Machine Intelligence Research Institute (Miri).

Billed speakers from the broader tech world included the Substack co-founder Chris Best and Ben Mann, co-founder of AI startup Anthropic. Alongside these guests, however, were advertised a range of more extreme figures.

One, Jonathan Anomaly, published a paper in 2018 entitled Defending Eugenics, which called for a “non-coercive” or “liberal eugenics” to “increase the prevalence of traits that promote individual and social welfare”. The publication triggered an open letter of protest by Australian academics to the journal that published the paper, and protests at the University of Pennsylvania when he commenced working there in 2019. (Anomaly now works at a private institution in Quito, Ecuador, and claims on his website that US universities have been “ideologically captured”.)

Another, Razib Khan, saw his contract as a New York Times opinion writer abruptly withdrawn just one day after his appointment had been announced, following a Gawker report that highlighted his contributions to outlets including the paleoconservative Taki’s Magazine and anti-immigrant website VDare.

The Michigan State University professor Stephen Hsu, another billed guest, resigned as vice-president of research there in 2020 after protests by the MSU Graduate Employees Union and the MSU student association accusing Hsu of promoting scientific racism.

Brian Chau, executive director of the “effective accelerationist” non-profit Alliance for the Future (AFF), was another billed guest. A report last month catalogued Chau’s long history of racist and sexist online commentary, including false claims about George Floyd, and the claim that the US is a “Black supremacist” country. “Effective accelerationists” argue that human problems are best solved by unrestricted technological development.

Another advertised guest, Michael Lai, is emblematic of tech’s new willingness to intervene in Bay Area politics. Lai, an entrepreneur, was one of a slate of “Democrats for Change” candidates who seized control of the powerful Democratic County Central Committee from progressives, who had previously dominated the body that confers endorsements on candidates for local office.

[-] V0ldek@awful.systems 28 points 5 months ago

Sam Bankman-Fried funded a group with racist ties

Ye, I know.

Not that one.

Oh.

Not that one either.

Jesus christ, how many of them are there??

[-] gerikson@awful.systems 13 points 5 months ago

All of them, Jim. All of them.

[-] dgerard@awful.systems 25 points 5 months ago* (last edited 5 months ago)

this is Lightcone, hosts of the totally not race science convention, falling afoul of the FTX bankruptcy

I’m not quoted in the story, but I did supply a pile of background for it. Authors are Jason Wilson and Ali Winston, who spend a lot of time chasing neo-Nazis for the Guardian US.

original URL: https://www.theguardian.com/technology/article/2024/jun/16/sam-bankman-fried-ftx-eugenics-scientific-racism

[-] sailor_sega_saturn@awful.systems 19 points 5 months ago

We offer cozy nooks with firepits, discussion rooms with endless whiteboards, and up to 44 bedrooms (with up to 80 beds).

Not a cult.

Lighthaven is a space dedicated to hosting events and programs that help people think better and to improve humanity's long-term trajectory.

Definitely not a cult.

Humanity's future could be vast, spanning billions of flourishing galaxies, reaching far into our future light cone [1]. However, it seems humanity might never achieve this; we might not even survive the century. To increase our odds, we build services and infrastructure for people who are helping humanity navigate this crucial period. [1]: Or even more than the light cone, depending on how the acausal trade stuff works out.

Have we mentioned how very much not a cult we are?

[-] dgerard@awful.systems 13 points 5 months ago* (last edited 5 months ago)

The services and infrastructure: hosting a web forum

edit: sorry, three web forums

[-] sue_me_please@awful.systems 5 points 5 months ago

Thank you for doing god's work

[-] mawhrin@awful.systems 4 points 5 months ago* (last edited 5 months ago)

that was noticed by the gobshites and they're not happy about it, i think the tracingwoodgrains person really dislikes you:

I respect that and agree that those comments cross a line that should not be crossed. I'm sympathetic to the value of red lines and taboos, and I regularly put active effort into defending the sentiment that racism is bad and should be condemned (though I am extremely cautious about tabooing people as a whole based on specific bad sentiments).

It's more complicated for me here because as mentioned above, I find Hanania's commentary on other topics unusually valuable and think I have had valuable, worthwhile interactions with him such that I am glad for opportunities to do so.

More than that, I am conscious that many who most eagerly pursue the taboo, including the writers of the Guardian article and people like David Gerard who provided background for it openly despise you, me, and others in these spheres, and given taboo-crafting power would craft a set of norms emphatically disagreeable to me. I think parts of the EA community have themselves shown some susceptibility to similar impulses, throwing people like Nick Bostrom under the bus to do so. That post in particular actively made me more wary of EA spaces and left me wondering who else would be skewered.

The individual who wrote that post no longer works at CEA but openly demands that EA cut ties with the entire rationalist community. I like you and broadly trust your own instincts here, even where we might disagree about where to draw specific lines, but I am extremely wary of yielding norm-setting power to people who treat my approach (engaging seriously with anyone) as worthy of suspicion and condemnation, and I think when they succeed in setting the frame, it works against a lot of the rationalist and rationalist-adjacent community norms I value.

(i find it symptomatic, but not at all surprising that the person who criticised bostrom is not with the movement anymore, but scientific racists and hbd-curious fuckers like tracing… are.)

[-] bcdavid@hachyderm.io 6 points 5 months ago

@mawhrin @dgerard It never ceases to amaze me how anyone can read the word vomit these people fling into the world and think it's good writing.

[-] mawhrin@awful.systems 5 points 5 months ago

it's the type of very dense cult jargon that you stop noticing only when you're ears-deep into the cult.

[-] dgerard@awful.systems 4 points 5 months ago* (last edited 5 months ago)

I can't work out a search to tell me for sure, but I do believe that's the first link to nu-sneerclub from anywhere on the three rationalist fora

given taboo-crafting power would craft a set of norms emphatically disagreeable to me. I think parts of the EA community have themselves shown some susceptibility to similar impulses, throwing people like Nick Bostrom under the bus to do so.

i'm sure there's a reading of this string of TW conspicuously avoiding saying the specific thing he's talking about that isn't "TW considers the racism a load-bearing feature", and he'll clarify this any time now

[-] gerikson@awful.systems 5 points 5 months ago

I can’t work out a search to tell me for sure, but I do believe that’s the first link to nu-sneerclub from anywhere on the three rationalist fora

Senpais have noticed us!

[-] Evinceo@awful.systems 3 points 5 months ago

Ain't lightcone the ones who funded the effective charity that was a husband, a wife, two employees and a brother in law who fucked an employee, angering the wife? I seem to remember her writing a long tirade about how hot tub meetings and travel photos proved that working conditions at the charity were very good, and there's nothing inappropriate about any of the above.

[-] dgerard@awful.systems 6 points 5 months ago
[-] Evinceo@awful.systems 3 points 5 months ago

Ah, I think I was confused because Ben Pace was investigating nonlinear under the auspices of Lightcone.

[-] Eiim@lemmy.blahaj.zone 22 points 5 months ago

I mean yeah, not exactly new news. Although I have to make a correction:

Manifold is a startup that runs Manifund, a prediction market – a forecasting method that was the ostensible topic of the conference.

Manifold is the name of the prediction market. Manifold the company also runs Manifund, which distributes money to various EA efforts.

Also, "Manifest has no specific views on eugenics or race & IQ" does not give me confidence in Manifest's views on eugenics or race & IQ.

[-] blakestacey@awful.systems 23 points 5 months ago

My "I have no specific views on eugenics" T-shirt has prompted many questions already answered by my T-shirt

[-] gnomicutterance@awful.systems 13 points 5 months ago

Imagine saying “we have no specific views on eugenics”! You should, buddy. You should.

[-] Architeuthis@awful.systems 14 points 5 months ago

"Manifest is open minded about eugenics and securing the existence of our people and a future for high IQ children."

[-] Architeuthis@awful.systems 21 points 5 months ago* (last edited 5 months ago)

Great quote from the article on why prediction markets and scientific racism currently appear to be at one degree of separation:

Daniel HoSang, a professor of American studies at Yale University and a part of the Anti-Eugenics Collective at Yale, said: “The ties between a sector of Silicon Valley investors, effective altruism and a kind of neo-eugenics are subtle but unmistakable. They converge around a belief that nearly everything in society can be reduced to markets and all people can be regarded as bundles of human capital.

[-] sailor_sega_saturn@awful.systems 15 points 5 months ago* (last edited 5 months ago)

One of the videos ManiFest proudly highlights from last year: https://www.youtube.com/watch?v=uvPDbOHSS4M

Genetic Enhancement: Prediction Markets for Future People by Jonathan Anomaly (Professor & Eugenicist)

Urrk, what sort of person proudly calls themself a Eugenicist? What sort of festival highlights this as good behavior??

[-] sailor_sega_saturn@awful.systems 19 points 5 months ago

This guy... ugh I wanted to make it through today without learning about a new weirdo but I ended up watching the above presentation. Some terrible quotes for y'all to sneer at:

[increasing oxytocin] probably makes you more ethnocentric. Is that a moral enhancement? In some ways yes in other ways no.

I'm not saying I'm racist, but I'm like not not saying that either.

The few people [remaining] that are prone to do [theft] can learn that there are reasons not to do that in high-IQ-- um sorry-- in developed countries.

Oopsie just a little slip of the tongue teehee.

Solutions: AI-guided suggestions to parents about the traits they should select.

Oh no

I think what's going to happen is that people are going to cluster more together; potentially in smaller political societies with enough land to be defensible, along traits that they care about.

Ah yes, he did mention he was a libertarian at the start didn't he?


Overall: dude spent way too much time talking about IQ, expressing weird racist ideas without ever saying the word "race", complaining about "wokeism", and day-dreaming about genetic breeding in libertarian citadels.

[-] gnomicutterance@awful.systems 11 points 5 months ago

Solutions: AI-guided suggestions to parents about the traits they should select

There’s a joke in here about and that’s how the human race all became polydactylic with extra elbows, but it’s too early in the morning for me to figure out how to make it not be at the expense of people with limb and facial differences.

[-] 200fifty@awful.systems 11 points 5 months ago

It is always kind of bewildering to me though. Like, has no one ever explained to these people the health problems that highly-bred dogs tend to have? Have they never heard of 'hybrid vigor' or issues with smaller gene pools making populations more susceptible to disease? Were they just asleep during biology 101? I don't get how people who think they're so smart can have failed to consider even the most basic issues with planning to turn humanity into Gros Michel bananas.

[-] Soyweiser@awful.systems 10 points 5 months ago

That is easy: first, pass biology in high school, then don't think about it for 10 years until somebody on a forum somewhere is talking a lot (no, more than you think, even more than that, yes that much) about IQ and genes. This brings back some ideas from science fiction they consumed, mix in contrarian debate bros to taste, and boom, suddenly they go "well, perhaps eugenics isn't that bad"

[-] rook@awful.systems 10 points 5 months ago

Obviously, your genes are terrible, low quality things that would obviously ruin any group which had them. My genes are superior quality, and if everyone shared them they’d all be irresistibly sexy and overpoweringly rational, just like me.

[-] Soyweiser@awful.systems 7 points 5 months ago

My genes are terrible on purpose. Keeps me on my toes. And it just makes me grind harder. #grindset #mindset #dailyhustle #hustle #hittersnotquitters #risengrind #hustle #lockedin #davidgoggins #motivation #owgodwhydidshedivorceme #discipline #alpha #sigma #tothemoon

[-] gerikson@awful.systems 7 points 5 months ago

Not to mention the descendants of Queen Victoria being afflicted by hemophilia, or the Spanish Habsburgs prioritizing keeping property within the realm instead of gifting it in dowries and leading to terrible inbreeding.

[-] slopjockey@awful.systems 7 points 5 months ago* (last edited 5 months ago)

This guy's vision of the future is Charles Murray's Coming Apart, except with different groups of Habsburgs enclosed in border walls

[-] sailor_sega_saturn@awful.systems 10 points 5 months ago

He also claimed to not know why he was advertised on the manifest.is website as a conference-goer

If I didn't overlook any, there were two people removed from the "special guest" list between Tuesday and today: this guy and Aaron Silverbook (the anticavity bacteria guy).

So what's the over-under between ManiFest being overly eager in listing special guests and dudes wanting to distance themselves after the fact? I know LessOnline just had a list of people they wished would come (and which ones were actually confirmed in the small print)

[-] blakestacey@awful.systems 7 points 5 months ago
[-] saucerwizard@awful.systems 5 points 5 months ago

Hanson is already weeping about this.

this post was submitted on 16 Jun 2024
80 points (100.0% liked)

SneerClub
