CinnasVerses@awful.systems 8 points 3 weeks ago

In November 2024, Habryka also said "we purchased a $16.5M hotel property, renovated it for approximately $6M and opened it up ... under the name Lighthaven." So the disconnect between what Lightcone says to the taxman (we are small bois, CFAR owns the real estate) and what it says to believers (we own the real estate) was already there.

CinnasVerses@awful.systems 9 points 3 weeks ago

Plus $12,018 in "other reportable compensation." You could support a team of ten graduate students with that money, and they would actually make things other than fanfic and publish research!

31
submitted 3 weeks ago* (last edited 2 weeks ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

It's almost the end of the year, so most US nonprofits that want to remain nonprofits have filed Form 990 for 2024, including some run by our dear friends. This is a mandatory financial report.

  • Lightcone Infrastructure is here. They operate LessWrong and the Lighthaven campus in Berkeley but list no physical assets; someone on Reddit says that they let fellow travelers like Scott Alexander use their old rented office for free. "We are a registered 501(c)3 and are IMO the best bet you have for converting money into good futures for humanity." They also published a book and website with common-sense, data-based advice for Democratic Party leaders called Deciding to Win, which I am sure fills a gap in the literature. Edit: their November 2024 call for donations, which describes how they spent $16.5m on real estate and $6m on renovations and then saw donations collapse, is here; an analysis is here
  • CFAR is here. They seem to own the campus in Berkeley, but it is encumbered with a mortgage ("Land, buildings, and equipment ... less depreciation; $22,026,042 ... Secured mortgages and notes payable, $20,848,988"). I don't know what else they do, since they stopped teaching rationality workshops around 2016 and pivoted to worrying about building Colossus. They have nine employees with salaries from $112k to $340k, plus a president paid $23k/year.
  • MIRI is here. They pay Yud ($599,970 in 2024!) and, after failing to publish much research on how to build Friend Computer, pivoted to arguing that Friend Computer might not be our friend. Edit: they had about $16 million in mostly financial assets (cash, investments, etc.) at year end but spent $6.5m against $1.5m of revenue in 2024. They received $25 million in 2021 and have been consuming those funds ever since rather than investing them and living off the interest (see the back-of-envelope sketch after this list).
  • BEMC Foundation is here. This husband-and-wife organization gives about $2 million/year each to Vox Future Perfect and GiveWell from an initial $38m in capital (so they can keep giving for decades without adding more capital). Edit: the size of the donations to Future Perfect and GiveWell swings from year to year, so neither can count on the money, and they gave out $6.4m in 2024, which is not sustainable (again, see the sketch after this list).
  • The Clear Fund (GiveWell) is here. They have the biggest wad of cash and the highest cash flow.
  • Edit: Open Philanthropy (now Coefficient Giving) is here (they have two sister organizations). David Gerard says they are mainly a way for Dustin Moskovitz, the co-founder of Facebook, to organize donations, like the Gates, Carnegie, and Rockefeller foundations. They used to fund Lightcone.
  • Edit: Animal Charity Evaluators is here. They have funded Vox Future Perfect (in 2020-2021) and the longtermist kind of animal welfare ("if humans eating pigs is bad, isn't whales eating krill worse?").
  • Edit: Survival and Flourishing Fund does not seem to be a charity. A Lightcone staffer says that SFF funds Lightcone, but SFF say that they just connect applicants to donors and evaluate grant applications. So who exactly is providing the money? Sometimes it's Jaan Tallinn of Skype and Kazaa.
  • Centre for Effective Altruism is mostly British but has had a US wing since March 2025: https://projects.propublica.org/nonprofits/organizations/333737390
  • Edit: Giving What We Can seems like a mainstream "bednets and deworming pills" type of charity.
  • Edit: Givedirectly Inc is an excellent idea in principle (give money to poor people overseas and let them figure out how best to use it), but their auditor flagged them for material noncompliance and a material weakness in internal controls. The mistakes don't seem sinister (they classified $39 million of donations as conditional rather than unconditional, i.e. with more restrictions than they actually had). GiveDirectly, Giving What We Can, and GiveWell are all much better funded than the core LessWrong organizations.
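
Since a couple of the items above turn on burn-rate arithmetic, here is a back-of-envelope sketch of the runway math. The dollar figures are the ones from the filings listed above; the flat 4% annual return on invested assets is my own assumption, not anything reported on a 990, so treat the outputs as rough orders of magnitude.

```python
# Back-of-envelope runway math for the filings above. Dollar figures
# come from the Form 990s linked in the list; the flat 4% annual return
# is an assumption of mine, not a reported number.

def years_until_empty(assets_m, spend_m, revenue_m, annual_return=0.04):
    """Simulate drawing down an endowment year by year (all figures in $M).

    Each year the remaining assets earn `annual_return`, then the org
    takes in `revenue_m` and spends `spend_m`. Returns the number of
    years before the assets run out, capped at 100.
    """
    years = 0
    while assets_m > 0 and years < 100:
        assets_m = assets_m * (1 + annual_return) + revenue_m - spend_m
        years += 1
    return years

# MIRI: ~$16M in assets, ~$6.5M spent against ~$1.5M revenue in 2024.
print("MIRI runway:", years_until_empty(16, 6.5, 1.5), "years")      # ~4

# BEMC: ~$38M initial capital, no meaningful revenue of its own.
print("BEMC at $4.0M/yr:", years_until_empty(38, 4.0, 0), "years")   # ~13
print("BEMC at $6.4M/yr:", years_until_empty(38, 6.4, 0), "years")   # ~7
```

Under those assumptions the point stands: MIRI is eating its 2021 windfall rather than living off returns, and BEMC's 2024 payout cannot continue indefinitely without new capital.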

Since CFAR seems to own Lighthaven, it's curious that Lightcone head Oliver Habryka threatens to sell it if Lightcone shuts down. One might almost imagine that the boundaries between all these organizations are not as clear as the org charts make them seem. SFGate says that it cost $16.5 million plus renovations:

Who are these owners? The property belongs to a limited liability company called Lightcone Rose Garden, which appears to be a stand-in for the nonprofit Center for Applied Rationality and its project, Lightcone Infrastructure. Both of these organizations list the address, 2740 Telegraph Ave., as their home on public filings. They’ve renovated the inn, named it Lighthaven, and now use it to host events, often related to the organizations’ work in cognitive science, artificial intelligence safety and “longtermism.”

Habryka was boasting about the campus in 2024 and said that Lightcone budgeted $6.25 million for renovating the campus that year. It also seems odd for a nonprofit to spend money renovating a property that belongs to another nonprofit.

On LessWrong, Habryka also mentions "a property we (Lightcone) own right next to Lighthaven, which is worth around $1M" and which they could use as collateral for a loan. Lightcone's 2024 paperwork listed only cash and accounts receivable as assets. So either they are passing around assets like the last plastic cup at a frat party, or they bought this property recently while the dispute with the trustees was ongoing, or Habryka does not know what his organization actually owns.

The California end seems to be burning money, as many movements with apocalyptic messages and inexperienced managers do. Revenue was significantly less than expenses, and CFAR's assets barely exceed its liabilities. CFAR/Lightcone do not have the $4.9 million in liquid assets which the FTX trustees want back, and they claim that their escrow company lost another $1 million of FTX's money.

CinnasVerses@awful.systems 11 points 1 month ago

Chapman's advice seems pretty good for keeping an indie art scene small and for autistic introverts, rather than big and for normies, but not for realizing that LessWrong and EA are cults founded by bad people with bad goals, with an exoteric doctrine out front and an esoteric doctrine once you are committed.

21
submitted 1 month ago* (last edited 1 month ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

People connected to LessWrong and the Bay Area surveillance industry often cite David Chapman's "Geeks, Mops, and Sociopaths in Subculture Evolution" to understand why their subcultures keep getting taken over by jerks. Chapman is a Buddhist mystic who seems rationalist-curious; some people use the term postrationalist.

Have you noticed that Chapman presents the founders of nerdy subcultures as innocent nerds being pushed around by the mean suits? But today we know that the founders of Longtermism and LessWrong all had ulterior motives: Scott Alexander and Nick Bostrom were into race pseudoscience, and Yudkowsky had his kinks (and was also into eugenics and Libertarianism). HPMOR teaches that intelligence is the measure of human worth, and the use of intelligence is to manipulate people. Mollie Gleiberman makes a strong argument that "bednet" effective altruism with short-term measurable goals was always meant as an outer doctrine to prepare people to hear the inner doctrine about how building God and expanding across the Universe would be the most effective altruism of all. And there were all the issues within LessWrong and Effective Altruism around substance use, abuse of underpaid employees, and bosses who felt entitled to hit on subordinates. A '60s rocker might have been cheated by his record label, but that does not get him off the hook for crashing a car while high on nose candy and deep inside a groupie.

I don't know whether Chapman was naive or creating a smokescreen. Had he ever met the thinkers he admired in person?

CinnasVerses@awful.systems 10 points 1 month ago

The Optimist: Sam Altman, OpenAI, and the Race to Invent the Future by Keach Hagey has potential https://archive.is/22O9z

Two members of the Extropian community, internet entrepreneurs Brian and Sabine Atkins—who met on an Extropian mailing list in 1998 and were married soon after—were so taken by this message that in 2000 they bankrolled a think tank for Yudkowsky, the Singularity Institute for Artificial Intelligence. At 21, Yudkowsky moved to Atlanta and began drawing a nonprofit salary of around $20,000 a year to preach his message of benevolent superintelligence. "I thought very smart things would automatically be good," he said. Within eight months, however, he began to realize that he was wrong—way wrong. AI, he decided, could be a catastrophe.

This excerpt on Wired slams down names and dates and social connections without getting distracted by all the things that are wrong with what it describes.

13

The Form 990s for these organizations mention many names I am not familiar with, such as Tyler Emerson. Many people in these spaces have romantic or housing partnerships with each other, and many attend meetups and cons together. A MIRI staffer claims that Peter Thiel funded them from 2005 to 2009, and we now know when Jeffrey Epstein donated. Publishing such a thing is not very nice, since these are living persons frequently accused of questionable behavior which never goes to court (and some may have left the movement), but does a concise list of dates, places, and known connections exist?

Maybe that social graph would be more of a dot. So many of these people date each other and serve on each other's boards and live in the SF Bay Area, Austin TX, the NYC area, or Oxford, England. On the enshittified site people talk about their Twitter and Tumblr connections.

CinnasVerses@awful.systems 10 points 1 month ago

I don't think Torres or Gebru have dismissed the consequences of global heating like Nick Bostrom and Kelsey Piper, or casually called for bombing China and letting something replace humanity like Yud: https://www.realtimetechpocalypse.com/p/eliezer-yudkowskys-long-history-of

CinnasVerses@awful.systems 13 points 2 months ago* (last edited 2 months ago)

If I were the leader of a community which had to expel someone for plying people with narcotics, having sex with them while they were underage, and pushing them into extreme BDSM scenarios, I would simply not post that each of those acts is OK sometimes and that it's not my business to investigate them.

Bloomberg names the person I am thinking of and mentions the first and third accusations.

7
Stephen and Steven (awful.systems)

We often mix up two bloggers named Scott. One of Jeffrey Epstein's victims says that she was abused by a white-haired psychology professor, or Harvard professor, named Stephen. In 2020, Vice observed that two Harvard faculty members with known ties to Epstein fit that description (a Steven and a Stephen); the older of the two taught the younger. The younger denies that he met or had sex with the victim. What kind of workplace has two people who can be reasonably suspected of an act like that?

I am being very careful about talking about this.

6

An opposition between altruism and selfishness seems important to Yud. 23-year-old Yud said "I was pretty much entirely altruistic in terms of raw motivations" and his Pathfinder fic has a whole theology of selfishness. His protagonists have a deep longing to be world-historical figures and be admired by the world. Dreams of controlling and manipulating people to get what you want are woven into his community like mould spores in a condemned building.

Has anyone unpicked this? Is talking about selfishness and altruism common on LessWrong, like pretending to use Bayesian statistics?

CinnasVerses@awful.systems 7 points 4 months ago* (last edited 4 months ago)

unethical

His post on a woman in the EA/LW world who took her own life after saying she had been sexually harassed is at https://archive.is/I85mC and there are discussions on Old! SneerClub here and there. I am not comfortable going into this without training in how to talk about self-harm and without first-hand knowledge, but yeesh.

The Tumblr he cites belongs to Kelsey Piper, a self-identified journalist and meatspace friend who receives donations from people and orgs in the Effective Altruism world and keeps reporting on how EA ideas and people are great.

15
submitted 4 months ago* (last edited 4 months ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

I used to think that psychiatry-blogging was Scott Alexander's most useful/least harmful writing, because it's his profession and an underserved topic. But he has his agenda to preach race pseudoscience and 1920s-type eugenics, and he has written in some ethical grey areas, like stating a named friend's diagnosis and desired course of treatment. He is in a community where many people tell themselves that their substance use is medicinal and want prescriptions. Someone on SneerClub thinks he mixed up psychosis and schizophrenia in a recent post.

If you are in a registered profession like psychiatry, it can be dangerous to casually comment on your colleagues. Regardless, has anyone with relevant qualifications ever commented on his psychiatry blogging and whether it is a good representation of the state of knowledge?

CinnasVerses@awful.systems 11 points 4 months ago

HPMOR chapter 88, from 2010, has the line:

Harry’s brain flagged this as I’m talking to NPCs again and he spun on his heel and dashed back for the broomstick.

Someone who thinks like that will lose in the long run, but they can do a hell of a lot of damage in the short run.

31
submitted 4 months ago* (last edited 4 months ago) by CinnasVerses@awful.systems to c/sneerclub@awful.systems

Bad people who spend too long on social media call normies NPCs, as in video-game non-player characters who follow a closed behavioural loop. Wikipedia says this slur was popular with the Twitter far right in October 2018. Two years before that, Maciej Ceglowski warned:

I've even seen people in the so-called rationalist community refer to people who they don't think are effective as ‘Non Player Characters’, or NPCs, a term borrowed from video games. This is a horrible way to look at the world.

Sometime in 2016, an anonymous coward on 4chan wrote:

I have a theory that there are only a fixed quantity of souls on planet Earth that cycle continuously through reincarnation. However, since the human growth rate is so severe, the soulless extra walking flesh piles around us are NPC’s (sic), or ultimate normalfags, who autonomously follow group think and social trends in order to appear convincingly human.

Kotaku says that this post was rediscovered by the far right in 2018.

Scott Alexander's novel Unsong has an angel tell a human character that there was a shortage of divine light for creating souls so "I THOUGHT I WOULD SOLVE THE MORAL CRISIS AND THE RESOURCE ALLOCATION PROBLEM SIMULTANEOUSLY BY REMOVING THE SOULS FROM PEOPLE IN NORTHEAST AFRICA SO THEY STOPPED HAVING CONSCIOUS EXPERIENCES." He posted that chapter in August 2016 (unsongbook.com). Was he reading or posting on 4chan?

Did any posts on LessWrong use this insult before August 2016?

Edit: In HPMOR by Eliezer Yudkowsky (written in 2009 and 2010), rationalist Harry Potter calls people who don't do what he tells them NPCs. I don't think Yud's Harry says they have no souls but he has contempt for them.

CinnasVerses@awful.systems 7 points 4 months ago

In his early blog posts, Scott Alexander talked about how he was not leaping through higher education in a single bound (he went overseas for medical school and failed to get a medical residency on his first try, ending up in a small Midwestern city). So I wonder why he is sure that, in a world with fewer university degrees, he would have gotten as far as he did (medical schools in the USA used to limit admissions of people of his ethnicity).

Likewise with immigration restrictions: he knows that they often blocked Jews, many Europeans, and East Asians, not just brown people, right?

CinnasVerses@awful.systems 15 points 4 months ago

I know that on the American right every accusation is a confession, but I never thought I would read a scheming cartoon villain accusing his enemies of being the Antichrist! He is even queer-coded; he would have done great on TV in the 1990s.

CinnasVerses@awful.systems 9 points 4 months ago* (last edited 4 months ago)

Hossenfelder seemed like a normal science blogger and critic of string theory until some recent videos, and most people don't update their blogroll every year. And Woit links to her sensible (but defunct) blog, not her out-there videos.

A lot of people in this world have connections going back 15 or 30 years but ended up on opposite sides (e.g. Charlie Stross and Curtis Yarvin, or Laurie Penny and Scott Aaronson).

CinnasVerses@awful.systems 10 points 4 months ago* (last edited 4 months ago)

Bruenig's argument that Piper is ignorant of the founding arguments for the welfare state, and just knows a neoliberal argument for something kind of like a welfare state, reminds me of an exchange with someone of her class where I tried a basic Green argument and they fell into it like I was the first guy to try a judo throw on an American in 1940-something. They flailed wildly, as if they had never encountered that move and did not have a response ready.
