[-] titotal@awful.systems 14 points 7 months ago

I'm sure they could have found someone in the EA ecosystem to throw them money if it weren't for the fundraising freeze. This seems like a case of Oxford killing the institute deliberately. The 2020 freeze predates the Bostrom email, and this guy who was consulted by Oxford said the relationship had been dysfunctional for many years.

It's not like Oxford is hurting for money; they probably just decided FHI was too much of a pain to work with and was hurting the Oxford brand.

[-] titotal@awful.systems 13 points 7 months ago

The committed Rationalists often point out the flaws in science as currently practiced: the p-hacking, the financial incentives, etc. Feeding them more data about where science goes awry will only make them more smug.

The real problem with the Rationalists is that they *think they can do better*: that knowing a few cognitive fallacies and logical tricks will make you better than the doctors at medicine, better than the quantum physicists at quantum physics, etc.

We need to explain that yes, science has its flaws, but it still shits all over pseudobayesianism.

[-] titotal@awful.systems 14 points 8 months ago

To be honest, I'm just kinda annoyed that he ended on the story about his mate Aaron, who went on surfing trips to Indonesia and gave money to his new poor village friends. The author says Aaron is "accountable" to the village, but that's not true, because Aaron is a comparatively rich first-world academic who can go home at any time. Is Aaron "shifting power" to the village? No, because if they don't treat him well, he'll stop coming to the village and stop funding their water supply upgrades. And he personally benefits, with praise and friendship, from his purchases.

I'm sure Aaron is a fine guy, and I'm not saying he shouldn't give money to his village mates, but this is not a good model for philanthropy! I would argue that a software developer who just donates a bunch of money unconditionally to the village (via GiveDirectly or something) is more noble than Aaron here: donating without any personal benefit or feel-good surfer energy.

[-] titotal@awful.systems 13 points 8 months ago

I enjoyed the takedowns (wow, this guy really hates MacAskill), but the article's overall conclusions seem a bit lost. If malaria nets are like a medicine with side effects, then the solution is not to throw away the medicine. (Giving away free nets to people probably does not have a significant death toll!) At the end they seem to suggest, like, voluntourism as the preferred alternative? I don't think Africa needs to be flooded with dorky software engineers personally going to villages to "help out".

[-] titotal@awful.systems 11 points 9 months ago

> years later was shown to be correct

Take a guess at what prompted this statement.

Did one side of the conflict confess? Did major expert organizations change their minds? Did new, conclusive evidence arise that had gone unseen for years?

Lol no. The "confirmation" is that a bunch of random people did their own analysis of the existing evidence and decided it was the rebels, based on a vague estimate of rocket trajectories. I have no idea who these people are, although I think the lead author is this guy currently stanning for Russia's war on Ukraine?

[-] titotal@awful.systems 14 points 9 months ago

The sole funder is the founder, Saar Wilf. The whole thing seems like a vanity project for him and friends he hired to give their opinion on random controversial topics.

[-] titotal@awful.systems 11 points 9 months ago

The video and slides can be found here. I watched a bit of it as it happened, and it was pretty clear that Rootclaim got destroyed.

Anyone actually trying to be "Bayesian" should have updated their opinion by multiple orders of magnitude as soon as it was fully confirmed that the wet market was the first superspreader event. Like, how does Occam's razor not kick in here?
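To spell out the arithmetic (a toy sketch in odds form; the `prior_odds` and `bayes_factor` numbers below are made-up illustrations, not anyone's actual estimates):

```python
# Toy odds-form Bayesian update. All numbers are made up for illustration;
# they are NOT Rootclaim's (or anyone's) actual figures.

prior_odds = 100.0           # hypothetical prior odds, lab leak : market origin
bayes_factor = 1.0 / 1000.0  # hypothetical likelihood ratio:
                             # P(evidence | lab leak) / P(evidence | market origin)

# Bayes' theorem in odds form: posterior odds = prior odds * likelihood ratio
posterior_odds = prior_odds * bayes_factor
posterior_prob = posterior_odds / (1.0 + posterior_odds)

print(f"posterior odds (lab leak : market) = {posterior_odds}")      # 0.1
print(f"posterior probability of lab leak  = {posterior_prob:.3f}")  # ~0.091
```

Even a 100:1 prior in favor of a hypothesis drops below a 10% posterior after a single thousand-to-one likelihood ratio against it; that's the scale of update being demanded here.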

[-] titotal@awful.systems 14 points 11 months ago

Thanks! I strive for accuracy, clarity, humility, and good faith. AKA, everything I learned not to do from reading the Sequences.

[-] titotal@awful.systems 12 points 11 months ago* (last edited 11 months ago)

EA as a movement was a combination of a few different groups (this account says Giving What We Can/80,000 Hours, GiveWell, and Yudkowsky's MIRI). However, the main early influx of people came from the rationalist movement, as Yud had heavily promoted EA-style ideas in the Sequences.

So if you look at surveys, right now a relatively small percentage (like 15%) of EAs first heard about it through LessWrong or SSC. But back in 2014 and earlier, LessWrong was the number one on-ramp into the movement (like 30%). (I'm sure a bunch of the other answers had heard about it from rationalist friends as well.) I think it would have been even more if you went back earlier.

Nowadays, most of the recruiting is independent of the rationalists, so you have a bunch of people coming in and being like, what's with all the weird shit? However, they still adopt a ton of rationalist ideas and language, and the EA forum is run by the same people as LessWrong. It leads to some tension: someone wrote a post saying that "Yudkowsky is frequently, confidently, egregiously wrong", and it was somewhat upvoted on the EA forum but massively downvoted on LessWrong.

[-] titotal@awful.systems 11 points 1 year ago

If you want more of this, I wrote a full critique of his mangled intro to quantum physics, where he forgets the whole "conservation of energy" thing.

[-] titotal@awful.systems 12 points 1 year ago

My impression is that the toxicity within EA is mainly concentrated in the Bay Area rationalists and in a few of the actual EA organizations. If it's just a local meetup group, it's probably just going to be some regular-ish people with mistaken beliefs who are genuinely concerned about AI.

Just be polite and present arguments, and you might actually change minds, at least among those who haven't been sucked too far into Rationalism.

[-] titotal@awful.systems 9 points 1 year ago* (last edited 1 year ago)

Current physicist here: yeah, most physicists are in the "shut up and calculate" camp, and view the interpretations as fun lunchroom conversation.

I also think that collapse is unsatisfying, and I think Yud did an adequate job of relaying the reasons why a lot of physicists are unhappy with it. The problem is that "collapse is unsatisfying" is not sufficient evidence to declare that MWI is true and that MWI nonbelievers are fools. The obvious point being that there are a shitload of other interpretations which feature neither many-worlds nor "real" collapse. The other point is that MWI is an incomplete theory, as it has no explanation for the Born probabilities. Also, we know we don't have the full picture of quantum physics anyway (it's incompatible with general relativity), so it's possible that if we figure out a unified theory, the problems with interpretations will go away.
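(For anyone wondering, the "Born probabilities" are the standard quantum measurement probabilities; this aside is my addition, not something from the original thread:)

```latex
% Born rule: for a state |psi> expanded in the eigenbasis {|i>} of the
% measured observable, the probability of obtaining outcome i is the
% squared magnitude of that component's amplitude.
|\psi\rangle = \sum_i c_i \, |i\rangle
\qquad\Longrightarrow\qquad
P(i) = |\langle i | \psi \rangle|^2 = |c_i|^2
```

MWI gets the branching structure from unitary evolution alone, but there's no agreed-upon derivation within it of why the branch weights $|c_i|^2$ should behave as probabilities.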
