[-] corbin@awful.systems 7 points 1 year ago

At risk of going NSFW, it's obvious that none of these folks have read Singer 1971, which is the paper that kickstarted the EA movement. This paper's argument has a massive fucking hole right in the middle.

Without cracking open the paper, I seem to recall that it is specifically about Oxfam and famine in Africa. The central claim of the paper is that everybody should donate to Oxfam. However, if one is an employee of Oxfam, then suddenly the utilitarian arithmetic fails; his argument only allows for money going from non-Oxfam taxpayers to Oxfam employees.

Can't help but notice how the main problem with EA charities is the fucking nepotism. Almost as if the EA movement rests on a philosophical foundation of ignoring when charities employ friends of donors.

[-] YouKnowWhoTheFuckIAM@awful.systems 2 points 1 year ago* (last edited 1 year ago)

I don’t see how this works.

On one point:

The utilitarian argument construes the relevant ethical concerns, unsurprisingly, as utilitarian: the starting point doesn’t matter so long as the right results get over the line. This can be both one of utilitarianism’s greatest strengths and one of its greatest weaknesses, and in this case the strength is that utilitarianism is highly accommodating of the fact that some but not all people are employees of Oxfam (or indeed any relevant charity or similar organisation). The obvious point to make is that if you’re not an employee of Oxfam, then the utilitarian argument goes through, because giving to Oxfam is your means of getting those results over the line. If you are an employee of Oxfam, then perhaps you don’t need to give, because working for Oxfam is your means.

On another:

The sentence “his argument only allows for money going from non-Oxfam taxpayers to Oxfam employees” doesn’t include the important premise “the role of an Oxfam employee is to convert that money into good deeds done for the poor, for example by using it to pay for food in a famine”. The intended result is the same whether you are an employee of Oxfam or not (viz. paying for food in a famine). You want us to quibble about the wording (or rather: the wording as you have summarised it here) on grounds (which you leave implicit, so correct me if I’m wrong) that it is incoherent to say “everybody” when some people are already employees of Oxfam.

This seems to drastically confuse Singer’s actual aim (to convince the vast majority of people who are not Oxfam employees to give to Oxfam) for something not only very odd but plainly non-utilitarian, something like: “it is a deontological requirement that everybody give money to Oxfam”.

[-] corbin@awful.systems 4 points 1 year ago

I was incorrect; the paper is about famine and aid in Bengal.

NSFW: Here is a PDF of Singer's paper. On p4 you can see the closest he gets to actually doing arithmetic. At that point he does not notice the problem I pointed out; he only notes that we can contribute labor instead of money, without considering that money is what compensates laborers. On p7 he admits that utilitarianism does not give a complete analysis, because it cannot predict a time when charity will no longer be necessary; however, he does not note that many charities are set up to provide eternal grift, including some of the biggest humanitarian-aid charities in the world.

Bonus sneer! Quote from Singer's paper (p9):

Another, more serious reason for not giving to famine relief funds is that until there is effective population control, relieving famine merely postpones starvation. … The conclusion that should be drawn is that the best means of preventing famine, in the long run, is population control. It would then follow from the position reached earlier that one ought to be doing all one can to promote population control (unless one held that all forms of population control were wrong in themselves, or would have significantly bad consequences). Since there are organizations working specifically for population control, one would then support them rather than more orthodox methods of preventing famine.

Isn't Singer so polite to leave us an escape hatch just in case we happen to "[hold] that all forms of population control [are] wrong in themselves"? But we have enough experience to know now that sterilization (USA), rules against too many children (CCP), and straight-up forced starvation (USSR) are inhumane. So while his ignorance could be acceptable in the 70s, I think that our half-century of intervening experience shows that he was, uh, naïve.

[-] jonhendry@awful.systems 6 points 1 year ago

I suspect that was simply Singer's nod to religious opposition to voluntary contraception, and that he wasn't necessarily suggesting the things you list are viable options.

I don’t really get the sneer here; he mentions population control at a time when it was widely believed that overpopulation was a looming problem.

[-] dgerard@awful.systems 5 points 1 year ago

that the eugenics was mainstream doesn't make it not eugenics tho, or mean that it wasn't bad then too

No, but Singer does mean stuff like “supply birth control to people who don’t have birth control” and “make them rich and educated so they have fewer kids”, which, eugenics or not, is a real policy response pursued by governments that had to deal with famine

[-] mawhrin@awful.systems 2 points 1 year ago

singer's a fucking eugenicist though (as any disabled person would tell you).

Right, so go after him for that!

[-] mawhrin@awful.systems 2 points 1 year ago

i mostly keep noting that he's not a serious person and should not be treated as such.

[-] froztbyte@awful.systems 2 points 1 year ago

there was a while when I was working at an org that would occasionally do things with the wikimedia foundation

for similar reasons as what you remark on here: when the walesbegging banners would pop up on wikipedia, I'd only chuckle and move on

[-] self@awful.systems 3 points 1 year ago

a quick supercut of garbage Scott Alexander takes for your sneering convenience:

I bet the billionaires who have donated the majority of their fortune to the cause also enjoy being told it's just so they can get a few percent tax break.

does anyone have the slightest idea what Scott’s talking about here? cause I feel like I’d have heard of a billionaire donating most of their money to any cause, unless my intuition’s right and scott’s playing with words pretending that treating a non-profit as a piggy bank somehow isn’t possible or common

I am of the opinion that most billionaires are good and have nothing to whitewash. But even aside from that, this is the worst possible way to whitewash money. If you spend your money on a yacht, people say "cool yacht".

I… I don’t have words for this one actually, the sneers write themselves

"Doesn't actually do selfless things with the money" - can you explain how this works? My impression is if your foundation spends money on yachts or mansions for yourself, that's plain tax fraud.

a bunch of folks in the thread and the original podcast have already explained exactly how this works, Scott, and it’s also kinda fucking obvious, but I guess don’t let that stop you from JAQing off about it

[-] naevaTheRat@lemmy.dbzer0.com 2 points 1 year ago

For a subreddit of supposedly ultra-rational, dispassionate intellectuals who are willing to consider any idea on its merits, based on pure argument and ignoring the speaker, they sure do spend a lot of words talking about the kind of man Robert is vs what he actually says in the ep.

[-] gerikson@awful.systems 2 points 1 year ago* (last edited 1 year ago)

My rational analysis
Your emotional reaction

[-] dgerard@awful.systems 2 points 1 year ago

I have priors
You have biases
She is toxoplasmotic SJW filth

[-] sharedburdens@hexbear.net 2 points 1 year ago

She is toxoplasmotic SJW filth

If you say this in front of a mirror 3 times, I show up

spoiler: garf-chan

[-] swlabr@awful.systems 2 points 1 year ago

I constantly experience [the Gell-Mann amnesia] effect on this subreddit; everyone sounds so smart and so knowledgeable until they start talking about the handful of things I know a little bit about (leftism, the arts, philosophy) and they’re so far off the mark — then there’s another post and I’ve forgotten all about it

Bias noted, impact not reduced. Basic rationality failed. These people are so willing to discard their own sense of right and wrong, moral or rational, just to belong in their weird cult. Why is it so hard for these dorks to admit that they don't actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?

[-] lobotomy42@awful.systems 7 points 1 year ago

Why is it so hard for these dorks to admit that they don’t actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?

(Not to be trivially reductionist, but...) Because admitting you want friends is unmasculine. IBTP

[-] AllNewTypeFace@leminal.space 6 points 1 year ago

Why is it so hard for these dorks to admit that they don't actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?

Do they just want some likemindedly dorky friends, or a sense of superiority over the normies/NPCs/whatever they call everyone else, complete with extensively footnoted justifications for why they are the cognitive aristocracy?

[-] froztbyte@awful.systems 4 points 1 year ago

Why is it so hard for these dorks to admit that they don’t actually care about being smart or rational and that they just want a bunch of other dorks to be friends with?

because that'd involve talking about feelings, and those are the domain of weird squishy gooey human things, disgusting! :sarcmark:

[-] fasterandworse@awful.systems 1 points 1 year ago

then there’s another post and I’ve forgotten all about it

was so sensible until this twisty tail

[-] swlabr@awful.systems 2 points 1 year ago

Are there crazy people adjacent to the community? Of course, and there are certainly loud billionaires co-opting it to their own purposes, and even some of the people from the beginning have failed to live up to their aspirations.

99% of EA funding went into building a lampshade to hang over this minor quibble.

[-] self@awful.systems 1 points 1 year ago

But dismissing the majority of the community is like saying the entirety of Antifa

drink!

it’s actually amazing to me how many folks in that thread say they love Behind the Bastards but don’t appear to have learned anything from it. it’s all “oh man I thought Robert Evans would love my particular techfash grift run by billionaires” which, like, what did they think the point of the podcast was this whole time?

[-] The_Walkening@hexbear.net 1 points 1 year ago

They say that our whole project is the baby killing machine, but it's not. Do I have other examples? No.

[-] dgerard@awful.systems 1 points 1 year ago

Look you toxoplasmotic SJW filth, Scott donates money personally. You must just hate helping people.

[-] dgerard@awful.systems 1 points 1 year ago

I haven't listened to this podcast, but

[-] gerikson@awful.systems 0 points 1 year ago

Not all Effective Altruists! But! all Antifa.

[-] dgerard@awful.systems 1 points 1 year ago

Antifa just hate helping others u kno
