top 50 comments
[-] BlueMonday1984@awful.systems 33 points 5 months ago

Do you really think "cult" is a useful category/descriptor here?

My view: things identified as "cults" have a bunch of good traits. EA should, where possible, adopt the good traits and reject the bad ones, and ignore whether they're associated with the label "cult" or not.

Yes, this is real

[-] Soyweiser@awful.systems 21 points 5 months ago* (last edited 5 months ago)

Not only is this real, I think this is a paraphrase of a thing Yud wrote. Which makes it even cultier. (A reason why I called the Rationalismsphere a cult incubator, as their teachings make you more susceptible to getting into cults).

Edit: For example, see his writing on cults, and more: 'Every Cause Wants To Be A Cult'. (Look, Cade Metz is referenced, before they turned on him.)

[-] zbyte64@awful.systems 14 points 5 months ago* (last edited 5 months ago)

Nobody:

EA: Scientology has some good traits, we should copy them.

[-] dgerard@awful.systems 12 points 5 months ago
[-] froztbyte@awful.systems 12 points 5 months ago

“Look, all I’m saying is that Scientology managed to scale massively, both in members and funding! There’s obviously something there that we can learn from!”

[-] o7___o7@awful.systems 11 points 5 months ago* (last edited 5 months ago)

Fuck me...I mean, say what you want about the tenets of Effective Altruism, Dude. At least it's an ethos!

[-] sonori@beehaw.org 26 points 5 months ago

It took me a few comments to realize that we were talking about Effective Altruists and not Electronic Arts. I read EA was becoming a cult and yelling at wokeism for all its troubles, ya, that sounds about right.

[-] Amoeba_Girl@awful.systems 21 points 5 months ago

blaming your issues on a conspiracy is a great way to ensure your movement doesn't become a cult!

[-] jax@awful.systems 21 points 5 months ago* (last edited 5 months ago)

these people can't stop telling on themselves lmao

There’s currently a loud minority of EAs saying that EA should ostracize people if they associate with people who disagree with them. That we should try to protect EAs from ideas that are not held by the majority of EAs.

how fucking far are their heads up their own collective arses to not understand that you can't have a productive, healthy discourse without drawing a line in the sand?

they spend fucking hundreds of collective hours going around in circles on the EA forum debating^[where "debating" here is continually claiming to be "open to criticism" while, at the same time, trashing anyone who does provide any form of legitimate criticism, so much so that it seems to be a "norm" for internal criticism to be anonymous for fear of retribution] this shit, instead of actually doing anything useful

how do they, in good conscience, deny any responsibility for the real harms ideas cause, when they continue to lend them legitimacy by entertaining them over and over and over again?

I swear these fuckers have never actually had to fight for or defend something that is actually important, or directly affects the day-to-day lived experience or material conditions of themselves or anyone they care about

I hope we protect EA’s incredible epistemic norms

lol, the norms that make it a-okay to spew batshit stuff like this? fuck off

Also, it’s obvious that this isn’t actually EA cultiness really, but just woke ideology trying to take over EA

[-] YouKnowWhoTheFuckIAM@awful.systems 10 points 5 months ago

they spend fucking hundreds of collective hours going around in circles on the EA forum debating[1] this shit, instead of actually doing anything useful

how do they, in good conscience, deny any responsibility for the real harms ideas cause, when they continue to lend them legitimacy by entertaining them over and over and over again?

Adderall

[-] mawhrin@awful.systems 20 points 5 months ago

the indignant outrage at the mere suggestion that the cultists be slightly less visibly racist, maybe, is 10/10, no notes.

[-] yuri@pawb.social 20 points 5 months ago

Honestly bonkers to hear “woke” used unironically

[-] dgerard@awful.systems 24 points 5 months ago

and with its original meaning! i.e., awareness of systemic racism

[-] mawhrin@awful.systems 20 points 5 months ago

also, isn't that the co-founder of the “nonlinear” thing?

[-] scruiser@awful.systems 23 points 5 months ago

Which, to recap for everyone, involved underpaying and manipulating employees into working as full time general purpose servants. Which is pretty up there on the scale of cult-like activity out of everything EA has done. So it makes sense she would be trying to pull a switcheroo as to who is responsible for EA being culty...

[-] dgerard@awful.systems 19 points 5 months ago

full time general purpose servant

and fucktoys on the side, this is EA after all

[-] dgerard@awful.systems 14 points 5 months ago

ding ding ding!

[-] sailor_sega_saturn@awful.systems 19 points 5 months ago* (last edited 5 months ago)

what the heck EA forum doesn't have a block feature? That's just... ew.

Also how long have all these people been obsessed with "woke" and "the left"? Because it's been really obvious and over the top lately.

[-] BlueMonday1984@awful.systems 13 points 5 months ago

what the heck EA forum doesn’t have a block feature? That’s just… ew.

You don't need a block feature if you're as insufferable as the average EA /j

[-] borari@lemmy.dbzer0.com 17 points 5 months ago

Can there be a rule about acronyms being defined on first usage? I spent way too long trying to figure out how Electronic Arts had some cult infestation in their upper management and how “wokeism” applied, especially since stuff like that has been in the game developer news cycle again. I started getting really confused when I saw some linked post conversation talking about some founder and their polyamorous relationship.

I’ve figured out that we’re talking about effective altruism, but at this point I’ve wasted my entire pre-bed shit time on people I couldn’t care less about.

[-] froztbyte@awful.systems 26 points 5 months ago

Unfortunately in this case the problem is you (as a non-frequenter of this sub (which is explicitly about dunking on these fools)) coming in with no context, although I’d agree with you in principle otherwise

Also it sounds like that game-EA thing could do with a sneer on techtakes

[-] froztbyte@awful.systems 14 points 5 months ago

(idly, didn't mean that aggressively. wrote it pre-coffee, but, yeah)

[-] blakestacey@awful.systems 11 points 5 months ago

I'm trying to think of a polite way to say "in short, no" and "the linked tweet having 'effectivealtruism' in it twice should have been a clue", because I'm not that mean, but I probably need more coffee too.

[-] o7___o7@awful.systems 14 points 5 months ago* (last edited 5 months ago)

What killed lurking before posting, and can we blame WOKE? (just kidding!)

I do wonder if seamless federation can be too seamless, since it clearly makes it easier for people to get a bit lost and wander into niche forums unintentionally.

[-] earthquake@lemm.ee 12 points 5 months ago

Absolutely magnificent that this guy took a break from posting about OLED WLAN DNS SKU TCL and 800 ISO without bokeh at like f/16 to come in here and chastise us for using in-group terms.

[-] self@awful.systems 12 points 5 months ago

800 ISO without bokeh at like f/16

on film, handheld, at night. for when you love grainy images and need every part of the frame to be blurry, but specifically not the kind of blur that looks good

[-] Soyweiser@awful.systems 17 points 5 months ago* (last edited 5 months ago)

The People Who Pray At Prompts are suddenly very worried about cults.

E: haha spoke too soon

from the replies:

Do you really think "cult" is a useful category/descriptor here?

My view: things identified as "cults" have a bunch of good traits. EA should, where possible, adopt the good traits and reject the bad ones, and ignore whether they're associated with the label "cult" or not.

Wonder what went wrong there, did they just see the words EA and Cult and go 'people are calling Rationalism a cult again, time to deploy the Rationalist answer'? A bot? Something else? (More edit: ah, prob something else, as this is prob a reaction to the whole line of tweets and not that specific tweet, a line of tweets which are doing the geek social fallacies; there is a little bit more to being a cult than just ostracizing people.)

[-] o7___o7@awful.systems 13 points 5 months ago* (last edited 5 months ago)

I've been appreciating the term "high-control group" as an alternative, with reference to the BITE [1] model of thought control. People trapped in the group housing situation easily check all of those boxes.

[1] https://freedomofmind.com/cult-mind-control/bite-model-pdf-download/ (note: Steven Hassan has done good work, but can be a little too profit-seeking himself imho)

[-] dgerard@awful.systems 10 points 5 months ago

good thing Nonlinear has never operated as high-control,

[-] YouKnowWhoTheFuckIAM@awful.systems 17 points 5 months ago

For a moment there I wanted to say, “ok hold on for a minute: you think EA doesn’t create cult-like behaviour, only woke creates cult-like behaviour, but even if I grant all that about woke, surely EVERY charitable enterprise in modern history has tended towards cult behaviour?”

“So what do you think makes EA so goddamn special?”

Then I realised it’s the “incredible epistemic norms” of EA, i.e. the strongest drivers of cult-like behaviour going almost worldwide at the moment, which are the primary bulwark against EA behaving like a cult

[-] dgerard@awful.systems 16 points 5 months ago
[-] BigMuffin69@awful.systems 17 points 5 months ago

Ugh, I feel like I just gazed into the abyss on this one 🤮 . Also love (fucking hate) how the only output from these EA charities is galactic scale fraud and abuse of some poor volunteers. Just the other day I randomly stumbled upon her musing about chat bot suffering without knowing who she was. If only she would give the same consideration to her employees.

[-] dgerard@awful.systems 15 points 5 months ago

chat bot suffering

fuckin

[-] skillissuer@discuss.tchncs.de 16 points 5 months ago

of all fucking people, the audacity

[-] o7___o7@awful.systems 13 points 5 months ago

oh god. these people are exhausting.

[-] dgerard@awful.systems 15 points 5 months ago

it's about the curse of ___ism

[-] Soyweiser@awful.systems 11 points 5 months ago

One objection is that “woke ideology hurts EA cause areas”.

There are many counters to that.

First off, are they actually “woke”? There is a ton of disagreement.

[-] dgerard@awful.systems 18 points 5 months ago

"kicking the racists out might mean less rich racists giving me money for Nonlinear"

[-] froztbyte@awful.systems 17 points 5 months ago

"I just think it's really unfair for my friends and donors to have to feel this uncomfortable because ..."

aaaah, the language of the complicit and the vile. how I loathe thee.

[-] aio@awful.systems 13 points 5 months ago

why tf can't she say the word "racist"? like is it supposed to be a dogwhistle insinuating similarity between "racist" and "leftist"??

[-] Amoeba_Girl@awful.systems 17 points 5 months ago

it's a dirty political word that only intellectual terrorists use. not only is it beneath her to acknowledge social facts, she wants to make it clear she won't be intimidated into considering it.

[-] Soyweiser@awful.systems 16 points 5 months ago* (last edited 5 months ago)

Because the community has racists, sexists, and fascists. None of whom are real, nor should they be avoided or shunned!

[-] froztbyte@awful.systems 11 points 5 months ago* (last edited 5 months ago)

they must be just great at conga crossbars, with that little spine

this post was submitted on 20 Jun 2024
65 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago