[-] naevaTheRat@lemmy.dbzer0.com 14 points 1 year ago

Poor dude really seems to be confused by the hostility he's been met with. Rationalists aren't anywhere near as interested in being right as in feeling right. He won't make much progress that way

One might say

he should update his priors

[-] TerribleMachines@awful.systems 12 points 1 year ago

At the risk of being NSFW.

When I met Yud some years ago, I asked him how he goes about learning new things. His answer was roughly: "Scroll on Facebook until I find someone who has written about it." Maybe he actually read some of the sources he references a long time ago, but I think he gave up on learning new things and has sat comfortably abusing his power over the community.

Egads these people are gross.

[-] self@awful.systems 11 points 1 year ago

ah, this is the same storied research mechanism my mom used to get into David Icke and telephone psychics

[-] jonhendry@awful.systems 7 points 1 year ago

Good lord I can't imagine learning about things on Facebook.

[-] 200fifty@awful.systems 10 points 1 year ago* (last edited 1 year ago)

There’s something infuriating about this. Making basic errors that show you don’t have the faintest grasp on what people are arguing about, and then acting like the people who take the time to get Ph.Ds and don’t end up agreeing with your half-baked arguments are just too stupid to be worth listening to is outrageous.

Hey, that's what we've been saying for years!

[-] corbin@awful.systems 9 points 1 year ago

The Coco Chanel meme is quite funny, given that the writer seems too young to know much about her other than that she's some sort of fashion lady. (There's a Behind the Bastards series on her upbringing, business attitude, and collaboration with Nazis.)

Not only do I not understand how the Landauer limit works, I don’t even know what it is.

Points for honesty, I guess? But also demerits for not at least reading the Wikipedia article. Rationalists are so quick to write paragraphs explaining that they didn't read paragraphs.

[-] gerikson@awful.systems 5 points 1 year ago

Re: Coco Chanel, it's an uncomfortable fact that huge swaths of French society (particularly the more conservative parts) were quite OK with German involvement in French governance, at least until the forced labor requirements started sending people to work in Germany. The Third Republic was hardly a model social democracy, and if the Nazis hadn't been such incompetent overlords we might have seen a coal and steel union decades before it happened, with Vichy France being an integral part of a Nazi-led European union.

Instead the Nazis looted most of France and made it quite clear that the French were going to be second-class citizens forever, and once they started looking less unbeatable everyone was part of the Resistance.

[-] dgerard@awful.systems 7 points 1 year ago

look at the British aristocracy, who are still full of fucking Nazis

[-] gerikson@awful.systems 3 points 1 year ago

For sure, there were many who would have been prepared to cut a deal with Hitler: let him have Europe (and to hell with the strategy of not letting any major power dominate there) in return for the inviolability of the empire.

[-] dgerard@awful.systems 3 points 1 year ago

i mean they were huge fans and think Britain joined the wrong side of the war

[-] froztbyte@awful.systems 4 points 1 year ago

I recall reading a great twitter thread a while back that covered a lot of the nazi interplay in british high society in the years prior to ww2. really should re-find that and get it archived (and/or find some other primary sources to read about it)

[-] swlabr@awful.systems 9 points 1 year ago

All right, pack it up boys, guess we don’t need the ol’ sneer club no more.

[-] dgerard@awful.systems 11 points 1 year ago

oh they run one of these "wait ... this sucks" posts every year or so. don't worry, they never have the slightest effect. and notice how the guy still thinks scott is a good poster.

[-] Evinceo@awful.systems 6 points 1 year ago

He said he's fed up with Yudkowsky, not fed up with Moldbug.

[-] dgerard@awful.systems 3 points 1 year ago

he still thinks Scott is awesome for instance

[-] swlabr@awful.systems 6 points 1 year ago

I don't know whether I should feel happy that I will have a sustainable snark receptacle for the near future or sad that the basilisk won't eventually consume itself tail-first.

[-] froztbyte@awful.systems 6 points 1 year ago

with all this saltiness about it, the basilisk might be inedible at this point. or at least an acquired taste.

[-] mwenge@mastodon.social 7 points 1 year ago

@dgerard "In the days of my youth, about two years ago.." 😂

[-] dgerard@awful.systems 7 points 1 year ago

it's important to remember that the rationalist recruits tend to be terrifyingly young.

[-] bitofhope@awful.systems 8 points 1 year ago

On the flipside, there's hope for many of them to grow and reconsider their positions. Whomst among us wasn't a dumbass as a teenager?

[-] jonhendry@awful.systems 5 points 1 year ago

Not like that.

[-] zogwarg@awful.systems 5 points 1 year ago* (last edited 1 year ago)

It was so funny that I read it as an intentional self-deprecating joke (and by extension towards the community); maybe I am too optimistic about human nature ^^ (especially of the Ratty kind).

[-] Architeuthis@awful.systems 7 points 1 year ago* (last edited 1 year ago)

Not sure if it's an NSFW assertion, but to me the p-zombie thought experiment seems like the result of a discourse that went off the rails very early and very hard into angels-on-the-head-of-a-pin territory, this LW post notwithstanding.

Like, as far as I can tell, imagining a perfectly cloned reality except with the phenomenon in question assumed away is supposedly (metaphysical) evidence that the phenomenon exists, except in a separate ontology? Isn't this basically like using reverse Occam's razor to prove that the extra entities are actually necessary, at least as long as they somehow stay mostly in their own universe?

Plus, the implicit assumption that consciousness can be defined as some sort of singular and uniform property you either have or don't seems inherently dodgy and also to be at the core of the contradiction; like, is taking p-zombies too seriously a reaction specifically to a general sense of disappointment that a singular consciousness organelle is nowhere to be found?

[-] scruiser@awful.systems 4 points 1 year ago

I kinda agree, but the post does correctly point out that Eliezer ignored a lot of the internal distinctions between philosophical positions and ignored how the philosophers use their own terminology. So even though I also think p-zombies are ultimately an incoherent thought experiment, I don't think Eliezer actually did a good job addressing them.

[-] zogwarg@awful.systems 4 points 1 year ago

Careful, you're agreeing with Yud there! ^^
There's a reason it's not a completely dead topic in philosophy, since it's at least somewhat of an interesting question as it relates to monism/physicalism vs dualism (more than trying to litigate any specific organelle, or its presence or lack thereof in animals). Though a much shorter debunking of Yud's views is that the possibility of p-zombies is an unfalsifiable premise, so claiming any foolproof counter is not so Rational™ (although by the same token, it is indeed a topic of ultimately limited value)

[-] swlabr@awful.systems 6 points 1 year ago

If it weren't for the fact that many people in the ratspace are privileged, I'd feel sad for them. They are frogs in a well and crabs in a bucket. They exist in a solipsistic pit, thinking their worldview is built from pure logic and not their individual experience. This is all well-trodden ground; we know LW et al. is a cult. Such a strange and specific way to hamstring yourself, to self-lobotomise. Something something Plato's cave, qualia, lobster social hierarchy reference.

[-] Evinceo@awful.systems 5 points 1 year ago

In my final year of high school debate

Most self-aware rationalist.

I’m not an expert about X, but it seems like most of the experts about X think X or are unsure about it. The fact that Eliezer, who often veers sharply off-the-rails, thinks X gives me virtually no evidence about X. Eliezer, while being quite smart, is not rational enough to be worthy of significant deference on any subject, especially those subjects outside his area of expertise. Still though, he has some interesting things to say about AI and consequentialism that are sort of convincing. So it’s not like he’s wrong about everything or is a total crank. But he’s wrong enough, in sufficiently egregious ways, that I don’t really care what he thinks.

So close to being deprogrammed. So close. It's like when a kid finds out about the Easter Bunny but somehow still clings to Santa.

He links to this article on Yudkowsky being wrong (warning: it's so long it has a whole 'why write this' section), which amuses me.

Making basic errors that show you don’t have the faintest grasp on what people are arguing about, and then acting like the people who take the time to get Ph.Ds and don’t end up agreeing with your half-baked arguments are just too stupid to be worth listening to is outrageous.

This, but for AI lol.

If anyone would like to have a debate about this on YouTube...

LW equivalent of fight me irl bro

[-] self@awful.systems 8 points 1 year ago

In the days of my youth, about two years ago, I was a big fan of Eliezer Yudkowsky.

In fact, Eliezer’s memorable phrasing that the many worlds interpretation “wins outright given the current state of evidence,” was responsible for the title of my 44-part series arguing for utilitarianism titled “Utilitarianism Wins Outright.”

this poster accidentally paints such an accurate picture of the average young rationalist you can almost taste it (and it isn’t delicious)

also, I’m no physicist, but the quote about MWI winning outright has always struck me as an extremely poor approach to science, especially given (to my current knowledge at least, physicists please correct me) the lack of solid proof pointing to MWI being correct. like a lot of things, yud seems to like MWI because a multiverse is a fun base for ~~a pseudoscientific cult~~ his Harry Potter fanfiction. the other quotes in this post don’t do any better, even when the poster is trying to use them to compliment yud.

that this poster took a shitty quote about yud doing science poorly and turned it into a 44-part series named Utilitarianism Wins Outright is just chef’s kiss

[-] naevaTheRat@lemmy.dbzer0.com 4 points 1 year ago

Failed physicist here: Collapse interpretation always seemed a bit unscientific in general to me. I am quite possibly wrong because it's not my field but I haven't seen any currently testable hypotheses come out of it.

There's not zero merit in this sort of galaxy-brain thinking, and it's satisfying to have some kind of model rather than just a series of disjointed facts, but the polarization of amateurs on this always seemed strange to me. Like sportsball fans thinking the other teams want to kill each other, rather than the event being mutual play.

I've never met someone who actually does physics that had a very strong opinion one way or the other. A lot of "MWI seems elegant but we can't know yet" or "Collapse is a bit weird and unsatisfying isn't it?". Maybe when you get to the giganerds and their chalkboards the shivs come out but I've seen no evidence.

Besides, we should all be focusing on how time is mathematically hideous and thus clearly not fundamental.

[-] titotal@awful.systems 9 points 1 year ago* (last edited 1 year ago)

Current physicist here: yeah, most physicists are in the "shut up and calculate" camp, and view the interpretations as fun lunchroom conversation.

I also think that collapse is unsatisfying, and I think yud did an adequate job of relaying the reasons why a lot of physicists are unhappy with it. The problem is that "collapse is unsatisfying" is not sufficient evidence to declare that MWI is true and that MWI nonbelievers are fools. The obvious point being that there are a shitload of other interpretations which feature neither many-worlds nor "real" collapse. The other point is that MWI is an incomplete theory, as there is no explanation for the Born probabilities. Also, we know we don't have the full picture of quantum physics anyway (as it's incompatible with general relativity), so it's possible that if we figure out a unified theory the problems with interpretations will go away.

[-] naevaTheRat@lemmy.dbzer0.com 4 points 1 year ago

Yeah, I think its "science enthusiast" popularity is mostly because it can be misunderstood to imply that a bunch of SciFi pop culture is more plausible than it is. Couple that with the nuances of various interpretations being lost on anyone who hasn't actually done the maths and you have a recipe for Batman vs Superman type disagreements.

I threw electrons at colour centres on nanodiamonds to try to make them more nano-er, so while quantum shit was involved I never felt particularly compelled to have a high degree of certainty in the (is epistemology the right word?) behind the maths that obviously worked.

[-] dgerard@awful.systems 5 points 1 year ago

I note the objection is not so much to Yudkowsky's particular opinions, but that Yudkowsky clearly doesn't understand the questions or indeed some of the terms. (This is made clearer in the comments on the blog post version, which appear to come from actual philosophers.)

[-] mutual_ayyde@mastodon.social 5 points 1 year ago

@dgerard reading the bit on decision theory and I'm reminded of that one anecdote about the decision theorist who asked their friend if they should propose to their wife.

[-] mutual_ayyde@mastodon.social 7 points 1 year ago

@dgerard I'm sure in some domains decision theory works fine, but lordy, for actual practical life advice just read some virtue ethicist. The solution to being Dutch book'd isn't becoming a perfect Bayesian, it's to not take stupid bets.
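
(For anyone who hasn't run into the term: a Dutch book is a set of bets, each priced at your own stated credences, that together guarantee you a loss; it's the standard argument for keeping your probabilities coherent. Below is a minimal sketch in Python, with made-up numbers purely for illustration; it's not from the linked post.)

```python
# Toy illustration of a Dutch book: if your credences for an event and its
# complement sum to more than 1, a bookie can sell you bets at prices you
# yourself consider fair and guarantee you lose money no matter what happens.

def dutch_book_loss(p_event: float, p_complement: float, stake: float = 1.0) -> float:
    """Guaranteed loss from buying both bets at your own quoted prices.

    Each bet costs credence * stake (the price a coherent Bayesian would
    call fair) and pays out `stake` if its side wins. Exactly one side
    wins, so the total payout is always `stake`.
    """
    total_paid = (p_event + p_complement) * stake
    return total_paid - stake

# Incoherent credences: P(rain) = 0.7 and P(no rain) = 0.5 sum to 1.2,
# so you pay 1.2 for bets that can only ever return 1.0.
print(dutch_book_loss(0.7, 0.5))  # ~0.2 lost with certainty
```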

[-] gerikson@awful.systems 4 points 1 year ago* (last edited 1 year ago)

Next time someone confidently suggests Yud's work I'll point them to this. Dueling argumentum ad auctoritatem! May the biggest nerd win (spoiler: it's Yud, he's the biggest)
