scruiser@awful.systems 15 points 3 weeks ago

You need to translate them into lesswrongese before you try interpreting them together.

probability: he made up a number to go with his feelings about a topic

subjective: the number is even more made up and feelings-based than is normal for lesswrong

noticeable: the number is really tiny, but big enough for Eliezer to fearmonger about!

No, you don't get to actually know what the number is; otherwise you could penalize Eliezer for predicting it wrongly or question why that number specifically. Just trust that the bayesianified language shows Eliezer thought really hard about it.

scruiser@awful.systems 18 points 1 month ago

Keep in mind the author isn't just (or even primarily) counting the ultra-wealthy and establishment politicians as "elites"; they are also including scientists trying to educate the public on their areas of expertise (COVID, global warming, environmentalism, etc.) and sociologists/psychologists explaining problems the author either wants to ignore or is outright in favor of (racism/transphobia/homophobia).

scruiser@awful.systems 15 points 1 month ago

No, he's in favor of human slavery, so he still wants to keep naming schemes evocative of it.

scruiser@awful.systems 13 points 1 month ago

His fears are my hope: that Trump fucking up hard enough will send the pendulum of public opinion the other way (and then the Democrats use that to push some actually leftist policies through... it's a hope, not an actual prediction).

He cultivated this incompetence and worshiped at the altar of the Silicon Valley CEO, so seeing him confronted with Elon's and Trump's clumsy incompetence is some nice schadenfreude.

scruiser@awful.systems 15 points 2 months ago* (last edited 2 months ago)

My favorite comment in the lesswrong discussion: https://www.lesswrong.com/posts/DfrSZaf3JC8vJdbZL/how-to-make-superbabies?commentId=oyDCbGtkvXtqMnNbK

It's not that eugenics is a magnet for white supremacists, or that rich people might give their children an even more artificially inflated sense of self-worth. No, the risk is that the superbabies might be Khan and kick-start the Eugenics Wars. Of course, this isn't a reason not to make superbabies; it just means the idea needs some more workshopping via Red Teaming (hacker lingo is applicable to everything).

scruiser@awful.systems 14 points 2 months ago* (last edited 2 months ago)

Soyweiser has likely accurately identified that you're JAQing in bad faith, but on the slim off chance you actually want to educate yourself, the RationalWiki page on Biological Determinism and Eugenics is a decent place to start for seeing the standard flaws and fallacies used to argue for pro-eugenics positions. RationalWiki has a scathing and sarcastic tone, but that tone is well deserved in this case.

To give a brief summary: in general, the pro-eugenicists confuse correlation with causation, get the direction of causation backwards, overestimate what little correlation there actually is, fail to account for environmental factors (especially systemic inequalities that might require leftist solutions to have any real chance of being fixed), and refuse to acknowledge the context of genetics research (i.e. all the neo-Nazis and alt-righters who will jump on anything they can get).

The lesswrongers and SSCers sometimes whine that they don't get fair consideration, but considering they take Charles Murray the slightest bit seriously, they can keep whining.

scruiser@awful.systems 14 points 11 months ago

Nice effort post! It feels like the LLM is pattern-matching to common logic tests even when that is totally the wrong thing to do. That's pretty strong evidence against LLMs properly reasoning, as opposed to getting logic tests, puzzles, and benchmarks right through sheer memorization and pattern matching.

scruiser@awful.systems 23 points 11 months ago

Which, to recap for everyone, involved underpaying and manipulating employees into working as full-time general-purpose servants. That's pretty far up the scale of cult-like activity out of everything EA has done, so it makes sense she would be trying to pull a switcheroo as to who is responsible for EA being culty...

scruiser@awful.systems 13 points 11 months ago

Roko is also violating their rules about assuming charity and good faith in everything and going meta whenever possible. Because defending racists and racism is fine, as long as your tone is careful enough and you go up a layer of meta to avoid discussing the object-level claims.

scruiser@awful.systems 16 points 11 months ago

Did you misread, or are you making a joke (sorry, the situation is so absurd it's hard to tell)? Curtis Yarvin is Moldbug, and he was the one hosting the afterparty (he didn't attend the Manifest conference himself). So apparently there were racists too cringy even for Moldbug-hosted parties!

scruiser@awful.systems 14 points 1 year ago* (last edited 1 year ago)

ghost of 2007!Yud

This part gets me the most. Current-day Yud isn't transphobic (enough? idk), so Zack has to piece together his older writings on semantics and epistemology to get a more transphobic, gender-essentialist version of past Yud.

scruiser@awful.systems 13 points 1 year ago

The hilarious part to me is that they imagine Eliezer moderates himself or self-censors specifically in response to sneerclub. Of all the possible reasons why Eliezer may not want to endorse transphobic rhetoric about pronouns (concern about general PR beyond sneerclub, a more complex and nuanced understanding of language, or even genuine compassion for trans people), sneerclub's disapproval is the one that sticks out to the author. I guess good job on us? Keep it up!
