[-] scruiser@awful.systems 14 points 3 weeks ago* (last edited 3 weeks ago)

I've seen this concept mixed with the simulation "hypothesis". The logic goes that if future simulators are running a "rescue simulation" but only cared (or at least cared more) about the interesting or more agentic people (i.e. rich/white/westerner/lesswronger), they might only fully simulate those people and leave simpler nonsapient scripts/algorithms piloting the other people (i.e. poor/irrational/foreign people).

So basically literally positing a mechanism by which they are the only real people and other people are literally NPCs.

[-] scruiser@awful.systems 14 points 1 month ago* (last edited 1 month ago)

Lesswrong and SSC: capable of extreme steelmanning of... check notes... occult mysticism (including divinatory magic), Zen-Buddhism based cults, people who think we should end democracy and have kings instead, Richard Lynn, Charles Murray, Chris Langan, techbros creating AI they think is literally going to cause mankind's extinction...

Not capable of even a cursory glance into their statements, much less steelmanning: sneerclub, Occupy Wallstreet

[-] scruiser@awful.systems 13 points 1 month ago* (last edited 1 month ago)

I feel like lesswrong's front page has what would be a neat concept in a science fiction story at least once a week. Like what if an AGI had a constant record of it's thoughts, but it learned to hide what it was really thinking in them with complex stenography! That's a solid third act twist of at least a B sci-fi plot, if not enough to carry a good story by itself. Except lesswrong is trying to get their ideas passed in legislation and they are being used as the hype wing of the latest tech-craze. And they only occasionally write actually fun stories, as opposed to polemic stories beating you over the head with their moral or ten thousand word pseudo-academic blog posts.

[-] scruiser@awful.systems 14 points 3 months ago* (last edited 3 months ago)

He's set up a community primed to think the scientific establishment's focus on falsifiablility and peer review is fundamentally worse than "Bayesian" methods, and that you don't need credentials or even conventional education or experience to have revolutionary good ideas, and strengthened the already existing myth of lone genii pushing science forward (as opposed to systematic progress). Attracting cranks was an inevitable outcome. In fact, Eliezer occasionally praises cranks when he isn't able to grasp their sheer crankiness (for instance, GeneSmith's ideas are total nonsense for anyone with more familiarity with genetics than skimming relevant-sounding scientific publications and garbage pop-sci journalism, but Eliezer commented favorably). The only thing that has changed is ChatGPT and it's clones glazing cranks first making them even more deluded. And of course, someone (cough Eliezer) was hyping up ChatGPT as far back as GPT-2, so it's only to be expected that cranks would think LLMs were capable of providing legitimate useful feedback.

Not a fan of yud but getting daily emails from delulus would drive me to wish for the basilisk

He's deliberately cultivated an audience willing to hear cranks out, so this is exactly what he deserves.

[-] scruiser@awful.systems 13 points 5 months ago

His fears are my hope, that Trump fucking up hard enough will send the pendulum of public opinion the other way (and then the Democrats use that to push some actually leftist policies through... it's a hope not an actual prediction).

He cultivated this incompetence and worshiped at the altar of the Silicon Valley CEO, so seeing him confronted with Elon's and Trump's clumsy incompetence is some nice schadenfreude.

[-] scruiser@awful.systems 14 points 7 months ago* (last edited 7 months ago)

Soyweiser has likely accurately identified that you're JAQing in bad faith, but on the slim off chance you actually want to educate yourself, the rationalwiki page on Biological Determinism and Eugenics is a decent place to start to see the standard flaws and fallacies used to argue for pro-eugenic positions. Rationalwiki has a scathing and sarcastic tone, but that tone is well deserved in this case.

To provide a brief summary, in general, the pro-eugenicists misunderstand correlation and causation, misunderstand the direction of causation, overestimate what little correlation there actually is, fail to understand environmental factors (especially systemic inequalities that might require leftist solutions to actually have any chance at fixing), and refuse to acknowledge the context of genetics research (i.e. all the Neo-Nazis and alt righters that will jump on anything they can get).

The lesswrongers and SSCs sometimes whine they don't get fair consideration, but considering they take Charles Murray the slightest bit seriously they can keep whining.

[-] scruiser@awful.systems 13 points 1 year ago

His replies have gone up in upvotes substantially since yesterday, so it looks like a bit of light brigading is going on.

[-] scruiser@awful.systems 13 points 1 year ago

Reddit can be really hit or miss, but I'm glad subredditdrama and /r/wikipedia aren't buying TWG's bullshit. Well, some of the /r/wikipedia assume TWG is merely butthurt over losing edit wars as opposed to a more advanced agenda, but that is fair of them.

[-] scruiser@awful.systems 14 points 1 year ago

Nice effort post! It feels like the LLM is pattern matching to common logic tests even when that is the totally incorrect thing to do. Which is pretty strong evidence against LLM's properly doing reasoning as opposed to getting logic test and puzzles and benchmarks right through sheer memorization and pattern matching.

[-] scruiser@awful.systems 13 points 1 year ago

Roko is also violating their rules of assuming charitably and good faith about everything and going meta whenever possible. Because defending racists and racism is fine, as long as your tone is careful enough and you go up a layer of meta to avoid discussing the object level claims.

[-] scruiser@awful.systems 13 points 1 year ago* (last edited 1 year ago)

I don't think even that does it. Richard Hanania, one of Manifest's promoted speakers, wrote "Why Do I Hate Pronouns More Than Genocide?".

[-] scruiser@awful.systems 13 points 2 years ago

The hilarious part to me is that they imagine Eliezer moderates himself or self-censors particularly in response to sneerclub. Like of all the possible reasons why Eliezer may not want to endorse transphobic rhetoric about pronouns (concern about general PR besides sneerclub, a more complex nuanced understanding of language, or even genuine compassion for trans people), sneerclubs disapproval is the one that sticks out to the author. I guess good job on us? Keep it up!

view more: ‹ prev next ›

scruiser

joined 2 years ago