In a recent Hard Fork (Hard Hork?) episode, Casey Newton and Kevin Roose described attending "The Curve" -- a conference in Berkeley organized and attended mostly by our very best friends. When asked about the most memorable session he attended at the conference, Casey said:

That would have been a session called If Anyone Builds It, Everyone Dies, which was hosted by Eliezer Yudkowsky. Eliezer is sort of the original doomer. For a couple of decades now, he has been warning about the prospect of superintelligent AI.

His view is that there is almost no scenario in which we could build a superintelligence that wouldn't either enslave us or hurt us -- kill all of us, right? So he's been telling people from the beginning: we should probably just not build this. And so you and I had a chance to sit in with him.

People fired a bunch of questions at him. And we should say, he's a really polarizing figure, and I think he's sort of on one extreme of this debate. But I think he was also really early in understanding a lot of the harms that have, bit by bit, started to materialize.

And so it was fascinating to spend an hour or so sitting in a room and hearing him make his case.

[...]

Yeah, my case for taking these folks seriously, Kevin, is that this is a community that, over a decade ago, started to make a lot of predictions that just basically came true, right? They started to look at advancements in machine learning and neural networks and started to connect the dots. And they said, hey, before too long, we're going to get into a world where these models are incredibly powerful.

And all that stuff just turned out to be true. So that's why they have credibility with me, right? Now, everything they believe could still turn out to be wrong, you know -- we could hit some sort of limit that they didn't see coming.

Their model of the world could sort of fall apart. But as they have updated it bit by bit, and as these companies have made further advancements and they've built new products, I would say that this model of the world has basically held so far. And so, if nothing else, I think we have to keep this group of folks in mind as we think about, well, what is the next phase of AI going to look like for all of us?

Excerpt:

A new study published on Thursday in The American Journal of Psychiatry suggests that dosage may play a role. It found that among people who took high doses of prescription amphetamines such as Vyvanse and Adderall, there was a fivefold increased risk of developing psychosis or mania for the first time compared with those who weren’t taking stimulants.

Perhaps this explains some of what goes on at LessWrong and in other rationalist circles.

Maybe she was there to give Moldbug some relationship advice.

[-] TinyTimmyTokyo@awful.systems 15 points 5 months ago* (last edited 5 months ago)

I'm noticing that people who criticize him on that subreddit are being downvoted, while he's being upvoted.

I wouldn't be surprised if, as part of his prodigious self-promotion of this overlong and tendentious screed, he's steered some of his more sympathetic followers to some of these forums.

Actually, it's the Wikipedia subreddit thread I meant to refer to.

[-] TinyTimmyTokyo@awful.systems 16 points 5 months ago

Trace seems a bit... emotional. You ok, Trace?

[-] TinyTimmyTokyo@awful.systems 27 points 5 months ago

So now Steve Sailer has shown up in this essay's comments, complaining about how Wikipedia has been unfairly stifling scientific racism.

Birds of a feather and all that, I guess.

[-] TinyTimmyTokyo@awful.systems 16 points 5 months ago

what is the entire point of singling out Gerard for this?

He's playing to his audience, which includes a substantial number of people with lifetime subscriptions to the Unz Review, Taki's crapazine and Mankind Quarterly.

[-] TinyTimmyTokyo@awful.systems 20 points 5 months ago

why it has to be quite that long

Welcome to the rationalist-sphere.

[-] TinyTimmyTokyo@awful.systems 27 points 5 months ago

Scott Alexander, by far the most popular rationalist writer besides perhaps Yudkowsky himself, had written the most comprehensive rebuttal of neoreactionary claims on the internet.

Hey Trace, since you're undoubtedly reading this thread, I'd like to make a plea. I know Scott Alexander Siskind is one of your personal heroes, but maybe you should consider digging up some dirt on him too. You might learn a thing or two.

[-] TinyTimmyTokyo@awful.systems 16 points 6 months ago

Until a month ago, TW was the long-time researcher for "Blocked and Reported", the podcast hosted by Katie 'TERF' Herzog and relentless sealion Jesse Singal.

OK doomer (www.newyorker.com)

The New Yorker has a piece on the Bay Area AI doomer and e/acc scenes.

Excerpts:

[Katja] Grace used to work for Eliezer Yudkowsky, a bearded guy with a fedora, a petulant demeanor, and a p(doom) of ninety-nine per cent. Raised in Chicago as an Orthodox Jew, he dropped out of school after eighth grade, taught himself calculus and atheism, started blogging, and, in the early two-thousands, made his way to the Bay Area. His best-known works include “Harry Potter and the Methods of Rationality,” a piece of fan fiction running to more than six hundred thousand words, and “The Sequences,” a gargantuan series of essays about how to sharpen one’s thinking.

[...]

A guest brought up Scott Alexander, one of the scene’s microcelebrities, who is often invoked mononymically. “I assume you read Scott’s post yesterday?” the guest asked [Katja] Grace, referring to an essay about “major AI safety advances,” among other things. “He was truly in top form.”

Grace looked sheepish. “Scott and I are dating,” she said—intermittently, nonexclusively—“but that doesn’t mean I always remember to read his stuff.”

[...]

“The same people cycle between selling AGI utopia and doom,” Timnit Gebru, a former Google computer scientist and now a critic of the industry, told me. “They are all endowed and funded by the tech billionaires who build all the systems we’re supposed to be worried about making us extinct.”

In her sentencing submission to the judge in the FTX trial, Barbara Fried argues that her son is just a misunderstood altruist who doesn't deserve to go to prison for very long.

Excerpt:

One day, when he was about twelve, he popped out of his room to ask me a question about an argument made by Derek Parfit, a well-known moral philosopher. As it happens, I am quite familiar with the academic literature Parfit's article is a part of, having written extensively on related questions myself. His question revealed a depth of understanding and critical thinking that is not all that common even among people who think about these issues for a living. "What on earth are you reading?" I asked. The answer, it turned out, was that he was working his way through the vast literature on utilitarianism, a strain of moral philosophy that argues that each of us has a strong ethical obligation to live so as to alleviate the suffering of those less fortunate than ourselves. The premises of utilitarianism obviously resonated strongly with what Sam had already come to believe on his own, but gave him a more systematic way to think about the problem and connected him to an online community of like-minded people deeply engaged in the same intellectual and moral journey.

Yeah, that "online community" we all know and love.

[-] TinyTimmyTokyo@awful.systems 20 points 10 months ago

You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin ran an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed, and took seriously, people like Katja Grace, Dan Hendrycks and Eliezer Yudkowsky.

I used to be more sanguine about people's ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. This LessWrong post is a perfect example.

[-] TinyTimmyTokyo@awful.systems 15 points 10 months ago

Happy Valentine's Day everybody!

submitted 11 months ago* (last edited 11 months ago) by TinyTimmyTokyo@awful.systems to c/sneerclub@awful.systems

Pass the popcorn, please.

(nitter link)

[-] TinyTimmyTokyo@awful.systems 20 points 11 months ago

Imagine thinking there is actually some identifiable thing called "white culture". As if a skin color defines a culture.

Yeah, sounds like a Nazi.

[-] TinyTimmyTokyo@awful.systems 17 points 1 year ago

What a bunch of monochromatic, hyper-privileged, rich-kid grifters. It's like a nonstop frat party for rich nerds. The photographs and captions make it obvious:

The gang going for a hiking adventure with AI safety leaders. Alice/Chloe were surrounded by a mix of uplifting, ambitious entrepreneurs and a steady influx of top people in the AI safety space.

The gang doing pool yoga. Later, we did pool karaoke. Iguanas everywhere.

Alice and Kat meeting in “The Nest” in our jungle Airbnb.

Alice using her surfboard as a desk, co-working with Chloe’s boyfriend.

The gang celebrating… something. I don’t know what. We celebrated everything.

Alice and Chloe working in a hot tub. Hot tub meetings are a thing at Nonlinear. We try to have meetings in the most exciting places. Kat’s favorite: a cave waterfall.

Alice’s “desk” even comes with a beach doggo friend!

Working by the villa pool. Watch for monkeys!

Sunset dinner with friends… every day!

These are not serious people. Effective altruism in a nutshell.

They've been pumping this bio-hacking startup on the Orange Site (TM) for the past few months. Now they've got Siskind shilling for them.

Effective Obfuscation (newsletter.mollywhite.net)

Molly White is best known for shining a light on the silliness and fraud that are cryptocurrency, blockchain and Web3. This essay may be a sign that she's shifting her focus to our sneerworthy friends in the extended rationalism universe. If so, that's an excellent development. Molly's great.

submitted 1 year ago* (last edited 1 year ago) by TinyTimmyTokyo@awful.systems to c/sneerclub@awful.systems

Not 7.5% or 8%. 8.5%. Numbers are important.

[-] TinyTimmyTokyo@awful.systems 15 points 1 year ago* (last edited 1 year ago)

Roko's authoritative-toned "aktshually..." response to Annie's claims has me fuming. I don't know why. I mean, I've known for years that this guy is a total boil on the ass of humanity. And yet he still manages to shock with the worst possible take on a topic -- even when the topic is the sexual abuse of a child. If, like Roko, I were to play armchair psychiatrist, I'd diagnose him as a sociopath with psychopathic tendencies. But I'm not. So I won't.

submitted 1 year ago* (last edited 1 year ago) by TinyTimmyTokyo@awful.systems to c/sneerclub@awful.systems

Rationalist check-list:

  1. Incorrect use of analogy? Check.
  2. Pseudoscientific nonsense used to make your point seem more profound? Check.
  3. Tortured use of probability estimates? Check.
  4. Over-long description of a point that could just as easily have been made in one sentence? Check.

This email by SBF is basically one big malapropism.

submitted 1 year ago* (last edited 1 year ago) by TinyTimmyTokyo@awful.systems to c/sneerclub@awful.systems

[All non-sneerclub links below are archive.today links]

Diego Caleiro, who popped up on my radar after he commiserated with Roko over the latest in Roko's never-ending stream of denials that he's a sex pest, is worthy of a few sneers.

For example, he thinks Yud is the bestest, most awesomest, coolest person to ever breathe:

Yudkwosky is a genius and one of the best people in history. Not only he tried to save us by writing things unimaginably ahead of their time like LOGI. But he kind of invented Lesswrong. Wrote the sequences to train all of us mere mortals with 140-160IQs to think better. Then, not satisfied, he wrote Harry Potter and the Methods of Rationality to get the new generation to come play. And he founded the Singularity Institute, which became Miri. It is no overstatement that if we had pulled this off Eliezer could have been THE most important person in the history of the universe.

As you can see, he's really into superlatives. And Jordan Peterson:

Jordan is an intellectual titan who explores personality development and mythology using an evolutionary and neuroscientific lenses. He sifted through all the mythical and religious narratives, as well as the continental psychoanalysis and developmental psychology so you and I don’t have to.

At Burning Man, he dons a 7-year-old alter ego named "Evergreen". Perhaps he has an infantilization fetish, like Elon Musk:

Evergreen exists ephemerally during Burning Man. He is 7 days old and still in a very exploratory stage of life.

As he hinted in his tweet to Roko, he has an enlightened view about women and gender:

Men were once useful to protect women and children from strangers, and to bring home the bacon. Now the supermarket brings the bacon, and women can make enough money to raise kids, which again, they like more in the early years. So men have become useless.

And:

That leaves us with, you guessed, a metric ton of men who are no longer in families.

Yep, I guessed about 12 men.

