[-] swlabr@awful.systems 12 points 2 months ago

Haven’t read the article yet, but I can only assume the book is “Lolita X,” where they find the cryonically frozen body of Humbert Humbert and bring him to space

[-] swlabr@awful.systems 12 points 3 months ago

I'm sure ol' Petey Tooter has done the calculus to justify why it's worth pouring out a gallon of water every time someone wants to use a worse ctrl+f on his collected works.

[-] swlabr@awful.systems 12 points 7 months ago

Honestly, that’s an edit distance of like 2 from too many trashy harem isekai.

[-] swlabr@awful.systems 12 points 7 months ago

The very same!

[-] swlabr@awful.systems 12 points 7 months ago

Ah, you see, cryonics does increase life expectancy, i.e. E(life length). As long as P(cryobubonics works) > 0, which, according to Yudkowskian Probability Theory, holds for any probability (0 and 1 are not probabilities, remember), then E(life length) = infinity, since cryonics will let us live forever. /big fat fucking S
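
Spelled out, the expected-value gag runs on one assumption doing all the work, namely that revival buys an unbounded lifespan:

$$
E[\text{lifespan}] = p \cdot \infty + (1 - p) \cdot \underbrace{E[\text{lifespan} \mid \text{no revival}]}_{\text{finite}} = \infty \quad \text{for any } p > 0
$$

Any strictly positive p times infinity swamps the finite term, so the “calculation” spits out immortality no matter how microscopic p gets.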

[-] swlabr@awful.systems 12 points 10 months ago

I initially deleted this because these ideas aren’t really that fascist, so I was gonna take a second pass. Then I remembered that thinking of fascist ideas makes me feel awful, so here we are.

[-] swlabr@awful.systems 12 points 1 year ago* (last edited 1 year ago)

If Yud did his cult sermons IRL:

Yud walks out onto the stage, wearing a hat with a 🚫 sign with a period under it

Crowd: “Leader is wearing the silly hat!!! It’s a silly sermon! Prepare to laugh!!!!”

Yud: “There should be a gulag for low-IQ people and Swedes!”

Crowd: “Amazing joke, leader! We must now justify the humor!!!”

[-] swlabr@awful.systems 12 points 1 year ago

My reference point for this kind of extension is the one that replaces “social justice” and “sjw” with “skeleton” and “skeleton warrior.” For example:

“sjws are taking over X” -> “skeleton warriors are taking over X”

Actually, now that I’m typing this, I hope there’s a good one for “woke”.
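
For the curious: the whole gag fits in a few lines. Here’s a minimal userscript-style sketch (hypothetical, not the actual extension’s code; the “woke” entry is a placeholder) that walks the page’s text nodes and applies the swaps:

```typescript
// Word-swap sketch in the spirit of the skeleton-warrior extension.
// Plural form first is unnecessary with \b boundaries, but reads clearer.
const SWAPS: [RegExp, string][] = [
  [/\bsocial justice\b/gi, "skeleton"],
  [/\bsjws\b/gi, "skeleton warriors"],
  [/\bsjw\b/gi, "skeleton warrior"],
  [/\bwoke\b/gi, "skeletal"], // placeholder; pick your own replacement
];

function swapWords(root: Node): void {
  // Visit only text nodes so we never mangle tags or attributes.
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  for (let node = walker.nextNode(); node; node = walker.nextNode()) {
    let text = node.nodeValue ?? "";
    for (const [pattern, replacement] of SWAPS) {
      text = text.replace(pattern, replacement);
    }
    node.nodeValue = text;
  }
}

swapWords(document.body);
```

Drop that into a userscript manager and every “sjws are taking over X” on the page becomes “skeleton warriors are taking over X”.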

[-] swlabr@awful.systems 12 points 1 year ago* (last edited 1 year ago)

Actually, I found this ancient text online that helps explain everything:

Eliezer Monogatari, chapter 69:

On his mighty steed Oververbosity, Eliezer is fast approaching the cliff edge overlooking the dreaded Chasm of Unreadability. His desire to cram more words and IQfulness into his text has deafened him to the wails of the Ghost of Editors Past, begging him to at least paste into Google Docs and fix a few squigglies.

In a moment of rational brilliance, he casts Bayesian Reasoning and epiphanisationalises that he can Just Add Metainfo to every line, exponentially increasing his QLAWPW (quality of life adjusted words per word), making him The Greatest Author. His loyal mount leaps into the air over the abyss, buoyed by clouds of self-delusion.

Of course, he crashes and falls into the chasm. But what's this? The miasma of unreadability... it's more clouds of self-delusion generated by the cult of TREACLES! He rises, unscathed, head three sizes bigger than before.^1^

1: Scholars are divided on whether this is because the toxic miasma mutated his sense of self-worth or if he hit his head on the chasm floor and it swelled. We will never know.

[-] swlabr@awful.systems 12 points 1 year ago* (last edited 1 year ago)

There are some follow-up tweets.

To the commenters shocked that Zvi’s probability numbers don’t come from an “objective calculation”:

Not shocked. At this point I am numb to it though.

  1. Approximating numerical probabilities is required if you want to make coherent decisions.

Average bayesist when they need to pee:

“Hmm, I suspect I need to pee. The decision I now must make is whether or not I should use my toilet or my sink.”

half an hour of examining priors, searching LW for urination sequences, and setting up a prediction market later

“Hmm, it appears I have peed my pants.”

  2. Probability is subjective (a property of our belief-state) for us Bayesians.

Oh my god, he admit it!

[-] swlabr@awful.systems 12 points 1 year ago* (last edited 1 year ago)

Is it just me, or does the author just… not really spend any time trying to defend forced birth? Like, other than quoting counterarguments to abortion defences. It’s like he’s sort of assuming everyone already has ideas about why abortion itself is bad, but finds it permissible for whatever reason. Is this a correct characterisation of the EA community? That they all harbour anti-abortion sentiment but for whatever reason permit abortion?

Overall it reads like a business proposal. Is this how you’re supposed to talk to an EA person? Instead of saying “here is why you should care about x,” you have to pitch them on the potential ROI of caring about it? If so, that’s a fucking frustrating way to think about the world, and this was a fucking awful article to read, just like every other treacles-y long-form logorrhoea you get from these people.

[-] swlabr@awful.systems 12 points 1 year ago

It's cute how Yud is trying to construct a dichotomy between EA AI X-riskers and e/acc AI chuds like Andreessen, as if one side is correct and the other isn't. Call me a centrist if you like: both sides bad.

