[-] swlabr@awful.systems 11 points 7 months ago

Yeah while they’re at it, they should water cool the oceans.

[-] swlabr@awful.systems 11 points 11 months ago* (last edited 11 months ago)

Here’s my cocktail suggestion:

50 Shades (Sam Bankman) Fried:

Ingredients:

  • 50 different liquors/liqueurs, as many different densities and colours as possible

Equipment:

  • Transparent cylindrical urn with spigot

Steps:

  1. Sort the ingredients in decreasing order of liquid density.
  2. Pour 1/50th of the container's capacity of each liquid, in order, into the container, slowly and carefully to retain the layers.

(Do not actually make this)

[-] swlabr@awful.systems 11 points 11 months ago

Despite its title this isn't meant to be the best essay

Oh, thanks for clarifying, that was really hard to figure out. /s

[-] swlabr@awful.systems 11 points 11 months ago

Y'all have nailed how I felt reading this. You have this fellow who has constructed a cognitive process to self-harm at every juncture. Whenever there is an opportunity to believe in something that could bring them joy or closure etc., they head in the opposite direction because of some tenet they think is "correct" or "obvious" even though said tenet usually has a paper-thin justification. Davis is haunted by an Imp of the perverse; that imp is named Yudkowsky.

[-] swlabr@awful.systems 11 points 1 year ago

“I’m sorry, I found someone with a 20000% higher PB on the original Donkey Kong cabinet, it’s over. I’m taking the kids.”

[-] swlabr@awful.systems 11 points 1 year ago

@self@awful.systems interest check thread for an airport book industrial complex sneer instance?

[-] swlabr@awful.systems 11 points 1 year ago

To unpack the post a bit:

So my understanding is that Yud is convinced that the inscrutable matrices (note: just inscrutable to him) in his LLM have achieved sentience. In his near-future world where AI can exert itself in the physical world at will and, in particular, transfer data into your body, what possible use does it have for a bitcoin? What possible benefit would come from reprogramming human DNA beyond the intellectual challenge? I've recently been thinking about how Yud is supposedly the canonical AI-doomer, but his (and the TESCREAL community in general's) AI ideation is rarely more than just third-rate, first-thought-worst-thought sci-fi.

also:

people keep on talking about... the near-term dangers of AI but they never come up with any[thing] really interesting"

Given the current public discourse on AI and how it might be exploited to make the working class redundant, this is just Yud telling on himself for the gazillionth time.

also a later tweet:

right that's the danger of LLMs. they don't reason by analogy. they don't reason at all. you just put a computer virus in one end and a DNA virus comes out the other

Well, consider my priors adjusted, Yud correctly identifies that LLMs don't reason, good job my guy. Yet, somehow he believes it's possible that today's LLMs can still spit out viable genetic viruses. Well, last I checked, no one on stack overflow has cracked that one yet.

Actually, if one of us could write that as a stack overflow question, maybe we can spook Yud. That would be fun.

[-] swlabr@awful.systems 11 points 1 year ago* (last edited 1 year ago)

FR: I originally thought this tweet was some weird, boomer anti-snowflake take, like:

In good old days:

Student: Why my compiler no read comment

Teacher: Listen to yourself, you are an idiot

Modern bad day:

Student: Why my compiler no read comment

Teacher: First, are your feelings hurt?

It took me at least a few paragraphs to realise he was talking about talking to an AI.

[-] swlabr@awful.systems 11 points 1 year ago

It takes a real libertarian to be a government informer.

[-] swlabr@awful.systems 11 points 1 year ago

most charitable psychoanalysis: projecting their sense of rationality onto a fictional world is a way to express a deep longing for rules and logic in an often cruelly irrational world

least charitable: their sense of rationality can only be true in a fictional world, so they want to live in that rather than reality

Neutral charity: the author is dead, all interpretation is essentially fanfiction, and since we are all individuals, all relationships with texts/fanfiction are weird.

[-] swlabr@awful.systems 11 points 1 year ago

It’s gotta be a cult programming thing. X happened to you, you learned Y, but that’s incorrect, you should have learned Z, read this 10000 word manuscript, then come to our learning session/poly orgy and we can become less wrong together

[-] swlabr@awful.systems 11 points 1 year ago* (last edited 1 year ago)

It’s only “due diligence” in the lesswrong region of the internet, otherwise it’s just sparkling willful ignorance

