submitted 2 days ago* (last edited 1 day ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read the short story 'Death and the Gorgon' by Greg Egan, as Egan has a good handle on the subjects we talk about here. We have talked about Greg before on Reddit.

I was glad I did, so I'm going to suggest that more people do the same. The only complaint you could have is that it gives no real 'steelman' airtime to the subjects it is being negative about. But well, he doesn't have to; he isn't The Guardian. Anyway, not going to spoil it, best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note: I'm not sure this PDF was intended to be public. I did find it on Google, but it might not be meant to be accessible this way.)

[-] bencurthoys@mastodon.social 3 points 11 hours ago

@Amoeba_Girl @Soyweiser I read https://en.wikipedia.org/wiki/Permutation_City and found that something about it seemed deeply wrong in ways I had trouble articulating.

It's like when you see a bogus mathematical proof of a statement that you know to be false, but the mistake is hidden deep and you can't tell where it has gone wrong; you just know it has.

[-] cstross@wandering.shop 4 points 11 hours ago

@bencurthoys @Amoeba_Girl @Soyweiser I'm pretty sure that about 10-20 years ago Egan came out with a serious repudiation of his own ideas about achieving AI through iterated simulations of less-intelligent entities: he noted that implementing it was implicitly genocidal (by murdering all entities that didn't *quite* meet some threshold set by the experimenters, you'd inevitably kill huge numbers of sentient beings just for failing an arbitrary test).

[-] blakestacey@awful.systems 2 points 4 hours ago

I vaguely recalled a statement of his to that effect and found one here:

What I regret most [about Permutation City] is my uncritical treatment of the idea of allowing intelligent life to evolve in the Autoverse. Sure, this is a common science-fictional idea, but when I thought about it properly (some years after the book was published), I realised that anyone who actually did this would have to be utterly morally bankrupt. To get from micro-organisms to intelligent life this way would involve an immense amount of suffering, with billions of sentient creatures living, struggling and dying along the way. Yes, this happened to our own ancestors, but that doesn’t give us the right to inflict the same kind of suffering on anyone else.

This is potentially an important issue in the real world. It might not be long before people are seriously trying to “evolve” artificial intelligence in their computers. Now, it’s one thing to use genetic algorithms to come up with various specialised programs that perform simple tasks, but to “breed”, assess, and kill millions of sentient programs would be an abomination. If the first AI was created that way, it would have every right to despise its creators.

He even wrote a story on that theme, "Crystal Nights".
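
For anyone who hasn't seen one: the "breed, assess, and kill" loop Egan is describing is the skeleton of a plain genetic algorithm. Here's a minimal toy sketch in Python (every name in it is invented for illustration, and the "organisms" are just bit-strings, nothing sentient; the point is only the shape of the loop):

    import random

    TARGET = [1] * 32                 # the arbitrary test set by the experimenters
    POP_SIZE, GENERATIONS = 100, 200

    def fitness(genome):
        # score a candidate against the arbitrary target
        return sum(g == t for g, t in zip(genome, TARGET))

    def mutate(genome, rate=0.02):
        # flip each bit with small probability
        return [1 - g if random.random() < rate else g for g in genome]

    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # "assess": rank every candidate against the test
        population.sort(key=fitness, reverse=True)
        # "kill": the bottom half is simply discarded, however close it came
        survivors = population[: POP_SIZE // 2]
        # "breed": refill the population with mutated copies of the survivors
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(POP_SIZE - len(survivors))]

    print("best fitness:", fitness(population[0]), "of", len(TARGET))

Egan's point is that the discard step is harmless when the candidates are bit-strings and an abomination when they are minds.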

[-] bencurthoys@mastodon.social 5 points 10 hours ago* (last edited 9 hours ago)

@cstross @Amoeba_Girl @Soyweiser My usual handle when playing online games is "Bickel", because I happened to be re-reading "Destination: Void" at the time I first signed up for my World of Warcraft account, and killing huge numbers of sentient beings in the pursuit of artificial consciousness was definitely not a problem for Frank Herbert =)

[-] Amoeba_Girl@awful.systems 4 points 9 hours ago

Herbert is so obsessed with his particular vision of eugenics that it wraps back around to being endearing. Look at our big boy, building his big torture worlds just so they can roundaboutly excrete one superman. Such a specific, endlessly restated fetish.

[-] gerikson@awful.systems 4 points 10 hours ago

There's a fun/horrifying scene in Ken MacLeod's The Stone Canal where the protagonists revive superhuman intelligences from cold storage, get the answers they need from them, then destroy them with nanotech the superhumans have not developed defenses against. As one of them says when confronted: "standard programming practice, keep the source code, blow away the object code".

(It's partially justified by the fact that, left alone, the superintelligences would just iteratively bootstrap themselves into catatonic insanity anyway.)
