[-] lurker@awful.systems 3 points 5 days ago

Damn, she went in on Yud (not that he doesn’t deserve it)

I also found this article about the doc in the replies of that post https://buttondown.com/maiht3k/archive/a-tale-of-two-ai-documentaries/ and it’s a very interesting read

[-] lurker@awful.systems 3 points 6 days ago* (last edited 6 days ago)

I took a deeper look into the documentary, and it does go into both the pessimist and optimist perspectives, so their inclusion makes more sense. and yeah, I was trying to get at how they're skeptical of the TESCREAL stuff and of current LLM capabilities

[-] lurker@awful.systems 3 points 6 days ago* (last edited 6 days ago)

I poked around the IMDb page, and there are reviews! it's currently sitting at an 8.5/10 with 31 ratings (though no written reviews, it seems). the Metacritic score is 51/100 from 4 reviews, and there are 4 external reviews

[-] lurker@awful.systems 3 points 6 days ago* (last edited 6 days ago)

Sam Altman and the other CEOs being there is such a joke “this technology is so dangerous guys! of course I’m gonna keep blocking regulation for it, I need to make money after all!” Also, I’m shocked Emily Bender and Timnit Gebru are there, aren’t they AI skeptics?

[-] lurker@awful.systems 2 points 6 days ago

Surprised it’s a term they stole and not one they made up. But yeah the whole idea of “AGI will solve all our problems” is just silly

[-] lurker@awful.systems 4 points 6 days ago

what’s the lore with Tyrell?

[-] lurker@awful.systems 5 points 6 days ago

my personal guess is that “apocaloptimist” is just them trying to make a “better” term for “pessimist”

[-] lurker@awful.systems 14 points 1 week ago* (last edited 1 week ago)

Eliezer calling himself genre savvy and above tropes as an actual serious coping mechanism is simply too good not to bring back up. the weirdest way to deny being depressed I've seen

12
submitted 1 week ago* (last edited 1 week ago) by lurker@awful.systems to c/sneerclub@awful.systems

this was already posted on reddit sneerclub, but I decided to crosspost it here so you guys wouldn’t miss out on Yudkowsky calling himself a genre savvy character, and him taking what appears to be a shot at the Zizians

[-] lurker@awful.systems 13 points 2 weeks ago

"It’s horrifying to see normal people you dance with turn into nazis so easily."

…you're calling the people who hate the billionaires nazis? have you seen what the average billionaire has gotten up to recently???

[-] lurker@awful.systems 15 points 2 weeks ago

they actually meant "tip your landlord into the nearest garbage bin"

[-] lurker@awful.systems 15 points 2 weeks ago

bootlicking billionaires when they're the main ones supporting the thing you say is an existential threat is definitely a choice. rationalists seem to be getting more and more mask off in the face of the trump administration

27
submitted 3 weeks ago* (last edited 3 weeks ago) by lurker@awful.systems to c/sneerclub@awful.systems

originally posted in the thread for sneers not worth a whole post, then I changed my mind and decided it is worth a whole post, cause it is pretty damn important

Posted on r/HPMOR roughly one day ago

full transcript:

Epstein asked to call during a fundraiser. My notes say that I tried to explain AI alignment principles and difficulty to him (presumably in the same way I always would) and that he did not seem to be getting it very much. Others at MIRI say (I do not remember myself / have not myself checked the records) that Epstein then offered MIRI $300K; which made it worth MIRI's while to figure out whether Epstein was an actual bad guy versus random witchhunted guy, and ask if there was a reasonable path to accepting his donations causing harm; and the upshot was that MIRI decided not to take donations from him. I think/recall that it did not seem worthwhile to do a whole diligence thing about this Epstein guy before we knew whether he was offering significant funding in the first place, and then he did, and then MIRI people looked further, and then (I am told) MIRI turned him down.

Epstein threw money at quite a lot of scientists and I expect a majority of them did not have a clue. It's not standard practice among nonprofits to run diligence on donors, and in fact I don't think it should be. Diligence is costly in executive attention, it is relatively rare that a major donor is using your acceptance of donations to get social cover for an island-based extortion operation, and this kind of scrutiny is more efficiently centralized by having professional law enforcement do it than by distributing it across thousands of nonprofits.

In 2009, MIRI (then SIAI) was a fiscal sponsor for an open-source project (that is, we extended our nonprofit status to the project, so they could accept donations on a tax-exempt basis, having determined ourselves that their purpose was a charitable one related to our mission) and they got $50K from Epstein. Nobody at SIAI noticed the name, and since it wasn't a donation aimed at SIAI itself, we did not run major-donor relations about it.

This reply has not been approved by MIRI / carefully fact-checked, it is just off the top of my own head.

[-] lurker@awful.systems 14 points 3 weeks ago

jesus fucking christ the depravity never ends. and there's still three million more files of this bullshit

34

I searched for “eugenics” on yud’s xcancel (i will never use twitter, fuck you elongated muskrat) because I was bored, got flashbanged by this gem. yud, genuinely what are you talking about

