[-] dgerard@awful.systems 13 points 10 months ago

sadly that was a fork that went nowhere

[-] dgerard@awful.systems 13 points 1 year ago

The key difference is that in previous AI springs, the customer was the DoD, and winter set in when they declined to set more money on fire for an approach that wasn't working.

[-] dgerard@awful.systems 13 points 1 year ago* (last edited 1 year ago)

The services and infrastructure: hosting a web forum

edit: sorry, three web forums

[-] dgerard@awful.systems 13 points 1 year ago

"now do arbitrary homework for me"

[-] dgerard@awful.systems 13 points 1 year ago

Roses are red
Statists are fools
It's such fucking bullshit
I can't live near schools

[-] dgerard@awful.systems 13 points 1 year ago

straight out of Caroline's tumblr

[-] dgerard@awful.systems 13 points 1 year ago

the comments are even dumber

[-] dgerard@awful.systems 13 points 1 year ago

a whore on stage, a comedian in bed

[-] dgerard@awful.systems 13 points 2 years ago

EA: if a neoreactionary looked up "charity" in a dictionary, often literally

[-] dgerard@awful.systems 13 points 2 years ago

that is precisely it, yes. they get called out on this in the comments here

[-] dgerard@awful.systems 13 points 2 years ago

they would wholeheartedly embrace it in proportion to how awful the conclusion was


yes really, that’s literally the title of the post. (archive copy, older archive copy) LessWrong goes full Motte.

this was originally a LW front-page post, and was demoted to personal blog when it proved unpopular. it peaked at +10, dropped to -6 and is +17 right now.

but if anyone tries to make out this isn’t a normative rationalist: this guy, Michael “Valentine” Smith, is a cofounder of CFAR (the Center for Applied Rationality), a LessWrong offshoot that started being about how to do rational thinking … and finally admitted it was about “AI Risk”

this post is the Rationalist brain boys, the same guys who did FTX and Effective Altruism, going full IQ-Anon wondering how the market could fail so badly as not to care what weird disaster assholes think. this is the real Basilisk.

when they’re not spending charity money on buying themselves castles, this is what concerns the modern rationalist

several commenters answered “uh, the customers” and tried to explain the concept of markets to OP, and how corporations like selling stuff to normal people and not just to barely-crypto-fash. they were duly downvoted to -20 by valiant culture warriors who weren’t putting up with that sort of SJW nonsense.

comment by author, who thinks “hard woke” is not only a thing, but a thing that profit-making corporations do so as not to make a profit: “For what it’s worth, I wouldn’t describe myself as leaning right.” lol ok dude

right-wingers really don’t believe in, or even understand, capitalism or markets at all. they believe in hierarchy. that’s what’s offended this dipshit.

now, you might think LessWrong Rationalists, Slate Star Codex readers, etc. tend towards behaving functionally indistinguishably from Nazis, but that’s only because they work so hard at learning from their neoreactionary comrades to reach that stage

why say in 10,000 words what you can say in 14

submitted 2 years ago* (last edited 2 years ago) by dgerard@awful.systems to c/sneerclub@awful.systems

Video games also have potential legal advantages over IQ tests for companies. You could argue that "we only hire people good at video games to get people who fit our corporate culture of liking video games" but that argument doesn't work as well for IQ tests.

yet again an original post title that self-sneers

Universal Watchtowers (awful.systems)
submitted 2 years ago* (last edited 2 years ago) by dgerard@awful.systems to c/sneerclub@awful.systems

by Monkeon, from the b3ta Mundane Video Games challenge

