dgerard@awful.systems 9 points 2 days ago* (last edited 2 days ago)

Egan already told Hanson to fuck off in a previous story, Zendegi:

"My IQ is one hundred and sixty ... You can always reach me through my blog, Overpowering Falsehood dot com, the number one site for rational thinking about the future —"

submitted 3 weeks ago* (last edited 3 weeks ago) by dgerard@awful.systems to c/sneerclub@awful.systems
submitted 1 month ago* (last edited 1 month ago) by dgerard@awful.systems to c/sneerclub@awful.systems

Thinking about how the arsing fuck to explain the rationalists to normal people - especially as they are now a loud public problem along multiple dimensions.

The problem is that it's all deep in the weeds. Every part of it is "it can't be that stupid, you must be explaining it wrong."

With bitcoin, I have, over the years, simplified it to being a story of crooks and con men. The correct answer to "what is a blockchain and how does it work" is "it's a way to move money around out of the sight of regulators" and maybe "so it's for crooks and con men, and a small number of sincere libertarians" and don't even talk about cryptography or technology.

I dunno what the one sentence explanation is of this shit.

"The purpose of LessWrong rationality is for Yudkowsky to live forever as an emulation running on the mind of the AI God" is completely true, is the purpose of the whole thing, and is also WTF.

Maybe that and "so he started what turned into a cult and a series of cults"? At this point I'm piling up the absurdities again.

The Behind The Bastards approach to all these guys has been "wow these guys are all so wacky haha and also they're evil."

How would you first approach explaining this shit past "it can't be that stupid, you must be explaining it wrong"?

[also posted in sneer classic]

submitted 1 month ago* (last edited 1 month ago) by dgerard@awful.systems to c/sneerclub@awful.systems

yeah i'm sure Matt Levine, qntm and Wildbow are gonna be champing at the bit to attend wordy racist fest

submitted 2 months ago* (last edited 2 months ago) by dgerard@awful.systems to c/sneerclub@awful.systems

While this linear model's overall predictive accuracy barely outperformed random guessing,

I was tempted to write this up for Pivot but fuck giving that blog any sort of publicity.

the rest of the site is a stupendous assortment within a very narrow field of focus, which made this ideal for sneerclub and not just techtakes


the old one is three weeks old, let's start another

previous thread

dgerard@awful.systems 42 points 5 months ago

greatest point against him being a rationalist is the 262-word manifesto; if he were one, it would be 20,000 words with sections numbered I. II. III. and possibly diagrams

dgerard@awful.systems 36 points 6 months ago

"There was a post, [pause], I forget who wrote it" <- the kind of thing I have said several times attempting to avoid leaking rationalist-evidence-bits.

gotta keep that power level under wraps

dgerard@awful.systems 48 points 6 months ago* (last edited 6 months ago)

Keep in mind that, for theologically conservative (“the Bible is historically and spiritually accurate”) Christians like myself,

rationalists whoo

dgerard@awful.systems 37 points 11 months ago* (last edited 11 months ago)

comment:

I was at Manifest as a volunteer, and I also saw much of the same behaviour as you. If I had known scientific racism or eugenics were acceptable topics of conversation there, I wouldn’t have gone. I’m increasingly glad I decided not to organise a talk.

EA needs to recognise that even associating with scientific racists and eugenicists turns away many of the kinds of bright, kind, ambitious people the movement needs. I am exhausted at having to tell people I am an EA ‘but not one of those ones’. If the movement truly values diversity of views, we should value the people we’re turning away just as much.

my brother in the Acausal Robot God, I have some unfortunate news about what your fellow EAs have espoused for the past twenty years.

Austin from Manifest responds that leftist views would obviously be much more damaging to EA than racist ones, because reasons.

dgerard@awful.systems 34 points 11 months ago

and yet you signed up to Awful Systems

dgerard@awful.systems 80 points 11 months ago

Every article on pronatalism in the past few years has been these two Thiel-sponsored white nationalist dweebs and nobody else. Why the fuck does the Guardian fall for this shit too.

dgerard@awful.systems 33 points 1 year ago

I have written about Balaji's amazing brain previously. Wait till you get to the PiTato. He's one of the VCs who was super scared by GPT-3, I think because you could replace his twitter with an LLM and the quality would go up.

dgerard@awful.systems 42 points 1 year ago

tired: polyamory on a basis of mutual love
wired: neoreactionary harem with stack ranking

dgerard@awful.systems 53 points 1 year ago

Catholics, well known for their low fertility

dgerard@awful.systems 58 points 1 year ago

always love it when these guys get so race scientist they start on different groups of white people

dgerard@awful.systems 34 points 1 year ago

and - a key point - his prospective tradwaifus don't want him

