[-] TinyTimmyTokyo@awful.systems 6 points 1 month ago

Scott talks a bit about it in the video, but he was recently in the news as the guy who refused to sign a non-disparagement agreement when he left OpenAI, which caused them to claw back his stock options.

[-] TinyTimmyTokyo@awful.systems 6 points 1 month ago

I'm fascinated by the way they're hyping up Daniel Kokotajlo to be some sort of AI prophet. Scott does it here, but so does Caroline Jeanmaire in the OP's twitter link. It's like they all got the talking point (probably from Scott) that Daniel is the new guru. Perhaps they're trying to anoint someone less off-putting and awkward than Yud. (This is also the first time I've ever seen Scott on video, and he definitely gives off a weird vibe.)

[-] TinyTimmyTokyo@awful.systems 8 points 1 month ago

This commenter may be saying something we already knew, but it's nice to have the confirmation that Anthropic is chock full of EAs:

(I work at Anthropic, though I don't claim any particular insight into the views of the cofounders. For my part I'll say that I identify as an EA, know many other employees who do, get enormous amounts of value from the EA community, and think Anthropic is vastly more EA-flavored than almost any other large company, though it is vastly less EA-flavored than, like, actual EA orgs. I think the quotes in the paragraph of the Wired article give a pretty misleading picture of Anthropic when taken in isolation and I wouldn't personally have said them, but I think "a journalist goes through your public statements looking for the most damning or hypocritical things you've ever said out of context" is an incredibly tricky situation to come out of looking good and many of the comments here seem a bit uncharitable given that.)

[-] TinyTimmyTokyo@awful.systems 7 points 3 months ago

Lots of discussion on the orange site post about this today.

(I mentioned this in the other sneerclub thread, but reposted it here since this seems to be the more active discussion of the topic.)

[-] TinyTimmyTokyo@awful.systems 6 points 10 months ago

But will my insurance cover a visit to Dr. Spicy Autocomplete?

[-] TinyTimmyTokyo@awful.systems 8 points 1 year ago

Glowfic feels like a writing format designed in a lab to be the perfect channel for Eliezer's literary diarrhea.

[-] TinyTimmyTokyo@awful.systems 8 points 1 year ago

Exactly. It would be easier to take Scott's argument seriously if it weren't coming from the very same person who previously dismissed as unstable, and therefore not credible, a woman who accused his rationalist buddies of sexual harassment -- a woman who, by the way, went on to die by suicide.

So fuck him and his contrived rationalizations.

[-] TinyTimmyTokyo@awful.systems 7 points 1 year ago

"Fucking probabilities, how do they work?"

[-] TinyTimmyTokyo@awful.systems 6 points 2 years ago

I like the way he thinks the lack of punctuation in his "joke" is the tell that it's a joke.

He's also apparently never heard the aphorism that if you have to explain the joke, it's probably not that funny.

[-] TinyTimmyTokyo@awful.systems 7 points 2 years ago

My attention span is not what it used to be, and I couldn't force myself to get to the end of this. A summary or TL;DR from the original author would have been helpful.

What is it with rationalists and their inability to write with concision? Is there a gene for bloviation that also predisposes them to the cult? Or are they all just mimicking Yud's irritating style?

[-] TinyTimmyTokyo@awful.systems 7 points 2 years ago

What's it like to be so good at PR?

[-] TinyTimmyTokyo@awful.systems 6 points 2 years ago* (last edited 2 years ago)

This is good:

Take the sequence {1,2,3,4,x}. What should x be? Only someone who is clueless about induction would answer 5 as if it were the only answer (see Goodman’s problem in a philosophy textbook or ask your closest Fat Tony) [Note: We can also apply here Wittgenstein’s rule-following problem, which states that any of an infinite number of functions is compatible with any finite sequence. Source: Paul Bogossian]. Not only clueless, but obedient enough to want to think in a certain way.
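To make the point concrete (a sketch of my own, not from the quote): any polynomial of the form p(x) = x + c(x-1)(x-2)(x-3)(x-4) reproduces 1, 2, 3, 4 on the first four terms and continues differently for every nonzero c.

```python
# Infinitely many rules fit {1, 2, 3, 4}: each value of c below is a
# different polynomial that agrees on the first four terms but not the fifth.
def p(x, c):
    return x + c * (x - 1) * (x - 2) * (x - 3) * (x - 4)

for c in (0, 1, -2):
    print([p(x, c) for x in (1, 2, 3, 4)], "-> fifth term:", p(5, c))
# c = 0  -> [1, 2, 3, 4] -> 5
# c = 1  -> [1, 2, 3, 4] -> 29
# c = -2 -> [1, 2, 3, 4] -> -43
```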

Also this:

If, as psychologists show, MDs and academics tend to have a higher “IQ” that is slightly informative (higher, but on a noisy average), it is largely because to get into schools you need to score on a test similar to “IQ”. The mere presence of such a filter increases the visible mean and lower the visible variance. Probability and statistics confuse fools.
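The filter effect takes a few lines to simulate (a toy sketch of mine, assuming a normally distributed trait and a hard admission cutoff; the specific numbers are illustrative, not from the quote):

```python
import numpy as np

rng = np.random.default_rng(0)
population = rng.normal(100, 15, 1_000_000)  # trait in the general population
admitted = population[population > 115]      # hard test-score cutoff, ~1 sd

print(f"population: mean={population.mean():.1f}, sd={population.std():.1f}")
print(f"admitted:   mean={admitted.mean():.1f}, sd={admitted.std():.1f}")
# The cutoff alone pushes the admitted mean up (to ~123) and shrinks the
# spread (to ~7), with no causal story about the trait itself required.
```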

And:

If someone came up w/a numerical “Well Being Quotient” WBQ or “Sleep Quotient”, SQ, trying to mimic temperature or a physical quantity, you’d find it absurd. But put enough academics w/physics envy and race hatred on it and it will become an official measure.
