I think Eliezer Yudkowsky & many posts on LessWrong are failing at keeping things concise and to the point.
The replies: "Kolmogorov complexity", "Pareto frontier", "reference class".
It occurs to me that, intentionally or not, he's probably steering TESCREAL types to Wikipedia itself as well. I wouldn't be surprised if accounts were coming out of the woodwork to post multi-kiloword screeds about Wikipedia being soooo unfairrrr....
The Singularity (of hating that we know what those words mean) Is Near
Mastodon has Reply Guys. Lemmy has Cater To Me Whilst I Am Literally, Not Figuratively, Taking a Shit Guys.
"Computational complexity does not work that way!" is one of those TESCREAL-zone topics that I wish I had better reading recommendations for.
Quoted for posterity/convenience:
in a world of greater legibility, romantic partners would have the conversation about "I'd trade up if I found somebody 10%/25%/125% better than you" in advance, and make sure they have common knowledge of the numbers
(Marriage makes sense as a promise not to do that, period; but if so, you want to make sure that both partners are on the same page about that. Not everyone assumes that marriage means that.)
Her: I am never, ever letting you go unless I find someone 75% better. Me: Works for me.
oh hello there Performative Allistic Twitter
I suspect that this is less about using language with which one's audience is familiar to convey a message accurately, and more about making the message sound obviously right and affirming the smartness of the audience because Computer Words.
my "not a cult" T-shirt has raised many questions, etc.
I just can't respect a man who is posturing and arrogant yet still fails to go for the phrasing "to whom you are speaking".
The opening line is... certainly a phrase.
I have been working on a research project into the scale, tractability and neglectedness of child marriage.
Later:
Some studies even showed that child marriage was associated with more positive outcomes, such as higher contraceptive use
Ummmmmmmmmm
Suppose you say that you’re 99.99% confident that 2 + 2 = 4.
Then you're a dillbrain.
Then you have just asserted that you could make 10,000 independent statements, in which you repose equal confidence, and be wrong, on average, around once. Maybe for 2 + 2 = 4 this extraordinary degree of confidence would be possible
Yes, how extraordinary that I can say every day that the guy in front of me at the bodega won't win the Powerball. Or that [SystemRandom().random() >= 0.9999 for i in range(10000)] makes a list that is False in all but, on average, one spot.
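(That one-liner needs an import to actually run; here's a minimal, runnable version, where the print and the comments are mine:

from random import SystemRandom

# 10,000 draws from the OS RNG; each clears 0.9999 with probability about 1 in 10,000,
# so the list comes out True in roughly one spot on average -- wrong "around once," as the quote puts it.
hits = [SystemRandom().random() >= 0.9999 for i in range(10000)]
print(sum(hits))  # typically 0, 1, or 2

Nothing extraordinary required: everyday claims at that confidence level are cheap.)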
P(x|y) is defined as P(x,y)/P(y). P(A|A) is defined as P(A,A)/P(A) = P(A)/P(A) = 1. The ratio of these two probabilities may be 1, but I deny that there's any actual probability that's equal to 1. P(|) is a mere notational convenience, nothing more.
No, you kneebiter.
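(For whatever it's worth, the definition the commenter leans on checks out by construction; a toy sketch of my own, where the 0.3 threshold and the 100,000 draws are arbitrary:

from random import SystemRandom

# Estimate P(A | A) = P(A, A) / P(A) by counting, with A = "a uniform draw lands below 0.3".
rng = SystemRandom()
draws = [rng.random() for _ in range(100_000)]
p_a = sum(d < 0.3 for d in draws) / len(draws)
p_a_and_a = sum(d < 0.3 and d < 0.3 for d in draws) / len(draws)  # A intersected with itself is just A
print(p_a_and_a / p_a)  # exactly 1.0, since numerator and denominator are the same count

The ratio is 1 no matter which event A you pick; whether that counts as "an actual probability equal to 1" is precisely what the quoted comment is trying to deny.)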