Saying anything in particular makes you open to fact-based criticism — I mean, it is object-level and bad, instead of meta-level and good.
Langan himself has ~~showed up~~ graced the comment section with his benevolent presence!
The CTMU is not conjectural, but a lock. So as much as I'd like to humbly efface myself in an outpouring of false modesty, I'll merely point out that arguing with the CTMU amounts to undermining one's own argumentation, whatever it may be. People have been trying to get over on the CTMU for the last 35 or so years, and not one has ever gotten to first base. This was not an accident. If you think you see a mistake or critical inadequacy, the mistake and the inadequacy are almost certainly yours.
Too long? Not long enough? Either way, every day is somehow worse than the one before.
"Because, Lana, I care about the fluffiness of my baked goods."
In an essay that somehow manages to offhandedly mention both evolutionary psychology and hentai anime in the same paragraph.
I doubted whether it would be a good use of time to read Michael Lewis’s new book Going Infinite about Sam Bankman-Fried (hereafter SBF or Sam). What would I learn that I did not already know? Was Michael Lewis so far in the tank for SBF that the book was filled with nonsense and not to be trusted?
I set up a prediction market,
10/10 perfect LessWrong, no notes
all alignment and no play makes Jack a dull boy
Buh-bye now.
A joke I heard in the last century: Give a professor a nickel and they'll talk for an hour. Give 'em a quarter and you'll be in real trouble.
I will try to have some more comments about the physics when I have time and energy. In the meanwhile:
Entropy in thermodynamics is not actually a hard concept. It's the ratio of the size of a heat flow to the temperature at which that flow is happening. (So, joules per kelvin, if you're using SI units.) See episodes 46 and 47 of The Mechanical Universe for the old-school PBS treatment of the story. The last time I taught thermodynamics for undergraduates, we used Finn's Thermal Physics, for the sophisticated reason that the previous professor used Finn's Thermal Physics.
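In symbols: for a reversible heat flow $\delta Q$ at temperature $T$, $dS = \delta Q / T$; when the temperature stays constant, that integrates to
$$\Delta S = \frac{Q}{T}.$$
Stock worked example (standard numbers, quoted from memory): melting 1 kg of ice at 273 K absorbs about 334 kJ, so $\Delta S \approx 334{,}000\,\mathrm{J}/273\,\mathrm{K} \approx 1.2\,\mathrm{kJ/K}$.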
Entropy in information theory is also not actually that hard of a concept. It's a numerical measure of how spread-out a probability distribution is.
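If you want to see the "spread-out-ness" with your own eyes, here's a minimal sketch in plain Python (the function name is mine; nothing beyond the standard library):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum of p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The more spread out the distribution, the higher the entropy.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (maximally spread over 4 options)
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits (concentrated on one option)
```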
It's relating the two meanings that is tricky and subtle. The big picture is something like this: A microstate is a complete specification of the positions and momenta of all the pieces of a system. We can consider a probability distribution over all the possible microstates, and then do information theory to that. This bridges the two definitions, if we are very careful about it. One thing that trips people up (particularly if they got poisoned by pop-science oversimplifications about "disorder" first) is forgetting the momentum part. We have to consider probabilities, not just for where the pieces are, but also for how they are moving. I suspect that this is among Vopson's many problems. Either he doesn't get it, or he's not capable of writing clearly enough to explain it.
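Spelled out, the bridge is the Gibbs entropy: put a probability $p_i$ on each microstate $i$ (positions and momenta both!) and compute the Shannon-type sum, with Boltzmann's constant $k_B$ doing the unit conversion to joules per kelvin:
$$S = -k_B \sum_i p_i \ln p_i.$$
In the special case of a uniform distribution over $\Omega$ equally likely microstates, this collapses to Boltzmann's $S = k_B \ln \Omega$.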
I have never heard of anything important being published there. I think it's the kind of journal where one submits a paper after it has been rejected by one's first and second (and possibly third) choices.
Oh, it's worse than "outlandish". It's nonsensical. He's basically operating at a level of "there's an E in this formula and an E in this other formula, so I will set them equal and declare it revolutionary new physics".
Here's a passage from the second paragraph of the 2023 paper:
wat
Storing a message in a system doesn't make new microstates. How could it? You're just rearranging the pieces to spell out a message — selecting those microstates that are consistent with that message. Choosing from a list of available options doesn't magically add new options to the list.
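A toy version of the counting argument, with a deliberately tiny system (the setup here is mine, chosen for illustration):

```python
from itertools import product

# Toy system: 8 two-state pieces (coins, spins, bits; take your pick).
N = 8
microstates = list(product([0, 1], repeat=N))
print(len(microstates))  # 256: fixed by the system, before any message exists

# "Storing a message" means selecting the arrangement(s) that spell it out.
message = (1, 0, 1, 1, 0, 0, 1, 0)
consistent = [s for s in microstates if s == message]
print(len(consistent))   # 1: we chose from the list; the list did not grow
```

The state space has $2^8 = 256$ entries whether or not anyone writes anything into it; writing just picks some of them out.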