Interesting quantitative look at web performance and how designs made for people with high-end devices can be practically unusable for people on low-end devices, which disproportionately affects poorer people and people in developing countries. Also discusses how sites game Google's performance metrics—maybe not news to the web devs among ye, but it was new to me. The arrogance of the Discourse founder was astounding.
RETVRN to static web pages.^[Although even static web pages can be fraught—see his other post on speeding up his site 50x by tearing out a bunch of unnecessary crap.]
Also, from one of the appendices:
> In principle, HN should be the slowest social media site or link aggregator because it's written in a custom Lisp that isn't highly optimized and the code was originally written with brevity and cleverness in mind, which generally gives you fairly poor performance. However, that's only poor relative to what you'd get if you were writing high-performance code, which is not a relevant point of comparison here.
Happily, we do have a nice (read-only) static version of Hexbear coded up by our very own @kota@hexbear.net: diethex.net! Here's the announcement post with more details; it's also linked in the sidebar on the home page. Funnily enough, in said announcement post someone links to an article which discusses the very blog post I posted here, so we've come full circle!
(also yes, kota is aware that spoilers don't currently work)
Thank you for the wonderful and informative reply!