Considering that the whole idea of the AGI singularity was an exponential function going straight up, I don't think this person understands the problem. Lol, LMAO foomed the scorpion.
(Also, that is some gross, weird eugenics shit.)
E: also, isn't IQ a number that gets renormed every now and then, with a common upper bound of around 160? I know the whole post is more intended as vaguely eugenics-aspirational, but still.
Anyway, time to start the lucrative field of HighIQHuman safety research. What do we do if the eugenic superhumans' goals don't align with humanity's?
I have joked before about how people really into stoicism tend to be quite emotional and even a bit risky, as stoicism always seems to be aspirational and doesn't describe the stoic fan's actual behaviour (a good example is the YouTuber Sargon), but this might be a bit of an extreme example.