Pure speculation, but my guess is that an “apocaloptimist” is just someone fully bought into all of the rationalist AI delulu. Specifically:
- AGI is possible
- AGI will solve all our current problems
- A future where AGI ends humanity is possible, maybe even probable
plus the extra belief, steeped in the grand tradition of liberal optimism, that we will solve the alignment problem and everything will be okay. Again, just guessing here.
I feel like I nailed my guess.