I like to think Fermi had it down, and we're just really hesitant to embrace the great filter conjecture. As each day passes, I find more evidence that the sole purpose of intelligent life is to become intelligent enough to destroy itself.
Maybe there's a limit, like with size. You can't have creatures that are too big, because at some point they just die under their own weight. Maybe you also can't have creatures that are too smart, because at some point they just die because of their own hubris.
I like this. We as a species avoided nuclear fire just to go the slow cooker route.
Going nuclear is still an option.