From Stockholm Impact/Watch 2023, here's a must-watch:
We have moved to https://lemm.ee/c/collapse -- please adjust your subscriptions
This is the place for discussing the potential collapse of modern civilization and the environment.
Collapse, in this context, refers to the significant loss of an established level of complexity towards a much simpler state. It can occur differently across many areas, orderly or chaotically, and be willing or unwilling. It does not necessarily imply human extinction or a singular, global event. That said, the longer the duration, the more it resembles a 'decline' rather than a collapse.
RULES
1 - Remember the human.
2 - Link posts should come from a reputable source.
3 - All opinions are allowed but discussion must be in good faith.
4 - No low-effort posts.
"Metacrisis", huh. As a name for the thing I like "the Long Emergency" better. AI in its currently-existing forms is important only in that it's yet another refinement that can have some use in enhancing the efficiency of the systems we've been building up for the past few hundred years, but we are at the point where the diminishing returns from such novelties will no longer be effective in staving off disaster for any appreciable length of time. Large language models are already passé.
I find 'metacrisis' more descriptive and satisfying for the reasons Daniel talked about in the video: it's not just the many crises we face, but the underlying systems that are creating those crises (i.e., Moloch). Also, it doesn't matter whether AI is effective at staving off disaster; as long as it creates value for the market, it will be deployed at mind-boggling scale and resource use even as the world burns.
I think this speaker, and the people pushing the term "metacrisis" in general, misunderstand humanity's predicament in a way that leads them to rate too highly the potential of those AI systems that have been made so far (that we know of). It's interesting technology, but its potential threats are even more over-hyped lately than its potential benefits. We have better things to do than worry too much about either.
What do you think he's getting wrong about our predicament? AI wasn't really a focus in this talk, just showing how we knowingly develop dangerous tools and act against our collective interests because of our system of incentives and multipolar traps.
I just don't think that the "exponential tech curve" is all that exponential or all that relevant a factor compared to for example the pretty low-tech way in which we're burning ungodly quantities of fossil fuels and using the energy thus produced to eat the whole planet. It's not only AI that I think is over-hyped, it's many of the things I saw when scanning through the video transcript. Finely-tuned supply chains, genetically modified crops, ridiculous financial system fuckery, and other such things are increasingly required to keep it all barely chugging along, but it seems to me that they and "Tech" in general are not the cause of or the solution to our problems unless you go back to technology and modes of social organization invented in the 19th century and before. Crooked Timber: "As Cosma said, the true Singularity began two centuries ago at the commencement of the Long Industrial Revolution."
But then again, a substantial part of my reaction was prompted by things I read when searxing the word "metacrisis", so perhaps I'm not being entirely fair to this video.
Fair enough, all good points, but the reason we're all here beyond the natural carrying capacity and eating the planet is the exponential tech curve (Haber-Bosch and others) that we've been on since discovering fossil fuels. If the energy is there, we will use it to grow to the detriment of everything else; it's in our nature. If we somehow manage to complete the green energy transition, that's probably even worse for our long-term survival, because instead of running out of accessible fossil fuels and being forced to degrow, we'll keep the growth machine running, accelerate this mass extinction, and knock down the rest of the planetary boundaries. All new technologies will allow us to increase the scale of our impact on the planet.
Thanks for the link, I haven't read it yet but it looks interesting.
Yeah, it's the "tech curve" being exponential that I don't see happening. The Haber-Bosch process (dating from just after the end of the 19th century) was revolutionary, and a fine example of the kind of rapid increase in our ability to exploit the hell out of everything that hasn't been happening so much lately. The growth in technologically-enabled power may have looked exponential at a certain point, but I don't believe it has continued like that all the way to the present; the gains today are more incremental, less momentous. The rise and fall of Moore's law shows the same pattern in microcosm: a great new invention, rapid improvement, exponential growth that people assume will last forever, then its limits are approached and further progress in that direction becomes slow and complicated. When I try to imagine how the total curve of technological power has gone, I can't avoid the impression that its rate of growth topped out somewhere around the mid-20th century at the latest.
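The pattern described here, growth that looks exponential early on but then saturates, is the classic logistic curve. A minimal sketch (all parameters illustrative, not real technology data) shows how indistinguishable the two are at first and how far apart they end up:

```python
import math

def exponential(t, r=0.5):
    """Pure exponential growth: the early-Moore's-law trajectory, extrapolated forever."""
    return math.exp(r * t)

def logistic(t, r=0.5, k=100.0):
    """Logistic growth: tracks the exponential early on, then flattens
    as it approaches the carrying capacity k (the physical limits)."""
    return k / (1 + (k - 1) * math.exp(-r * t))

# Early on the two curves are nearly identical...
print(round(exponential(2), 2), round(logistic(2), 2))    # → 2.72 2.67
# ...but later the logistic curve saturates while the exponential runs away.
print(round(exponential(20), 2), round(logistic(20), 2))  # → 22026.47 99.55
```

The danger is that from inside the early portion of the curve there is no way to tell which one you are on.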
Sure, it'd be disastrous if we somehow kept up the exponential growth in energy use, production, and population for any great length of time even without fossil fuels, but the idea that this might actually happen starting from this level seems more like a techno-optimist fantasy than any kind of realistic scenario worth considering. Like some other ideas from the Consilience Project, it seems to me as if it might be more relevant to some future post-collapse civilisation that needs to avoid making the same mistakes that will bring down ours.
I admit I'm on the verge of losing sight of the overall point of this thread. But thinking about it more, I will add that the shape of the tech curve itself may not be that important, depending on where you draw the line between technology and capability. For example, it may not matter that the increase in transistor density is slowing down if global total compute keeps increasing exponentially. Further, how would quantum computing factor into this? (The migration underway in the cryptography space suggests a post-quantum world is at least being taken seriously.) On the topic of LLMs, would it matter if those stagnate while the ability of companies and states to manipulate us and drown us in misinformation keeps growing exponentially? And how would the advent of AGI factor in? In some ways that would be the last invention we would have to make ourselves. I guess the point is that some advancements, even incremental ones, seem to have an outsized effect on our ability to impact the world around us.
Anyway, I read the article you linked and enjoyed it. It reminded me a bit of Meditations on Moloch, another attempt to explain the behavior of civilizations; a good read if you haven't seen it yet.
That there's a difference between transistor density and total compute capacity does seem crucial. We can and often do have more microprocessors in the world, and more automobiles, more coal mines, more injection-molded plastic, more batteries, more bombs, more of everything without any improvement in technology whatsoever. Just more of the same, mass-produced by the ever-expanding machine. If there does exist some curve that measures overall tech progress in a useful way, the curve which measures its applications and their effects is necessarily offset from it in time and need not follow its shape. It seems natural to expect our thrust into ecological overshoot to continue for some time after the big impulse that pushed us into it has begun to fade. Much like the "macabre whale analogy" from Scott Alexander there, in which we're enjoying the results of a whalefall.
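The point about more-of-the-same can be made with toy numbers (entirely illustrative, not real shipment figures): freeze per-chip capability completely, and total compute still grows exponentially as long as the machine keeps stamping out more chips each year.

```python
# Toy model: per-chip performance stops improving in year 0, but the
# number of chips produced keeps growing 20% per year -- so total
# compute keeps growing exponentially with zero "tech progress".
PER_CHIP = 1.0   # frozen technology curve (arbitrary units)
GROWTH = 1.20    # assumed annual growth in units produced

def total_compute(years, initial_units=1_000_000):
    """Total capacity = (units in the world) x (frozen per-unit capability)."""
    units = initial_units * GROWTH ** years
    return units * PER_CHIP

# After a decade of frozen technology, capacity is still ~6.19x larger.
print(total_compute(10) / total_compute(0))
```

Which is exactly the offset in time described above: the applications curve can keep climbing long after the technology curve has flattened.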
On a more optimistic note, I think history demonstrates that art, music, and love are not so easily done away with, although I'm not so sure about philosophy. Scarcity is going to make a comeback, and the results will be as unpredictable as anything people imagine AGI might do if it actually arrives some day. I suspect we should ideally aim to make some changes to what is normally thought of as "human nature" in order to avoid the worst outcomes, but that they need not be larger than changes that have happened in the past. But who the hell knows, really.