this post was submitted on 22 Feb 2026
32 points (68.6% liked)
Asklemmy
I get a strong impression that the whole extinction-of-humanity narrative is really just an astroturfed marketing campaign by AI companies. They're basically scaremongering because it gets in the news, and the goal is to convince investors how smart these things are. It's like OpenAI claiming they're on the verge of AGI right before pivoting to horny chatbots. These are useful tools, and I use them day to day, but the hype around them is absolutely incredible.
I think we have plenty of real risks to humanity to worry about, like the US starting a nuclear holocaust. We don't need to waste time worrying about imaginary risks like AGI here.
I'd also argue the whole energy consumption argument is very myopic. The reality is that these things have been getting more and more efficient, and there's little reason to think that won't continue to be the case going forward. It's completely new tech that has only just moved past the proof-of-concept stage, so there's going to be a lot of optimization happening down the road. And even when you contextualize current energy usage, it's not as crazy as people seem to think https://www.simonpcouch.com/blog/2026-01-20-cc-impact/
We're also starting to see stuff like this happening https://www.anuragk.com/blog/posts/Taalas.html
The biggest risk in terms of human extinction is a government allowing an AI to make unchecked military (e.g. nuclear) decisions.
At this point, I'd trust the AI over the clowns running the Burger Reich.
It doesn't look like that energy consumption blog post accounts for the cost of training the model. If it did, it would need to tell us how many queries/sessions are assumed to run over the lifetime of a model.
Model training is a one-off effort. Model usage is what matters, because that's where energy is spent continuously. Also, practically nobody trains models from scratch right now; people tune and extend existing base models.
Training is a continuous expenditure. We're nearly ten years into this craze and still pumping out new models. Whether they're trained from scratch or not is immaterial: both processes consume energy. If you want to justify the claim that training cost is negligible, you'd have to show that this cost is actually going down over time, and going down sufficiently quickly.
Whether they're trained from scratch or not is very much material, because training from scratch takes far more energy. Meanwhile, we consume energy as a civilization in general, and frankly a lot of it goes to far dumber things like advertisements. If you count all the energy that goes into producing and displaying ads, it dwarfs AI energy use. So it's kind of weird to single out AI energy use here.
You know what else takes far less energy than training a single model? One query. Yet you argue that queries are the main contributor to energy consumption. Why? Because there's a very high volume of them, which drives up the total. At the end of the day, it's the total energy consumption that matters, not the cost of doing something once, and the same logic applies to training: look at the total expenditure across all the models being trained, not the cost of a single run.
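The amortization argument both sides are gesturing at can be sketched with toy numbers. Every figure below is hypothetical, chosen only to show the shape of the calculation, not to estimate any real model's footprint:

```python
# Toy comparison: one-off training energy vs. cumulative inference energy.
# All numbers are made up for illustration only.
training_energy_mwh = 1_000        # assumed one-off training cost, in MWh
energy_per_query_wh = 0.3          # assumed energy per query, in Wh
queries_per_day = 1_000_000_000    # assumed daily query volume

# Convert daily inference energy from Wh to MWh (1 MWh = 1e9... no: 1e6 Wh,
# so divide Wh by 1e6 to get MWh).
inference_mwh_per_day = queries_per_day * energy_per_query_wh / 1e6

# How many days of inference it takes to match the one-off training cost.
days_to_match_training = training_energy_mwh / inference_mwh_per_day

print(f"Inference energy per day: {inference_mwh_per_day:.0f} MWh")
print(f"Days of inference to equal training: {days_to_match_training:.1f}")
```

With these assumed values, daily inference comes to 300 MWh, so cumulative query energy overtakes the one-off training cost in a little over three days. Whether that holds in reality depends entirely on the actual training cost, per-query cost, and query volume, which is exactly the data the thread is arguing about.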
We're talking about AI here because that's the topic of this thread. I've never seen anyone say that it's the only problem worth addressing. Plus, if you want to compare energy usage of ads (or anything else) compared to AI, you would first need to know how much energy AI is actually using.
Yes, and my point is that the operational cycle of the model dominates total energy consumption. And it turns out that it's not actually that high in the grand scheme of things, and it continues to improve all the time.
Meanwhile, it's absolutely necessary to contextualize AI energy use in relation to the other ways we use energy to understand whether there's something exceptional happening here or not. All the information for figuring out how much energy AI is using is available. We know how much energy models use, and rough numbers of people using them. So, that's not a big mystery.