submitted 13 hours ago* (last edited 13 hours ago) by wittycomputer@feddit.org to c/asklemmy@lemmy.ml
[-] howrar@lemmy.ca 2 points 10 hours ago

It doesn't look like that energy consumption blog post accounts for the cost of training the model. If it did, it would have to tell us how many queries/sessions are assumed to run over the lifetime of a model.
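A minimal sketch of the amortization this comment is asking for, with all figures purely illustrative assumptions (not measurements from any real model):

```python
# Amortizing a one-off training cost over a model's lifetime of queries.
# Every number below is a hypothetical placeholder for illustration.

TRAINING_ENERGY_KWH = 1_000_000       # assumed one-off training cost
ENERGY_PER_QUERY_KWH = 0.003          # assumed marginal energy per query
LIFETIME_QUERIES = 10_000_000_000     # assumed queries served before retirement

# Training cost spread across every query the model will ever answer
amortized_training_per_query = TRAINING_ENERGY_KWH / LIFETIME_QUERIES

# Full per-query cost = marginal inference energy + amortized training share
total_per_query = ENERGY_PER_QUERY_KWH + amortized_training_per_query

print(f"amortized training per query: {amortized_training_per_query:.6f} kWh")
print(f"total per query:              {total_per_query:.6f} kWh")
```

The point of the sketch: without an assumed lifetime query count, the training term cannot be amortized, so a per-query figure that omits it is incomplete.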

[-] yogthos@lemmy.ml 3 points 9 hours ago

Model training is a one-off effort. Model usage is what matters, because that's where energy is consumed continuously. Also, practically nobody trains models from scratch right now; people tune and extend existing base models.

[-] howrar@lemmy.ca 2 points 9 hours ago

Training is a continuous expenditure. We're nearly ten years into this craze and we're still continuously pumping out new models. Whether they're trained from scratch or not is immaterial. Both processes still consume energy. If you want to justify the claim that training cost is negligible, you would have to show that this cost is actually going down over time and that it's going down sufficiently quickly.

[-] yogthos@lemmy.ml 3 points 9 hours ago* (last edited 9 hours ago)

Whether they're trained from scratch or not is very much material, because training from scratch takes far more energy. Meanwhile, we consume energy as a civilization in general, and frankly, a lot of it goes to far dumber things like advertisements. If you count all the energy that goes into producing and displaying ads, it dwarfs AI energy use. So it's kind of weird to single out AI energy use here.

[-] howrar@lemmy.ca 2 points 7 hours ago

You know what else takes far less energy than training a single model? One query. Yet you argue that queries are the main contributor to energy consumption. Why? Because there's a very high volume of them, which drives up the total. At the end of the day, it's this total energy consumption that matters, not the cost of doing something once. The same logic applies to training: look at the total energy expenditure across all training runs, not just the cost of a single one.

> So, it's kind of weird to single AI energy use out here as some form of exceptional evil.

We're talking about AI here because that's the topic of this thread. I've never seen anyone say that it's the only problem worth addressing. Plus, if you want to compare the energy usage of ads (or anything else) to that of AI, you first need to know how much energy AI is actually using.
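The "totals, not one-off cost" argument can be sketched in a few lines: a cheap operation done at high volume can rival an expensive operation done rarely. All figures below are hypothetical assumptions chosen only to illustrate the shape of the comparison:

```python
# Totals comparison: per-unit cost x volume, for both queries and training runs.
# Every number is an illustrative assumption, not a measurement.

ENERGY_PER_QUERY_KWH = 0.003             # assumed marginal energy per query
QUERIES_PER_YEAR = 500_000_000_000       # assumed industry-wide query volume

ENERGY_PER_TRAINING_RUN_KWH = 1_000_000  # assumed cost of one training run
TRAINING_RUNS_PER_YEAR = 1_500           # assumed runs across the industry

total_inference_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_YEAR
total_training_kwh = ENERGY_PER_TRAINING_RUN_KWH * TRAINING_RUNS_PER_YEAR

print(f"total inference: {total_inference_kwh:,.0f} kWh/year")
print(f"total training:  {total_training_kwh:,.0f} kWh/year")
```

With these particular made-up numbers the two totals come out equal, which is the crux of the disagreement: whether training is negligible depends entirely on the volume of training runs, not on a single run's cost.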

[-] yogthos@lemmy.ml 3 points 7 hours ago

Yes, and my point is that the operational cycle of the model dominates total energy consumption. And it turns out that it's not actually that high in the grand scheme of things, and it continues to improve all the time.

Meanwhile, it's absolutely necessary to contextualize AI energy use in relation to the other ways we use energy to understand whether there's something exceptional happening here or not. All the information for figuring out how much energy AI is using is available. We know how much energy models use, and rough numbers of people using them. So, that's not a big mystery.

this post was submitted on 22 Feb 2026
27 points (68.5% liked)
