Training is a continuous expenditure. We're nearly ten years into this craze and we're still continuously pumping out new models. Whether they're trained from scratch or not is immaterial. Both processes still consume energy. If you want to justify the claim that training cost is negligible, you would have to show that this cost is actually going down over time and that it's going down sufficiently quickly.
It doesn't look like that energy consumption blog post accounts for the cost of training the model. If it did, it would have to tell us how many queries/sessions are assumed to run over the lifetime of a model.
Those 20 years of eating directly serve our primary evolutionary goal: the continued existence of human beings.
Most of us also have the goal of enjoying our time here. Food also contributes towards that.
I like to keep to the same routine when possible. Birthdays and holidays interrupt that. No good. I can't do much about holidays, but since my birthday is supposed to be my day, I can demand this from everyone around me.
They never claimed that it was the whole thing. Only that it was part of it.
Our local Costco has these. They're the previous day's chicken and are sold at a discount.
How is this untrue? Generative pre-training is literally training the model to predict what might come next in a given text.
We have the term AGI because we sometimes want to communicate something more specific, and AI is too broad a term.
Turns out Benjamin Franklin had it right, and it was this time traveler who caused him to flip it in the wrong direction.
Academic Authors: $0
FAKE NEWS
This should be in the negatives. We have to pay to get papers published in these traditional journals.
I searched through Lemmy posts with that word. Half of them have people asking the exact same question, and based on the answers, I'm going to conclude that no one knows.
One guess that seems plausible is that it's an AI hallucinated word that's showing up a lot because they're using AI to generate the captions.

You know what else takes far less energy than training a single model? One query. Yet you argue that queries are the main contributor to energy consumption. Why is that? Because their volume is so high that the total adds up. At the end of the day, it's that total energy consumption that matters, not the cost of doing something once. Apply the same logic to training: look at the total energy spent training all models, not just the cost of training one.
We're talking about AI here because that's the topic of this thread. I've never seen anyone say that it's the only problem worth addressing. Plus, if you want to compare the energy usage of ads (or anything else) to that of AI, you would first need to know how much energy AI is actually using.