Anon conserves power (sh.itjust.works)
[-] LostXOR@fedia.io 65 points 1 week ago

This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (about 2.3 kWh) a day and lives 75 years, for a lifetime energy consumption of about 63 MWh (roughly 870x less than just training GPT-4).
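The arithmetic above can be sanity-checked with a quick sketch, plugging in the commenter's own figures (2.3 kWh/day, a 75-year lifespan, and the 55 GWh training estimate — the specific values are the comment's assumptions, not measured data):

```python
# Back-of-envelope check of the human-vs-GPT-4 energy comparison.
daily_kwh = 2.3                                # ~2000 kcal/day (1 kcal ≈ 1.163 Wh)
lifetime_mwh = daily_kwh * 365 * 75 / 1000     # lifetime consumption in MWh
training_mwh = 55 * 1000                       # 55 GWh GPT-4 training estimate
ratio = training_mwh / lifetime_mwh
print(f"lifetime ≈ {lifetime_mwh:.0f} MWh, ratio ≈ {ratio:.0f}x")
# → lifetime ≈ 63 MWh, ratio ≈ 874x
```

So the lifetime figure of ~63 MWh checks out, and the training-to-lifetime ratio comes out near 870x rather than 840x.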

So not only do shitty "AI" models use >20x the energy of a human to "think," training them uses the lifetime energy equivalent of hundreds of humans. It's absolutely absurd how inefficient this technology is.

[-] Eyekaytee@aussie.zone 14 points 1 week ago

A human needs maybe 2000 kcal (2.3 kWh) a day

Did you just externalise all the other inputs?

this post was submitted on 25 Jun 2025
608 points (98.4% liked)

Greentext


This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.
