submitted 1 day ago* (last edited 1 day ago) by wolfyvegan@slrpnk.net to c/technology@beehaw.org

Archived copies of the article:

[-] theangriestbird@beehaw.org 8 points 23 hours ago* (last edited 22 hours ago)

Big article, but a great read! Some key excerpts:

This isn’t simply the norm of a digital world. It’s unique to AI, and a marked departure from Big Tech’s electricity appetite in the recent past. From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers. Given the direction AI is headed—more personalized, able to reason and solve complex problems on our behalf, and everywhere we look—it’s likely that our AI footprint today is the smallest it will ever be. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households.

Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise. Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram. You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.

One can do some very rough math to estimate the energy impact. In February the AI research firm Epoch AI published an estimate of how much energy is used for a single ChatGPT query—an estimate that, as discussed, makes lots of assumptions that can’t be verified. Still, they calculated about 0.3 watt-hours, or 1,080 joules, per message. This falls in between our estimates for the smallest and largest Meta Llama models (and experts we consulted say that if anything, the real number is likely higher, not lower).

One billion of these every day for a year would mean over 109 gigawatt-hours of electricity, enough to power 10,400 US homes for a year. If we add images and imagine that generating each one requires as much energy as it does with our high-quality image models, it’d mean an additional 35 gigawatt-hours, enough to power another 3,300 homes for a year. This is on top of the energy demands of OpenAI’s other products, like video generators, and that for all the other AI companies and startups.
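The arithmetic in that excerpt is easy to sanity-check. A minimal sketch, taking the 0.3 Wh/query figure and the one-billion-queries-a-day scenario from the excerpt; the ~10,500 kWh average annual US household consumption is my own assumption, not from the article:

```python
# Back-of-envelope check of the billion-queries-a-day arithmetic.
WH_PER_QUERY = 0.3              # Epoch AI's per-message estimate, from the excerpt
QUERIES_PER_DAY = 1e9           # scenario from the excerpt
HOME_KWH_PER_YEAR = 10_500      # assumption: rough US-average annual household use

gwh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e9   # Wh -> GWh
homes_powered = gwh_per_year * 1e6 / HOME_KWH_PER_YEAR      # GWh -> kWh, per home

print(f"{gwh_per_year:.1f} GWh/year, enough for ~{homes_powered:,.0f} homes")
```

This lands at about 109.5 GWh/year and roughly 10,400 homes, consistent with the figures quoted above.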

But here’s the problem: These estimates don’t capture the near future of how we’ll use AI. In that future, we won’t simply ping AI models with a question or two throughout the day, or have them generate a photo. Instead, leading labs are racing us toward a world where AI “agents” perform tasks for us without our supervising their every move. We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode. We will give complex tasks to so-called “reasoning models” that work through tasks logically but have been found to require 43 times more energy for simple problems, or “deep research” models that spend hours creating reports for us. We will have AI models that are “personalized” by training on our data and preferences.

By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth.
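The 2028 projection can be checked the same way. A rough sketch using the upper end of the quoted 165–326 TWh range; the household count (~132 million) and per-home annual consumption (~10,500 kWh) are my assumptions, not figures from the article:

```python
# Sanity check of the 2028 projection against the "22% of US households" claim.
TWH_LOW, TWH_HIGH = 165, 326    # projected AI-specific use in 2028, from the excerpt
US_HOUSEHOLDS = 132e6           # assumption: ~132 million US households
HOME_KWH_PER_YEAR = 10_500      # assumption: rough US-average annual household use

homes_powered = TWH_HIGH * 1e9 / HOME_KWH_PER_YEAR   # TWh -> kWh, then per home
share = homes_powered / US_HOUSEHOLDS
print(f"upper estimate could power ~{share:.0%} of US households")
```

With these assumptions the upper estimate covers roughly a quarter of US households, in the same ballpark as the 22% figure quoted.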

[-] HarkMahlberg@kbin.earth 9 points 22 hours ago

So climate change collapse in the next 10 years, rather than the next 30. Great, thanks AI. I hope you all have fun reasoning your way around keeping the power plants and data centers running through Category 7 hurricanes and wildfires the size of Pennsylvania, without any humans to clean up the damage.

[-] pezhore@infosec.pub 5 points 19 hours ago

No, see, if we just give it all the energy, burn our skies and boil our oceans to make AI better, then it'll for sure tell us how to unfuck everything.

/S

this post was submitted on 20 May 2025
39 points (100.0% liked)
