submitted 8 months ago by silence7@slrpnk.net to c/climate@slrpnk.net
[-] Zaktor@sopuli.xyz 19 points 8 months ago* (last edited 8 months ago)

de Vries, who now works for the Netherlands’ central bank, estimated that if Google were to integrate generative A.I. into every search, its electricity use would rise to something like twenty-nine billion kilowatt-hours per year. This is more than is consumed by many countries, including Kenya, Guatemala, and Croatia.

Why on earth would they do that? Just cache the common questions.

It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

Ok, so the actual real-world estimate is somewhere on the order of half a million kilowatt-hours a day, for the entire globe. Even if we assume that's all US usage, there are 125M US households, so that's 4 watt-hours per household per day. An LED lightbulb consumes 8 watts. Turn one of those off for half an hour and you've balanced out one household's worth of ChatGPT energy use.

This feels very much like the "turn off your lights to do your part for climate change" distraction from industry and air travel. They've mixed and matched units in their comparisons to make it seem like this is a massive amount of electricity, but it's basically irrelevant. Even the big AI-in-every-search number only works out to 0.6 kWh/day (again, if all search were done only by Americans), which isn't great, but is still on the order of not spending hours watching a big-screen TV or playing on a gaming computer, and compares to the 29 kWh already being used.

Math, because this result is so irrelevant it feels like I've done something wrong:

  • 500,000 kWh/day ÷ 125,000,000 US households = 0.004 kWh/household/day
  • 29,000,000,000 kWh/yr ÷ 365 days/yr ÷ 125,000,000 households = 0.6 kWh/household/day, compared to the 29 kWh baseline
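The two bullet points above can be checked directly; this just re-runs the arithmetic with the figures quoted from the article:

```python
# Re-running the per-household arithmetic from the comment above.
chatgpt_kwh_per_day = 500_000            # ChatGPT estimate quoted in the article
google_ai_kwh_per_year = 29_000_000_000  # de Vries's AI-in-every-search estimate
us_households = 125_000_000

per_household = chatgpt_kwh_per_day / us_households
print(per_household)                     # 0.004 kWh/household/day (4 Wh)

per_household_google = google_ai_kwh_per_year / 365 / us_households
print(round(per_household_google, 2))    # ~0.64 kWh/household/day
```

Both numbers match the bullets: 4 Wh/day for ChatGPT as it exists, ~0.6 kWh/day for the hypothetical AI-in-every-search case.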
[-] kibiz0r@midwest.social 2 points 8 months ago

Just cache the common questions.

There are only two hard things in Computer Science: cache invalidation and naming things.

[-] boonhet@lemm.ee 4 points 8 months ago

You mean: two hard things - cache invalidation, naming things and off-by-one errors

[-] kibiz0r@midwest.social 3 points 8 months ago

Reminds me of the two hard things in distributed systems:

  • 2: Exactly-once delivery
  • 1: Guaranteed order
  • 2: Exactly-once delivery
[-] Zaktor@sopuli.xyz 3 points 8 months ago

It's a good thing that Google has a massive pre-existing business about caching and updating search responses then. The naming things side of their business could probably use some more work though.

[-] blanketswithsmallpox@lemmy.world 1 points 8 months ago

I'm glad someone was on the same track as me. I posted numbers as well if you want to take a peek at mine below.

[-] frezik@midwest.social 1 points 8 months ago

Just cache the common questions.

AI models work in a feedback loop. The fact that you're asking the question becomes part of the response next time. They could cache it, but the model is worse off for it.

Also, they are Google/Microsoft/OpenAI. They will do it because they can and nobody is stopping them.

[-] Zaktor@sopuli.xyz 1 points 8 months ago

This is AI for search, not AI as a chatbot. And in the search context many requests are functionally similar and can have the same response. You can extract a theme to create contextual breadcrumbs that will be effectively the same as other people doing similar things. People looking for Thai food in Los Angeles will generally follow similar patterns and need similar responses, even if it comes in the form of several successive searches framed as sentences with different word ordering and choices.

And none of this is updating the model (at least not in a real-time sense that would require re-running a cached search), it's all short-term context fed in as additional inputs.
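A minimal sketch of the caching idea described above: normalize similar queries to one key so they share a cached response. All names here are hypothetical; a production system would use semantic embeddings and a TTL-based invalidation policy rather than exact token matching, but the principle is the same:

```python
# Hypothetical sketch: collapse near-duplicate searches onto one cache key
# so the expensive generation step runs once per theme, not once per phrasing.
STOPWORDS = {"the", "a", "an", "in", "for", "best", "good"}

def cache_key(query: str) -> str:
    # Lowercase, drop filler words, sort tokens: word order and small
    # wording differences no longer produce distinct keys.
    tokens = [t for t in query.lower().split() if t not in STOPWORDS]
    return " ".join(sorted(tokens))

cache: dict[str, str] = {}

def answer(query: str, generate) -> str:
    key = cache_key(query)
    if key not in cache:          # pay the inference cost only on a miss
        cache[key] = generate(query)
    return cache[key]
```

With this, "Thai food Los Angeles" and "best Thai food in Los Angeles" hit the same cache entry, which is the Thai-food-in-LA example from the comment.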

[-] mindbleach@sh.itjust.works 18 points 8 months ago

'How dare technology keep doing stuff?' is a deeply weird criticism.

This isn't like crypto bullshit, where finance-bro jackasses did databases in the least efficient possible way. We're pushing the boundaries of results-driven artificial intelligence, modeled on how biological brains work. Is it miraculous? Not exactly. But it's answering a lot of questions that were exciting forty-odd years ago and suddenly exploded into relevance due to parallel computing... intended for video games.

Bemoaning the last year-ish of outright witchcraft, based on the up-front costs of training models that will run on a phone, is a perspective that seems more performative than plausible.

[-] stabby_cicada@slrpnk.net 7 points 8 months ago* (last edited 8 months ago)

I deeply dislike the line of argument that goes "we shouldn't bother reducing our personal energy consumption because 100 corporations produce 70% of greenhouse gases" or similar arguments. Of course we should. Because it's the right thing to do.

But it's also true: those 100 corporations and their ilk absolutely promote a false narrative that personal responsibility is the solution to climate change, in order to prevent climate regulation that might harm their bottom line.

And frankly, I think that's what's going on here with panic over AI power consumption. Corporate lobbyists and PR creating yet another distraction to slow the course of climate regulation and guilting ordinary people for doing ordinary things in the process.

[-] skuzz@discuss.tchncs.de 2 points 8 months ago

Personal responsibility has always been capitalism’s mechanism for normalizing corpo behavior. The fake Native American anti-litter commercial in the 70s, banning home cleaners that businesses can still use at industrial scale, buying a new electric car every five minutes somehow being carbon-better than just not being a vehicle consumer... there are examples going even further back in time, but my brain doesn’t currently have enough caffeine to dig further back.

[-] mindbleach@sh.itjust.works 1 points 8 months ago

It's atomization of responsibility.

[-] mindbleach@sh.itjust.works 1 points 8 months ago

Considering one crosspost for this is the sneer-club hypocrites at awful.systems, there's also the interplay of bad-faith criticism and bad-faith excuses, for their own sake. Individual randos have picked an allegiance and will now engage in kneejerk loyalist ad-hoc justification, because they think that's how things work. Going from arguments to conclusions would be ridiculous, to them. Their claims are not intended to be evaluated.

Myriad douchebags have jumped from crypto to AI as the next buzzword cult that might make them hideously rich. People rightly condemning them also tend to jab at whatever bullshit they're pushing, now. Some of that's going to be legitimate perspective on an over-hyped autocomplete, powered by copying every book in the library, using racks of video cards running full-tilt 24/7. Some of that's going to be inane performative mockery of creative tools that border on magical and should scale down to translate speech right in your earbuds. All I can guarantee for you is that the aforementioned douchebags will not know the difference. Neither will their loudest critics. They'll both say 'you just don't understand!' because they're just shuffling cards and this deck is not deep.

[-] Immersive_Matthew@sh.itjust.works 12 points 8 months ago

The issue is how the electricity is generated not that it is needed in the first place. Such a great distraction from the real issue that it has got to be big oil that is spinning the story this way.

Let’s all hate on AI and crypto because they are ruining the entire environment and if we just stopped them, all would be fine with the planet again /s.

[-] blanketswithsmallpox@lemmy.world 8 points 8 months ago* (last edited 8 months ago)

Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

So 500 megawatt-hours a day, across the globe? This is all just data center use? Not even 1/10th the draw of the newest and largest data centers... out of ~11,000 total data centers.

Existing markets are already struggling to meet demand, the report says. In Northern Virginia, the largest data center market in the world at 3,400MW, availability is running at just 0.2 percent.

https://www.datacenterdynamics.com/en/news/us-data-center-power-consumption/

So a drop in the bucket for a crazy useful tool using mostly existing infrastructure...

The finding that global data centers likely consumed around 205 terawatt-hours (TWh) in 2018, or 1 percent of global electricity use, lies in stark contrast to earlier extrapolation-based estimates that showed rapidly-rising data center energy use over the past decade (Figure 2).

https://energyinnovation.org/2020/03/17/how-much-energy-do-data-centers-really-use/

The typical cost of building a solar power plant is between $0.89 and $1.01 per watt. A 1MW (megawatt) solar farm can cost you between $890,000 and $1.01 million... According to GTM Research, 1 MW solar farms require 6–8 acres to accommodate all the necessary infrastructure and space between panel rows.

https://coldwellsolar.com/commercial-solar-blog/how-much-investment-do-you-need-for-a-solar-farm/

$300 million and ~2 square miles (7 for reference) to power the entire world's AI use feels like a non-issue to me. A billionaire could literally fund the entire world's daily consumption and not dent their holdings...
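For what it's worth, that sizing can be re-run. The sun-hours, cost, and acreage figures below are my assumptions (mid-range of the numbers quoted above), not the comment's, and they come out in the same ballpark or smaller than the $300 million / ~2 square miles estimate:

```python
# Rough solar-farm sizing for the quoted 500 MWh/day of AI demand.
# Assumptions (mine): ~5 peak sun-hours/day, $1.00/W build cost,
# 7 acres per MW of capacity (within the 6-8 acre range quoted).
demand_mwh_per_day = 500
sun_hours_per_day = 5

capacity_mw = demand_mwh_per_day / sun_hours_per_day  # 100 MW nameplate
cost_usd = capacity_mw * 1_000_000 * 1.00             # $1/W = $1M per MW
acres = capacity_mw * 7

print(capacity_mw, cost_usd, acres)  # 100 MW, $100M, 700 acres (~1.1 sq mi)
```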

Computers use power... More news at 11.

[-] stabby_cicada@slrpnk.net 8 points 8 months ago

So I did a little math.

This site says a single ChatGPT query consumes 0.00396 kWh.

Assume an average LED light bulb is 10 watts, or 0.01 kWh per hour. So if I did the math right, no guarantees there, a single ChatGPT query is roughly equivalent to leaving a light bulb on for about 24 minutes.

So if you assume the average light bulb in your house is on about four hours a day, making 10 ChatGPT queries per day is the equivalent of adding one new light bulb to your house.

Which is definitely not nothing. But isn't the end of the world either.
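Re-running that arithmetic with the quoted figures (no new assumptions):

```python
# Light-bulb comparison using the per-query figure cited above.
query_kwh = 0.00396   # per-query estimate quoted in the comment
bulb_kw = 0.01        # a 10 W LED bulb draws 0.01 kWh per hour

minutes = query_kwh / bulb_kw * 60
print(round(minutes, 1))     # ~23.8 minutes of bulb time per query

daily_kwh = 10 * query_kwh   # ten queries a day
bulb_hours = daily_kwh / bulb_kw
print(round(bulb_hours, 1))  # ~4.0 bulb-hours per day
```

So ten queries a day works out to roughly one extra bulb running on a typical daily schedule, matching the comment's conclusion.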

[-] AliasAKA@lemmy.world 5 points 8 months ago* (last edited 8 months ago)

It’s also the required energy to train the model. Inference is usually more efficient (sometimes not, but almost always significantly more so), because you have no error backpropagation or other training-specific calculations.

Models probably take around 1,000 megawatt-hours of energy to train (GPT-3 took 284 MWh by OpenAI’s calculation). That’s not including the web scraping, data cleaning, and other associated costs (such as cooling the server farms, which is non-trivial).

A coal plant burns roughly 364 kg - 500 kg of coal to generate 1 MWh. So for GPT-3 you’d be looking at 103,376 kg (~228 thousand pounds, or ~114 US tons) at minimum to train it. That’s before anybody has used it, and we’re not looking at the other associated energy costs at this point. For comparison, a typical home may use 6 MWh per year. So just training GPT-3 could’ve powered 47 homes for an entire year.

Edit: also, it’s not nearly as bad as crypto mining. And as another person says it’s totally moot if we have clean sources of energy to fill the need and the grid can handle it. Unfortunately we have neither right now.
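The coal arithmetic above reproduces directly from the quoted numbers (taking the 284 figure as MWh, which is what the kilogram total implies):

```python
# Reproducing the coal math: 284 MWh of training energy at the low-end
# rate of 364 kg of coal per MWh, as quoted in the comment.
training_mwh = 284
kg_coal_per_mwh = 364

coal_kg = training_mwh * kg_coal_per_mwh
print(coal_kg)                    # 103,376 kg, matching the comment

print(round(coal_kg * 2.20462))   # ~227,905 lb (~114 US tons)

homes_powered = training_mwh / 6  # at ~6 MWh per home per year
print(round(homes_powered, 1))    # ~47.3 homes for a year
```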

[-] sushibowl@feddit.nl 2 points 8 months ago

If you amortize training costs over all inference uses, I don't think 1,000 MWh is too crazy. For a model like GPT-3 there are likely millions of inference calls to split that cost between.

[-] AliasAKA@lemmy.world 1 points 8 months ago

Sure, and I think that these may even be useful and warrant the cost. But that’s just to say this still isn’t simply running a couple of light bulbs. This is a major draw on the grid (though it likely still pales in comparison to crypto farms).

Note that most people would be better off using a model that’s trained for a specific task. For example, training image recognition uses vastly less energy because the models are vastly smaller, but they’re exceedingly excellent at image recognition.

[-] Zaktor@sopuli.xyz 1 points 8 months ago

The article claims 200M ChatGPT requests per day. Assuming they make a new version yearly, that's 73B requests per training run. Spreading 1,000 MWh across 73B requests yields an amortized cost of about 0.014 watt-hours per request. It's nothing.

47 more households' worth of electricity just isn't a major draw on anything. We add ~500,000 households a year from natural growth.
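The amortization works out as follows with the figures quoted above (taking the 1,000 figure as MWh):

```python
# Amortizing the quoted training cost over a year of inference requests.
requests_per_day = 200_000_000
requests_per_year = requests_per_day * 365  # 73 billion requests

training_wh = 1_000 * 1_000_000             # 1,000 MWh in watt-hours

wh_per_request = training_wh / requests_per_year
print(round(wh_per_request, 3))             # ~0.014 Wh per request
```

At roughly a hundredth of a watt-hour per request, the amortized training cost is dwarfed by the per-query inference cost discussed elsewhere in the thread.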

[-] Zink@programming.dev 2 points 8 months ago

I have a feeling it’s not going to be the ordinary individual user that’s going to drive the usage to problematic levels.

If a company can make money off of it, consuming a ridiculous amount of energy to do it is just another cost on the P & L.

(Assuming of course that the company using it either pays the electric bill, or pays a marked-up fee to some AI/cloud provider)

I mean, energy usage has been skyrocketing since 1850.

Why is that article so surprised by that?

[-] Daxtron2@startrek.website 0 points 8 months ago

The bigger companies focus on huge and ever-increasing model sizes instead. Lots of advances are being made with smaller, more affordable models that can run on consumer devices, but the big companies don't focus on those since they can't generate as much profit.

[-] sonori@beehaw.org 4 points 8 months ago

The problem is that all of the current discussion and hype is about ChatGPT and similar whole-internet models. They are not as useful as more specialized small models, but those are also not as easy to hype.

[-] sacredmelon@slrpnk.net 0 points 8 months ago

This is concerning. Why don't they just stop the never-ending updates and stick with the latest things we have for a moment? Isn't all the tech stuff we have sufficient for the world to keep going?

[-] ColonelPanic@lemm.ee 2 points 8 months ago
[-] sacredmelon@slrpnk.net 1 points 8 months ago

I got you up buddy, don't worry

this post was submitted on 10 Mar 2024
276 points (92.9% liked)
