AI Computing on Pace to Consume More Energy Than India, Arm Says
(news.bloomberglaw.com)
Sounds like sensationalized bullshit. They don't give a single number or meaningful statement, and they're paywalled.
I don't disagree that they should back up their claim, but it does intuitively make sense. AI systems, GPT-style LLMs in particular, are typically designed to push the limits of what modern hardware can provide, essentially eating whatever power you throw at them.
Pair this with a huge AI boom and corporate hype cycle, and it wouldn't surprise me if it was consuming an incredible amount of power. It's reminiscent of Bitcoin, from a resource perspective.
No, it makes no sense. India has over a billion people. There's no way that much computing power could have magically poofed into existence over the past few years, nor the power plants necessary to run all of it.
If only there had been another widespread, wasteful prior use of expensive and power hungry compute equipment that suddenly became less valuable/effective and could quickly be repurposed to run LLMs...
Pretty sure the big AI corps aren't depending on obsolete, second-hand, half-burned-out Ethereum mining rigs for their AI training.