all 14 comments
[-] ShimmeringKoi@hexbear.net 8 points 11 hours ago

Anything anyone does to this infrastructure is mass self-defense

[-] GeckoChamber@hexbear.net 29 points 17 hours ago

A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor per hour

How do all these articles keep making this same mistake? Can we not get someone who passed high school physics to write these?

Since the article does not do a reasonable comparison to explain what these numbers actually mean, the estimated average 18 Wh query is very roughly the equivalent of having a TV on for 10 minutes.
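A back-of-the-envelope check of that comparison, assuming a ~100 W television (a typical figure, not from the article):

```python
# Sanity-check "an 18 Wh query is roughly a TV on for 10 minutes".
query_energy_wh = 18   # estimated average energy per query, from the comment
tv_power_w = 100       # assumed TV power draw (not stated in the article)

hours = query_energy_wh / tv_power_w  # time = energy / power
minutes = hours * 60
print(f"{minutes:.0f} minutes")       # ~11 minutes, matching the rough "10 minutes"
```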

[-] FunkyStuff@hexbear.net 16 points 17 hours ago

A 1GW nuclear power plant generated 1GWh of energy per hour blob-no-thoughts

[-] dat_math@hexbear.net 9 points 17 hours ago

This guy remembers the fundamental theorem

[-] SorosFootSoldier@hexbear.net 19 points 18 hours ago

Granted I'm not a techie guy and can't code, but like, isn't there some way to do AI shit but smarter, more efficient, and less fucking wasteful? Is it a coding problem or is it just the nature of the beast that these things drink up entire lakes to spit out the wrong answers to math equations?

[-] queermunist@lemmy.ml 23 points 17 hours ago

The bubble is built around scale, so more bigger = more better

They're making their chatbots less efficient to please investors who just want biggatons.

[-] zongor@hexbear.net 4 points 12 hours ago* (last edited 12 hours ago)

Yes, it’s called quantization; it’s like a zip file for an LLM. You can get a model small enough to run on a Raspberry Pi (a 5 A, ~25 W power supply), and although there is some loss in “intelligence”, it is still usable for a lot of scenarios. Look up ollama or llama.cpp for details.
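A minimal sketch of the idea behind quantization: store float32 weights as int8 plus a scale factor, cutting memory roughly 4x at the cost of some rounding error. Real LLM quantizers (llama.cpp's GGUF formats, for example) use more sophisticated block-wise schemes; this toy version just illustrates the trade-off.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric 8-bit quantization: map the largest |weight| to 127."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)  # 1 byte per weight instead of 4
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

w = np.random.randn(1024).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print("memory ratio:", w.nbytes / q.nbytes)   # 4.0
print("max error:", np.abs(w - w_hat).max())  # small but nonzero: the "loss in intelligence"
```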

[-] RedWizard@hexbear.net 26 points 18 hours ago

It's a capital problem. What you're talking about is what made DeepSeek so disruptive; they made it eat fewer resources by optimizing it. Except, in America, Sam Altman also has his hands in the nuclear power industry, explicitly for powering AI datacenters. So it's all gravy: if the new model eats eight times more power, that means the market for nuclear power has more demand, which means more business down the pipe. Then consider that this model probably requires even more powerful GPUs (DeepSeek could run on less powerful ones), which is good for the GPU and APU markets. We get to throw out all the year-and-a-half-old compute modules in favor of the next compute module.

[-] FunkyStuff@hexbear.net 20 points 18 hours ago

Still worth noting that, while DeepSeek is a huge improvement over the American AI firms' models, they still don't really have a solution for scaling up how smart their models are; they can just make an equivalent model much cheaper. So it doesn't solve the problem the AI firms are trying to skirt around, which is that they can't deliver on the "today ChatGPT can count the r's in strawberry, in 3 years it will be able to build a space station" type promises when the models don't scale.

[-] BeamBrain@hexbear.net 6 points 17 hours ago

Yeah this is why Deepseek is the only GenAI I still use.

[-] FunkyStuff@hexbear.net 15 points 18 hours ago* (last edited 18 hours ago)

The way these particular machines operate means you need to make them guzzle 10x as much data, water, energy, or whatever other metric to get a meager improvement in how smart they are. Five years ago, finding ways to scale them to be thousands of times larger was pretty easy, but now they're running up against the limitations and trying to break through by burning ever more resources instead of slowing down to find a better approach.
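The "10x resources for a meager improvement" pattern is roughly what a power-law scaling curve looks like. The exponent below is made up for illustration, not fitted to any real model, but the shape is the point: each multiplicative jump in compute buys a shrinking absolute gain.

```python
# Illustrative power-law scaling: loss falls as compute grows, but slowly.
# alpha here is an assumed toy exponent, not a measured scaling-law constant.
def loss(compute: float, alpha: float = 0.05) -> float:
    return compute ** -alpha

for c in (1, 10, 100, 1000):
    print(f"compute {c:>4}x -> loss {loss(c):.3f}")
# Each 10x in compute shaves only a few percent off the loss,
# and each successive 10x shaves off less than the last.
```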

[-] mrfugu@hexbear.net 4 points 16 hours ago

Man I’m so sick of this narrative

this post was submitted on 15 Aug 2025
69 points (100.0% liked)

technology

23912 readers

On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020


founded 5 years ago