No, not really. The bubble is in the idea that AI requires large amounts of power, cooling, and processing throughput to achieve things like OpenAI's current o1 reasoning and logic models. The cycle goes like this:
The New AI Model Is Bigger --> Needs Bigger Hardware --> Bigger Hardware Needs Better Cooling --> More Cooling and Bigger Hardware Needs More Power --> More Cooling and Bigger Hardware means we can train the next Bigger Model --> Back to Start
So long as the newest AI model is "bigger" than the last one, everyone in this chain keeps making more money and seeing higher demand.
However, what Deepseek has done is put out an equivalent to the newest AI model that:
A) Required less up-front money to train,
B) Uses considerably less resources than the previous model,
C) Is released under an open-source MIT license, so anyone can host the model for their own use case.
Now the whole snake is unraveling, because all the investment that was being dumped into power, cooling, and hardware initiatives is fucked: less power and cooling is required, and older hardware can run the model.
The fact that the US model eats shit as soon as somebody figures out a way to make it work better and faster, instead of forcing a bunch of bloat and ad-infinitum upgrading, is hilarious to me.
If that's not a perfect distillation of the infinitely wasteful US economy I don't know what is.
That was a great explanation, thanks!