this post was submitted on 23 Apr 2026
61 points (96.9% liked)
Asklemmy
They're fucked.
Local models are already winning. A year ago, they benchmarked a year behind the biggest of the big boys. Six months ago, they were six months behind. Yesterday Qwen released 3.6 27B, and it outperforms 3.5 397B... from February.
Either we're plateauing toward the asymptotic limit of LLM capabilities, and the endgame runs as well on a toaster as it does on a server - or breakthroughs use big fat models as a glorified search space to be rapidly discarded. Both options point toward neural networks as a lump of algebra that sits on your hard drive and occasionally spins your fans. Remote computing loses, as it basically always must, and the drastically reduced requirements for competing on local software favor clever new competitors who aren't a bajillion dollars in debt.
I agree with this. I have an openclaw setup since I want to own my own data and services. A few months ago, Sonnet was the clear leader for my general-use tasks. Now Gemma 4 performs nearly as well, hosted off my gaming PC. Based on resource utilization, I think I could actually run it on the same NUC that openclaw is hosted on.
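The "could I run it on the NUC" question above mostly comes down to a memory estimate. Here's a minimal back-of-envelope sketch; the bytes-per-weight and overhead figures are rough assumptions typical of 4-bit quantized local models, not numbers from this thread:

```python
# Rough, hedged sketch: estimating RAM/VRAM needed to run a quantized
# local model. The 27B parameter count matches the Qwen release
# mentioned upthread; quantization figures are approximations.

def model_memory_gb(params_billion: float, bytes_per_weight: float,
                    overhead_frac: float = 0.15) -> float:
    """Approximate memory needed: weights plus KV-cache/runtime overhead."""
    weights_gb = params_billion * bytes_per_weight  # 1B params at 1 byte ~ 1 GB
    return weights_gb * (1 + overhead_frac)

# ~4.5 bits/weight (~0.56 bytes) is typical for mid-quality 4-bit quants;
# 2.0 bytes/weight is fp16 for comparison.
print(round(model_memory_gb(27, 0.56), 1))  # quantized 27B
print(round(model_memory_gb(27, 2.0), 1))   # same model at fp16
```

The gap between the two numbers is why a quantized 27B fits on a single consumer GPU or a fat-RAM mini PC while the fp16 version doesn't.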