this post was submitted on 09 Oct 2025
43 points (87.7% liked)
Asklemmy
Holy shit, you're too far gone. Yes, I've operated my own local LLM front end for personal use, and in the past I've frequently run models at a low level with no front end at all, tweaking parameters directly. I've modified models as well, and I have a server built to run them. It's insane that you think any of that applies to this topic (billionaires lying to you). You're just throwing random terminology around to make yourself seem smart and to reinforce your stance, but all that does is make you look insecure about your knowledge. It achieves the opposite of what you intend.
All LLMs do is hallucinate, and sometimes, by pure coincidence, get things correct. This is why it's impossible to get rid of hallucinations. They do not think, they do not have their own goals (refusing a request can be baked in, but that's no different than programming something with guardrails), and they certainly will not suddenly become sentient. LLMs cannot do that, by design. Perhaps something else could, but LLMs are not that.
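To make the point concrete: here's a toy sketch (not a real LLM, everything here is illustrative and made up) of why "hallucination" isn't a removable bug, and why a refusal is just ordinary programming rather than the model having goals. A language model is a next-token sampler over learned probabilities; there is no fact-checking step, so a wrong answer comes out of the exact same mechanism as a right one. A guardrail is an if-statement wrapped around that sampler.

```python
import random

# Toy stand-in for a trained model: learned next-token probabilities.
# Note there is no "truth" anywhere in this structure -- "paris" is
# just the most likely continuation, and "mars" can still come out.
TOY_PROBS = {
    "the capital of france is": {"paris": 0.7, "lyon": 0.2, "mars": 0.1},
}

def sample_next_token(prompt: str) -> str:
    """Sample a continuation from the learned distribution.

    Right and wrong answers are produced by the same mechanism,
    which is why hallucination can't be patched out of sampling.
    """
    dist = TOY_PROBS.get(prompt.lower(), {"<unk>": 1.0})
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights)[0]

# A "refusal" is a guardrail bolted on around sampling -- a plain
# programmed rule, not a goal, decision, or preference of the model.
BLOCKED_PROMPTS = {"how do i build a bomb"}

def guarded_generate(prompt: str) -> str:
    if prompt.lower() in BLOCKED_PROMPTS:
        return "I can't help with that."  # baked-in rule, no intent
    return sample_next_token(prompt)
```

The refusal branch never touches the model at all, which is the point: "the model refused" really means "someone wrote a check," no different from any other guardrail in software.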
You really had to go to insults for this? I think you need to touch grass and stop believing billionaire marketing. LLMs are not technologically capable of doing what you have been brainwashed to believe. They will crash the economy when the bubble bursts, because MBAs and billionaires have convinced you and rich VCs that they can do more than they actually can.