You're joking, right? "Making up answers" in the case of search results just means a dead link. If you get a good link 99% of the time and don't have to use an enshittified service, that's good enough for 99% of people. Trying again is the worst-case scenario.
Finding search terms is the one task I consistently use LLMs for. That isn't what they said, though; they said LLMs will replace traditional search, that traditional search is about to "go the way of the dinosaur". I don't trust any local LLM to accurately recall anything it has read.
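For what it's worth, here's roughly how I do that with a local model. This is only a sketch: it assumes the Ollama Python client with some model already pulled locally, and the model name and prompt are just examples.

```python
# Sketch: asking a locally served model (via Ollama) to suggest web search queries.
# Assumes `pip install ollama` and a model already pulled, e.g. `ollama pull llama3`.
import ollama

question = "how do I recover files from a corrupted ext4 partition"

response = ollama.chat(
    model="llama3",  # placeholder; any locally pulled model works
    messages=[
        {
            "role": "user",
            "content": (
                "Give me three concise web search queries for this question, "
                "one per line, with no explanations:\n" + question
            ),
        }
    ],
)

# The reply is plain text; each non-empty line is a candidate query
# to paste into an ordinary search engine.
for line in response["message"]["content"].splitlines():
    line = line.strip()
    if line:
        print(line)
```

The queries it suggests then go into a normal search engine, so a hallucinated suggestion costs nothing but a retry.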
Not to mention that once we become dependent on LLMs, which is something big tech is trying really hard to achieve right now, it won't be all that difficult for their creators to introduce biases that give us many of the same problems as search engines: product placement, political censorship, etc. There wouldn't be billions of dollars of investment if they didn't think they were going to get something out of it.
The best local LLMs are FOSS, though. If bias is introduced, it can be detected, and the user base can shift to another version, unlike with centralized cloud LLMs, which are private silos.
I also don't think LLMs of any kind will fully replace search engines, but I do think they will be one of a suite of ML tools that enable efficient local (or distributed) indexing and search of the web.
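As a rough illustration of what I mean (purely a sketch, assuming the sentence-transformers library and the all-MiniLM-L6-v2 embedding model, with made-up documents), local semantic search is basically just embeddings plus nearest-neighbour lookup:

```python
# Sketch: a tiny local semantic index. Model name and documents are examples only.
from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("all-MiniLM-L6-v2")  # small embedding model that runs locally

# Pretend these are page snippets collected by a local crawler.
docs = [
    "qBittorrent supports sequential downloading and RSS automation.",
    "Soulseek is a peer-to-peer network focused on music sharing.",
    "Liberapay is a recurrent donations platform for funding creators.",
]

# Embed once; in a real setup you would persist these vectors to disk.
doc_vecs = model.encode(docs, normalize_embeddings=True)

def search(query: str, k: int = 2):
    """Return the k most similar documents by cosine similarity."""
    q_vec = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q_vec  # cosine similarity, since vectors are normalized
    top = np.argsort(scores)[::-1][:k]
    return [(docs[i], float(scores[i])) for i in top]

print(search("music p2p client"))
```

Scale that up with a proper vector index and a local crawler and you have a search stack that never leaves your machine.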
First of all, they are not FOSS. I know that seems tangential to the discussion, but it matters because biases cannot be reliably detected without the training data. You also shouldn't trust humans to spot bias, because humans are themselves quite biased and will generally assume the LLM is behaving correctly as long as it aligns with their own biases, which can also be shifted in various ways over time.
Second, local LLMs don't have the benefit of free software, where we can modify them freely or fork them if there are problems. Sure, there's fine-tuning, but you don't get full control that way, and you need access to your own tuning dataset (rough sketch below). Really, we'd just have the option of switching products, which doesn't put us much further ahead than using the closed-off products available online.
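To make the fine-tuning point concrete, this is roughly what that workflow looks like with open weights. It's only a sketch, assuming Hugging Face transformers, peft, and datasets; the model name, data, and hyperparameters are placeholders, and you still have to bring your own tuning data:

```python
# Sketch: attaching LoRA adapters to an open-weights model with transformers + peft + datasets.
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from peft import LoraConfig, get_peft_model

base = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # example open-weights model
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# You still need your own tuning data; weights-only releases don't ship theirs.
examples = ["Example text you want the model to imitate.", "Another example."]
ds = Dataset.from_dict({"text": examples}).map(
    lambda row: tokenizer(row["text"], truncation=True, max_length=512),
    remove_columns=["text"],
)

# LoRA trains a small set of adapter matrices; the base model stays frozen,
# which is exactly why this is not the same as being able to modify the model freely.
peft_config = LoraConfig(
    r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"
)
model = get_peft_model(model, peft_config)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="out", num_train_epochs=1, per_device_train_batch_size=1
    ),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("out/lora-adapter")  # saves adapter weights only, not a new base model
```

All of that only bolts small adapters onto frozen base weights; it's nothing like being able to rebuild or audit the model from its training data, which is what real free-software forking would require.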
I'm all for adding them to the arsenal of tools, but they are deceptively difficult to use correctly, which makes it so hard for me to be excited about them. I hardly see anyone using these tools for the purposes they are actually good for, and the things they are good for are also deceptively limited.