this post was submitted on 26 Jun 2024
380 points (97.3% liked)
Technology
That wouldn't address the bulk of the issue, only the most egregious examples of it.
For every funny output like "I asked for 1 ice cream, it's giving me 200 burgers", there are likely tens, hundreds, or thousands of outputs like "I asked for 1 ice cream, it's giving me 1 burger" that sound sensible but are still the same problem.
It's simply the wrong tool for the job. Using LLMs here is like hammering screws, or screwdriving nails. LLMs are a decent tool for things that you can supervise (not the case here), or where a large amount of false positives and negatives is not a big deal (not the case here either).
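To make the distinction concrete, here is a minimal sketch (the validator, menu, and limits are all hypothetical, not anything a real ordering system uses) of why automated guardrails only catch the egregious failures: a sanity check can reject "200 burgers", but "1 burger" when the customer asked for 1 ice cream passes every check, because it is a perfectly plausible order.

```python
# Hypothetical guardrail for LLM-generated orders: it can flag outputs that
# are implausible on their face, but not outputs that are plausible yet wrong.
MENU = {"ice cream", "burger", "fries"}  # assumed menu for illustration
MAX_QTY = 20                             # assumed sanity limit per item

def validate_order(items):
    """Return a list of problems; an empty list means the order passes."""
    problems = []
    for name, qty in items:
        if name not in MENU:
            problems.append(f"unknown item: {name}")
        if not (1 <= qty <= MAX_QTY):
            problems.append(f"implausible quantity: {qty} x {name}")
    return problems

# The funny, egregious failure is caught by the guardrail:
print(validate_order([("burger", 200)]))  # ['implausible quantity: 200 x burger']

# The quiet failure sails through: customer asked for 1 ice cream,
# the model output 1 burger, and no automated check can know the difference.
print(validate_order([("burger", 1)]))    # []
```

The second case is the point of the comment: without a human supervising each transcript against what the customer actually said, this class of error is invisible to the system.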