this post was submitted on 05 Feb 2024
Technology
You are falling into a common trap. LLMs do not have understanding: asking one to do things like convert dates and put them on a number line may yield correct results sometimes, but because the model does not understand what it is doing, it may "hallucinate" dates that look plausible but don't actually align with the source.
Thank you for calling that out. I'm well aware, but appreciate your cautioning.
I've seen hallucinations from LLMs at home and at work (where I've literally had them transcribe dates like this). They're still absolutely worth it for their ability to handle unstructured data and the speed of iteration you get, whether they "understand" the task or not.
I know to check my (its) work when it matters, and I can add guard rails and selectively make parts of the process more robust later if need be.
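As a sketch of the kind of guard rail meant here (names and date formats are hypothetical, not from any particular tool): after an LLM extracts dates from unstructured text, a small validation step can keep only the dates that can actually be traced back to the source in some plausible rendering, dropping anything the model may have hallucinated.

```python
from datetime import datetime

def validate_extracted_dates(source_text: str, extracted: list[str]) -> list[str]:
    """Guard rail: keep only well-formed ISO dates that appear in the
    source text in at least one common rendering."""
    confirmed = []
    for iso in extracted:
        try:
            d = datetime.strptime(iso, "%Y-%m-%d")
        except ValueError:
            continue  # not even a well-formed date: drop it
        # renderings the source might plausibly have used
        renderings = {
            iso,                        # 2024-02-05
            d.strftime("%d %b %Y"),     # 05 Feb 2024
            d.strftime("%B %d, %Y"),    # February 05, 2024
            d.strftime("%m/%d/%Y"),     # 02/05/2024
        }
        if any(r in source_text for r in renderings):
            confirmed.append(iso)
    return confirmed

source = "this post was submitted on 05 Feb 2024"
# the second date has no support in the source, so it is filtered out
print(validate_extracted_dates(source, ["2024-02-05", "2024-03-05"]))
```

This doesn't catch every failure mode (a hallucinated date that happens to appear elsewhere in the text still passes), but it is the sort of cheap, selective check that can be bolted on later where correctness matters.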