[-] jol@discuss.tchncs.de 2 points 3 months ago* (last edited 3 months ago)

I know, and I accept that. You can't just tell an LLM not to hallucinate. I also wouldn't trust that trust score at all. If there's one thing LLMs are worse at than accuracy, it's maths.

this post was submitted on 01 Dec 2025
1298 points (99.1% liked)

Programmer Humor
