[-] jol@discuss.tchncs.de 2 points 1 month ago* (last edited 1 month ago)

I know, and I accept that. You can't just tell an LLM not to hallucinate. I also wouldn't trust that trust score at all. If there's one thing LLMs are even worse at than accuracy, it's maths.
