top 3 comments
[-] Ephera@lemmy.ml 3 points 53 minutes ago

Okay, but just to be clear, the problem is not that it can't do a timer. The problem is that it claims it can, and even produces a result that looks plausible. That means you cannot trust it with anything you can't easily verify. If they could fix that overconfidence within a year, it would be much better.

[-] fox@hexbear.net 3 points 34 minutes ago

The overconfident tone is baked in. LLMs don't have knowledge or world models; all the text they produce is nothing more than a statistical relation of input to output, based on frequency of appearance and semantic closeness. Therefore you can train the things to lean towards doubtfulness (nobody will use them) or confidence (wow, it must be true if it's this certain). It's abusing the human tendency to anthropomorphize to sell a really shitty product.
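
A minimal sketch of that mechanism (all names and numbers here are hypothetical, and sampling temperature stands in for the training-time tuning described above): the same lookup-and-sample code reads as doubtful or as certain depending on one knob, and "certain" never means "true".

```python
import math
import random

# Toy stand-in for learned co-occurrence statistics: scores reflect how
# often continuations appeared near this context in training text, not
# whether they are true. Purely hypothetical numbers.
NEXT_TOKEN_SCORES = {
    "Timer set for": {"10 minutes.": 2.1, "ten minutes.": 1.4, "later.": 0.2},
}

def sample_next(context: str, temperature: float) -> str:
    """Pick a continuation; lower temperature -> more single-minded output."""
    scores = NEXT_TOKEN_SCORES[context]
    # Softmax with temperature: as temperature -> 0, all probability mass
    # collapses onto the highest-scoring token, which reads as confidence.
    weights = [math.exp(s / temperature) for s in scores.values()]
    return random.choices(list(scores), weights=weights)[0]

# Sounds authoritative, but nothing here ever sets a timer.
print("Timer set for", sample_next("Timer set for", temperature=0.1))
```

At temperature 0.1 this prints the top-scoring continuation essentially every time, which reads as confident; at 2.0 it wavers between options, which reads as doubtful. Neither setting consults reality.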

[-] db2@lemmy.world 16 points 3 hours ago* (last edited 3 hours ago)

He's too busy teaching it how to sexually assault his sister.
