[-] trashxeos@lemmygrad.ml 41 points 3 weeks ago

This marks the latest instance of AI confabulations (also called "hallucinations") causing potential business damage. Confabulations are a type of "creative gap-filling" response where AI models invent plausible-sounding but false information. Instead of admitting uncertainty, AI models often prioritize creating plausible, confident responses, even when that means manufacturing information from scratch.

It amazes me how many different terms we've assigned to "an LLM we call artificial 'intelligence' just made some shit up because it sounded like the most logical thing to say based on probability."

[-] buh@hexbear.net 25 points 3 weeks ago

“Lies.” It’s an old-fashioned term, but a beautiful term trump-anguish

[-] peeonyou@hexbear.net 15 points 3 weeks ago

We love the word don't we folks? Very popular word. We love it.

[-] ThermonuclearEgg@hexbear.net 7 points 2 weeks ago

It's funny how "AI bros" will argue that we shouldn't use "lies" because that implies intelligence, and then simultaneously treat their LLM as intelligent

[-] sexywheat@hexbear.net 20 points 2 weeks ago

Officer I didn’t commit tax fraud, it was simply a hallucination you see.

[-] LaGG_3@hexbear.net 4 points 2 weeks ago

You joke, but the accounting industry is working on LLM tools for auditing and tax

[-] Meh@hexbear.net 14 points 3 weeks ago

Seems like it'd be easier to say the piece of shit doesn't work

[-] falgscode@hexbear.net 6 points 2 weeks ago

Instead of admitting uncertainty

tech journalists stop assigning agency to the probability-based slop generator [Challenge: IMPOSSIBLE]

this post was submitted on 19 Apr 2025
61 points (100.0% liked)

technology
