Google AI making up recalls that didn’t happen
(lemmy.world)
Because lies require intent to deceive, which the AI cannot have.
They merely predict the most likely thing to say next, so "hallucination" is a fairly accurate description.
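A toy sketch of what "predict the most likely next word" means (made-up probabilities and a hypothetical lookup table, not any real model): the loop only ever asks "what word usually follows this?", never "is this true?".

```python
# Toy next-word predictor: the "model" is just a table of invented
# probabilities for which word tends to follow a three-word context.
next_word_probs = {
    ("the", "product", "was"): {"recalled": 0.55, "fine": 0.30, "cheap": 0.15},
    ("product", "was", "recalled"): {"nationwide": 0.6, "quietly": 0.4},
}

def predict_next(context):
    """Return the statistically most likely next word, or None if unknown."""
    candidates = next_word_probs.get(tuple(context[-3:]))
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

sentence = ["the", "product", "was"]
while (word := predict_next(sentence)) is not None:
    sentence.append(word)

print(" ".join(sentence))
# -> "the product was recalled nationwide"
# A plausible-sounding recall claim falls out of pure likelihoods;
# whether a recall actually happened never enters the computation.
```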