362 points (94.1% liked) · submitted 02 Aug 2023 by L4s@lemmy.world to c/technology@lemmy.world

Tech experts are starting to doubt that ChatGPT and A.I. ‘hallucinations’ will ever go away: ‘This isn’t fixable’

Experts are starting to doubt it, and even OpenAI CEO Sam Altman is a bit stumped.

[-] malloc@lemmy.world 21 points 1 year ago

I was excited about the recent advancements in AI, but it seems the field has hit another wall. It seems best suited to automating very simple tasks, or at most serving as a guiding tool for professionals (e.g., medicine, SWE, …)

[-] Zeth0s@lemmy.world 25 points 1 year ago

Hallucination is common for humans as well. It's just people believing they know things they really don't.

We have alternative safeguards in place. It's true, however, that the current generation of LLMs has its limitations.

[-] rambaroo@lemmy.world 2 points 1 year ago

Humans can recognize and account for their own hallucinations. LLMs can't and never will.

[-] uranos@sh.itjust.works 2 points 1 year ago

It's pretty ironic that you say they "never will" in this context.
