lily33@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

It's not that nobody has taken the time to understand them. Researchers have been trying to "un-blackbox" neural networks pretty much since they've existed. It's just an extremely complex problem.

Logistic regression (which is essentially a neural network with just one node) is pretty well understood - but even then it can sometimes learn surprisingly unintuitive coefficients, and it can be tricky to work out why.
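
To make that concrete, here's a minimal sketch (using scikit-learn; the data setup is just an illustration I made up, not something from the original discussion) of one common way this happens: when two features are strongly correlated, the fitted coefficient on one of them can land near zero or even flip sign, even though both are positively associated with the outcome.

```python
# Illustration: correlated features can produce unintuitive logistic
# regression coefficients. Hypothetical setup, numbers chosen arbitrarily.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# x1 drives the outcome; x2 is just a noisy copy of x1, so the two
# features are almost perfectly correlated.
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)
X = np.column_stack([x1, x2])

# The outcome depends only on x1, yet both features correlate
# positively with y when looked at individually.
p = 1 / (1 + np.exp(-2 * x1))
y = rng.binomial(1, p)

model = LogisticRegression().fit(X, y)
# Because of the collinearity, the weight on x2 may come out near zero
# or even negative, which looks "wrong" if you only eyeball the
# marginal correlations.
print(model.coef_)
```

Even in this tiny, fully transparent model you have to dig into the data (here, the collinearity) to explain the learned weights; with billions of parameters that kind of post-hoc explanation stops scaling.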

With LLMs - which are enormous by comparison - understanding how they work in detail is simply not a tractable problem.
