submitted 6 months ago by misk@sopuli.xyz to c/technology@lemmy.world
[-] ALostInquirer@lemm.ee 2 points 6 months ago

perception

This is the problem I have with this: there's no perception in this software. It's faulty, misapplied software when employed to generate reliable, factual summaries and responses.

[-] xthexder@l.sw0.com -1 points 6 months ago* (last edited 6 months ago)

I have adopted the philosophy that human brains might not be as special as we've thought, and that the untrained behavior emerging from LLMs and image generators is so similar to human behavior that I can't help but think of them as underdeveloped, handicapped minds.

I hypothesize that a human brain whose only perception of the world was the training data force-fed to it by a computer would have all the same problems LLMs do right now.

To put it another way: the line that separates sentient from not is getting blurrier and blurrier. LLMs surpassed the Turing test a few years ago. Today we're simulating roughly the level of intelligence of a small animal.

this post was submitted on 17 May 2024
501 points (94.8% liked)
