326 points · submitted 08 Jun 2025 (last edited 6 months ago) by Allah@lemm.ee to c/technology@lemmy.world

LOOK MAA I AM ON FRONT PAGE

(page 5) 50 comments
[-] crystalmerchant@lemmy.world 0 points 6 months ago

I mean... isn't that reasoning, then? It's what my brain does: recognizes patterns and makes split-second decisions.

[-] sp3ctr4l@lemmy.dbzer0.com 0 points 6 months ago* (last edited 6 months ago)

This has been known for years; it's the default assumption of how these models work.

You would have to prove that some kind of actual reasoning capacity has arisen as an emergent-complexity phenomenon... not the other way around.

Corpos have just marketed/gaslit us/themselves so hard that they apparently forgot this.

[-] flandish@lemmy.world 0 points 6 months ago

stochastic parrots. all of them. just upgraded “soundex” models.

this should be no surprise, of course!
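
For anyone who doesn't catch the reference: Soundex is a century-old phonetic algorithm that reduces every word to a crude four-character sound-alike code, i.e. extremely shallow pattern matching. A minimal Python sketch of the classic American Soundex rules, just to illustrate the analogy:

```python
def soundex(word: str) -> str:
    """Classic American Soundex: first letter plus three digits."""
    codes = {"b": "1", "f": "1", "p": "1", "v": "1",
             "c": "2", "g": "2", "j": "2", "k": "2",
             "q": "2", "s": "2", "x": "2", "z": "2",
             "d": "3", "t": "3", "l": "4",
             "m": "5", "n": "5", "r": "6"}
    word = word.lower()
    result = word[0].upper()
    prev = codes.get(word[0], "")  # the first letter's code still blocks duplicates
    for ch in word[1:]:
        code = codes.get(ch, "")
        if code and code != prev:
            result += code
            if len(result) == 4:   # letter + 3 digits and we're done
                break
        if ch not in "hw":         # h/w don't break a run of same-coded letters; vowels do
            prev = code
    return result.ljust(4, "0")

# "Robert" and "Rupert" both collapse to "R163": different words, same shallow pattern.
```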

[-] 1rre@discuss.tchncs.de -1 points 6 months ago

The difference between reasoning models and normal models is that reasoning models run in two steps. To oversimplify a little: they first prompt "how would you go about responding to this?" and then prompt "write the response".

It's still predicting the most likely thing to come next, but the two-step setup gives the model a chance to first write the most likely instructions for the task, and then the most likely result of following those instructions. Both steps conform far more closely to learned patterns than a single jump from prompt to response does.
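
A minimal sketch of that two-step flow in Python; `llm_complete` here is a hypothetical stand-in for any single-turn completion call, not any particular vendor's API:

```python
def llm_complete(prompt: str) -> str:
    """Placeholder: send a prompt to a language model, return its completion."""
    raise NotImplementedError("wire this up to a model of your choice")

def two_step_answer(user_prompt: str) -> str:
    # Step 1: the model predicts a plausible plan of attack.
    plan = llm_complete(
        f"How would you go about responding to this?\n\n{user_prompt}"
    )
    # Step 2: the model predicts the most likely result of following that plan.
    # Both calls are ordinary next-token prediction; the plan text simply
    # conditions the second pass.
    return llm_complete(
        f"Task: {user_prompt}\n\nPlan:\n{plan}\n\nWrite the response, following the plan."
    )
```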

[-] mfed1122@discuss.tchncs.de -1 points 6 months ago* (last edited 6 months ago)

This sort of thing has been published a lot for a while now, but why is it assumed that this isn't what human reasoning consists of? Isn't all our reasoning ultimately a form of pattern memorization? I sure feel like it is. So all these studies proving that the models are "just" memorizing patterns don't prove anything beyond that, unless they're coupled with research on the human brain showing that we do something different.

[-] count_dongulus@lemmy.world 0 points 6 months ago

Humans apply judgment because they have emotion. LLMs do not possess emotion. Mimicking emotion without ever actually being able to experience it is sociopathy. At best, an LLM would apply patterns the way a sociopath does.

[-] Endmaker@ani.social -1 points 6 months ago

You've hit the nail on the head.

Personally, I wish there were more progress in our understanding of human intelligence.

[-] LonstedBrowryBased@lemm.ee -2 points 6 months ago

Yah, of course they do. They're computers.
