this post was submitted on 11 Aug 2025
136 points (99.3% liked)
Technology
The analogy I use is: it's like a magician pulled a coin from behind a CEO's ear, and their response was "That's incredible! Free money! Let's go into business together!"
Literally no one ever claimed it had reasoning capabilities. It is a trick to produce a string of characters that your brain can make sense of. That's all.
Altman and similar grifters absolutely were, and are, making those claims, but maybe we're excusing them as obvious liars?
They are obvious liars. Some people are just too invested to see it.
These models only have reasoning capabilities under the most obscure definitions of "reasoning". At best, all they're doing is climbing to local maxima with their so-called "reasoning" on a graph as wavy as the ocean.
I've mentioned this on other posts, but it's really sad, because LLMs have been wildly incredible for certain NLP operations. They are that, though, not AGI or whatever snake oil Altman wants to sell this week.
The CEOs you're talking about are the CEOs in the analogy.