this post was submitted on 06 Oct 2023
2947 points (98.2% liked)
Piracy: κ±α΄ΙͺΚ α΄Κα΄ ΚΙͺΙ’Κ κ±α΄α΄κ±
What if humans are also just LLMs when they start talking
Incorrect: humans have an understanding of the words they use, while LLMs use statistical models to guess which word comes next.
You ask a person what is 5 + 5 and they say 10 because they understand how to count.
You ask an LLM what 5 + 5 is and it gives you an answer based on the statistical likelihood of that being the next word in line, depending on its dataset. If your dataset has wrong answers, you'll get wrong answers.
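The "statistical likelihood" point can be sketched with a toy next-token counter. This is a crude stand-in for a real LLM (which uses a neural network, not raw counts), and all the names here are my own, but it shows how the most likely continuation depends entirely on the training data:

```python
from collections import Counter, defaultdict

def train(corpus):
    """Count which token follows each prefix: a minimal 'statistical model'."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.split()
        for i in range(len(tokens) - 1):
            prefix = tuple(tokens[:i + 1])
            counts[prefix][tokens[i + 1]] += 1
    return counts

def predict_next(counts, prompt):
    """Return the most frequent next token; no arithmetic is ever performed."""
    return counts[tuple(prompt.split())].most_common(1)[0][0]

# A mostly-correct dataset vs. a mostly-wrong one.
good = ["5 + 5 = 10"] * 9 + ["5 + 5 = 11"]
bad = ["5 + 5 = 11"] * 9 + ["5 + 5 = 10"]

print(predict_next(train(good), "5 + 5 ="))  # prints "10"
print(predict_next(train(bad), "5 + 5 ="))   # prints "11" — garbage in, garbage out
```

The model never "knows" that 5 + 5 is 10; it only knows that "10" usually followed "5 + 5 =" in its training data.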
I appreciate this, as I have been saying the same thing. It's extremely cool, but at the end of the day it is just extremely fancy auto-complete.
It's a bit like saying a human being is a fancy worm. Technically true, since we evolved from worms, but we are still pretty special compared to worms.
We use LLM-like features throughout our lives, often without realizing it. You speak your language fluently not because you know all the grammar rules logically, but because you feel whether something is correct or not, and that comes through training, like LLMs do.
Mine was a comment to say that LLMs are not just fancy auto-complete. Although that is technically an evolution of it, it is a bit like saying humans are fancy worms because we evolved from worms.
Ah I see π I seem to have misunderstood that a bit
Exactly like children who start learning to talk
Have you ever asked a kid who is just starting to talk (1.5 - 3 years old) what 5 + 5 is? They will tell you something that sounds like a number, whichever seems most fitting to the kid, not by logical thinking but by imitating other human beings, exactly as LLMs do. Just far more efficiently, since humans tend to need much less training data before something reasonable comes out of their mouths. Logical thinking, like understanding math, comes much later, around age 5. Source: my son.
Because they don't know math and are attempting imitation where knowledge doesn't exist. The LLM has knowledge and a statistical model. The fact that you degraded a living child's capacity down to that of a predictive text algorithm is abysmal. That child is already learning truth and objectivity and love and hope and so many things that are intangible and out of reach of an LLM.
I reduced it to the talking part of human development. Of course there are far more mechanisms involved than the way LLMs work in order to thoroughly master talking (as we see in the results of today's LLMs). But what I wanted to say is that I'm pretty sure that in our subconscious we use a system very similar to LLMs, especially for talking. A sign of that, in my opinion, is that people tend to acquire the regional tongue if they stay in a region for long enough. But by no means am I an expert; this is just how this whole LLM thing feels to me.