GPT-4 Understands (danangell.com)
submitted 1 year ago by narwhal@lemmy.ml to c/technology@lemmy.ml
nickwitha_k@lemmy.sdf.org 1 point 1 year ago

> Sorry, but this is simply incorrect. Do you know what Eliza is and how it works? It is categorically different from LLMs.

I did not mean to come across as stating that they were the same, nor that the results produced would be as good. Merely that a PDF could be run through OCR and processed into a script for ELIZA, which could then produce some response to requests for a summary (e.g. returning the abstract).
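
To make that concrete, here is a toy sketch of the kind of thing I have in mind. It is purely illustrative: it assumes the OCR step has already produced plain text, and the extract_abstract heuristic and the sample input are invented for the example. No comprehension is involved; keywords are matched and canned text is echoed back.

```python
# Toy ELIZA-style responder - purely illustrative, not a real ELIZA script.
# Assumes the PDF has already been OCR'd into plain text.
import re

def extract_abstract(paper_text: str) -> str:
    # Naive, invented heuristic: take everything between "Abstract"
    # and "Introduction".
    match = re.search(r"abstract\s*(.*?)\s*introduction", paper_text,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else "No abstract found."

# Keyword pattern -> response template, in the spirit of an ELIZA script.
RULES = [
    (re.compile(r"summar|abstract|overview", re.IGNORECASE),
     "Here is the abstract:\n{abstract}"),
    (re.compile(r"conclusion", re.IGNORECASE),
     "I can only echo text I was given; try asking for a summary."),
]

def respond(user_input: str, paper_text: str) -> str:
    # Return the first canned response whose keyword pattern matches.
    for pattern, template in RULES:
        if pattern.search(user_input):
            return template.format(abstract=extract_abstract(paper_text))
    return "Please tell me more about what you want from the paper."

print(respond("Can you summarize this paper?",
              "A Paper\nAbstract We do X, Y and Z.\nIntroduction ..."))
```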

My point is that these technologies, though fundamentally different and at very different levels of technological sophistication, can both, at a high level, accomplish the task. The quality of the result and the capabilities beyond the surface level are very different, but both would be able to produce a summary, working within their architectural constraints.

Looking at it this way also gives a good basis for comparing LLMs to intelligence. Both, at a high level, can accomplish many of the same tasks, but context matters in more than a syntactic sense, and LLMs lack the capability of understanding and comprehension of the data that they are processing.

> This is also incorrect.

That paper is solely phenomenological, and it states that it is not using an accepted definition of intelligence. On the former point, there's a significant risk of fallacy in such observation, as it is based upon subjective observation of behavior, not empirical analysis of why the behavior is occurring. For example, leatherette may approximate the appearance and texture of leather, but, when examined, it differs fundamentally at both the macroscopic and microscopic level, making it objectively incorrect to call it "leather".

> I think the issue that many people have is that they hear "AI" and think "superintelligence". What we have right now is indeed AI. It's a primitive AI and certainly no superintelligence, but it's AI nonetheless.

Here, we're really getting into semantics. As the authors of that paper noted, they are not using a definition that is widely accepted academically. They do, however, have a good point about some of the definitions being far too anthropocentric (e.g. "being able to do anything that a human can do" - really, that's a shit definition). I would certainly agree with the term "primitive AI" if used akin to programming primitives (int, char, float, etc.), as it is clear that LLMs may be useful components in building actual general intelligence.

BitSound@lemmy.world 2 points 1 year ago

> processed into a script for ELIZA

That wouldn't accomplish anything. I don't know why the OP brought it up, and that subject should just get dropped. Also, yes, you can use your intelligence to string together multiple tools to accomplish a particular task. Or you can use the intelligence of GPT-4 to accomplish the same task, without any other tools.
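
Roughly what I mean, as a sketch rather than anything definitive: hand the extracted text straight to GPT-4 and ask for the summary, with no hand-written rules. This assumes the openai Python package (v1+) is installed, an OPENAI_API_KEY is set in the environment, and the sample input text is a stand-in.

```python
# Minimal sketch: ask GPT-4 to summarize the extracted paper text directly.
# Assumes the openai package (v1+) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def summarize(paper_text: str) -> str:
    # Single chat completion call; the model does the "understanding" work.
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Summarize the following paper in a short paragraph."},
            {"role": "user", "content": paper_text},
        ],
    )
    return response.choices[0].message.content

print(summarize("A Paper\nAbstract We do X, Y and Z.\nIntroduction ..."))
```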

> LLMs lack the capability of understanding and comprehension

Also not true

> states that it is not using an accepted definition of intelligence.

Nowhere does it state that. It says "There is no generally agreed upon definition of intelligence". I'm not sure why you're bringing up a physical good such as leather here. Two things: a) grab a microscope and inspect GPT-4; the comparison doesn't make sense. b) "Is" should be banned; it encourages lazy thought and pointless discussion (yes, I'm guilty of it in this comment, but it helps when you really start asking what "is" means in context). You're wandering into p-zombie territory, and my answer is that "is" means nothing. GPT-4 displays behaviors that are useful because of its intelligence, and nothing else matters from a practical standpoint.

> it is clear that LLMs may be useful components in building actual general intelligence.

You're staring the actual general intelligence in the face already; there's no need to speculate about LLMs perhaps being mere components. There's no reason right now to think that we need anything more than better compute. The actual general intelligence is still a baby, and it has experienced the world only through the tiny funnel of human text, but that will change with hardware advances. Let's see what happens with a few orders of magnitude more computing power.
