Mayonnaise Rule (files.catbox.moe)
submitted 9 months ago by Gork@lemm.ee to c/196@lemmy.blahaj.zone
[-] megopie@lemmy.blahaj.zone 34 points 9 months ago

Yeah, people don't seem to get that LLMs cannot consider the meaning or logic of the answers they give. They're just assembling bits of language in patterns that are likely to come next based on their training data.

The technology of LLMs is fundamentally incapable of weighing choices or doing critical thinking. Maybe new types of models will be able to do that, but those models don't exist yet.
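The "assembling bits of language that are likely to come next" idea can be sketched as a toy next-token sampler. This is only an illustration of the sampling principle, not how any real LLM works; the bigram counts below are made up, and real models learn billions of parameters instead of a lookup table:

```python
import random

# Toy "training data": how often each word followed another.
# Entirely made-up counts, standing in for learned statistics.
bigram_counts = {
    "the": {"cat": 3, "dog": 2, "answer": 1},
    "cat": {"sat": 4, "ran": 1},
    "sat": {"down": 2, "on": 3},
}

def next_token(word, rng=random.Random(0)):
    """Pick a next word in proportion to how often it followed `word`."""
    counts = bigram_counts[word]
    words = list(counts)
    weights = [counts[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

# Generate a continuation: no meaning or logic is consulted,
# only "what tends to come next" in the counts.
text = ["the"]
while text[-1] in bigram_counts:
    text.append(next_token(text[-1]))
print(" ".join(text))
```

The point the comment makes falls out of the sketch: the sampler produces plausible-looking sequences without ever representing what any of the words mean.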

[-] CurlyMoustache@lemmy.world 13 points 9 months ago* (last edited 9 months ago)

A grown man I work with, in his 50s, tells me he asks ChatGPT stuff all the time, and I can't for the life of me figure out why. It's a copycat designed to beat the Turing test. It's not a search engine or Wikipedia; it just gambles that it can pass the Turing test after every prompt you give it.

[-] qGuevon@lemmy.world 1 points 9 months ago

It works well if you know what to use it for. Ever had something you wanted to Google, but couldn't figure out the keywords? Ever seen someone use a specific technique, which you could describe, but wouldn't be able to find unless someone on a forum had asked the same question? That's where ChatGPT shines.

Also for code it's pretty sweet

But yeah, it's not a wiki or a hard-knowledge retriever; it might help connect the dots, though.

this post was submitted on 10 Feb 2024
1034 points (100.0% liked)
