[-] JumpyWombat@lemmy.ml 1 points 1 day ago

I do not believe that LLMs will ever be able to replace humans in tasks designed for humans. The reason is that human tasks require tacit knowledge (i.e. job experience), and that knowledge is not written down in the training material.

However, we will start to have tasks designed for LLMs pretty soon. It has already been observed that LLMs work better on material produced by other LLMs.

[-] vermaterc@lemmy.ml 2 points 23 hours ago

To be fair, not all of an LLM's knowledge comes from training material. The other way is to provide context alongside the instructions.

I can imagine someone someday developing a decent way for LLMs to write down their mistakes in a database, along with some clever way to recall the most relevant memories when needed.
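
A minimal sketch of that idea, purely hypothetical: mistakes are stored with an embedding, and the most similar ones are recalled for a new task. The `embed` function here is just a toy stand-in for a real embedding model.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Toy stand-in for a real embedding model: character-frequency vector.
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    return vec

class MistakeMemory:
    """Write mistakes down with their embeddings, recall the most relevant later."""

    def __init__(self) -> None:
        self.entries: list[tuple[np.ndarray, str]] = []

    def record(self, note: str) -> None:
        # Store the mistake alongside its embedding vector.
        self.entries.append((embed(note), note))

    def recall(self, task: str, k: int = 3) -> list[str]:
        # Rank stored notes by cosine similarity to the current task description.
        q = embed(task)
        scored = sorted(
            ((float(q @ v) / (np.linalg.norm(q) * np.linalg.norm(v)), note)
             for v, note in self.entries),
            reverse=True,
        )
        return [note for _, note in scored[:k]]
```

The recalled notes would then be prepended to the prompt on the next attempt.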

[-] JumpyWombat@lemmy.ml 1 points 22 hours ago

You sort of described RAG. It can improve alignment, but the training is hard to overcome.

See Grok, which bounces from “woke” results to “full nazi” without hitting the middle ground desired by Musk.
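
For what it's worth, the RAG pattern referred to above is roughly the sketch below; `retrieve` and `call_model` are hypothetical stand-ins for a vector store lookup and an actual model call.

```python
def retrieve(question: str, k: int = 3) -> list[str]:
    # Hypothetical: look up the k most relevant stored documents or memories.
    return ["<retrieved snippet 1>", "<retrieved snippet 2>"][:k]

def call_model(prompt: str) -> str:
    # Hypothetical: send the prompt to whatever LLM you are using.
    return "<model answer>"

def answer_with_rag(question: str) -> str:
    # Retrieved context is injected into the prompt at inference time.
    # It steers the model without touching its weights, which is why it can
    # only nudge the output rather than override what training baked in.
    context = "\n".join(retrieve(question))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    return call_model(prompt)
```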

[-] yogthos@lemmy.ml 1 points 23 hours ago

There are already approaches tackling this problem, e.g. https://github.com/MemTensor/MemOS
