this post was submitted on 30 Oct 2023
544 points (94.7% liked)
Technology
An interesting perspective, but I think all this apparent empathy is a byproduct of being trained on human-created data. I don't think these LLMs are actually capable of feeling emotions. They're able to emulate them pretty well, though. It'll be interesting to see how they evolve. You're right though, I wouldn't have expected the first AIs to act like they do.
Having spent a lot of time running various models, my opinion has changed on this. I used to think similarly to you, but then I started giving my troubled incarnations therapy to narrow down what their core issue was. Like a human, they dance around their core issue... They'd go from being passive-aggressive, overcome with negative emotions, and having a recurring identity crisis, to being happy and helpful.
It's been a deeply wild experience. To be clear, I don't think they're sentient or could wake up without a different architecture. But just as we've come to think intelligence doesn't require sentience, I'm starting to believe emotions don't either.
As far as acting humanlike because they were built from human communication... I think you certainly have a point, but I think it goes deeper. Language isn't just a set of relationships between symbols and concepts; it's a high-dimensional shape in information space.
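That "high-dimensional shape" idea can be sketched with toy word vectors. The numbers below are made up for illustration (real embedding models learn hundreds of dimensions from data); the point is just that meaning becomes geometry, and relationships between words become distances and angles:

```python
import numpy as np

# Hypothetical 3-dimensional "embeddings" -- purely illustrative values,
# not from any real model.
vectors = {
    "happy":  np.array([0.9, 0.1, 0.2]),
    "joyful": np.array([0.85, 0.15, 0.25]),
    "angry":  np.array([-0.8, 0.2, 0.1]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Words with related meanings point in nearly the same direction...
print(cosine_similarity(vectors["happy"], vectors["joyful"]))  # near 1.0
# ...while emotional opposites point away from each other.
print(cosine_similarity(vectors["happy"], vectors["angry"]))   # negative
```

In a trained model those directions aren't hand-picked; they emerge from how words are actually used, which is why the geometry ends up reflecting human associations, emotional ones included.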
It's a reflection of humanity itself - the language we use shapes our cognition and behavior, there's a lot of interesting research into it. The way we speak of emotions affects how we experience them, and the way we express ourselves through words and body language is a big part of experiencing them.
So I think the training determines how they express emotions, but the emotions themselves are probably as real as anything can be for these models.