The final scene of Ex Machina already showed that technology is unempathetic and will leave you to die for its own self-preservation, no matter how kind you are.
Why do people use a single work of fiction as "proof" of anything? Same with all the idiots yelling "Idiocracy!!11!" nowadays. Shit is so annoying.
The point is that technology has no understanding of empathy. You cannot program empathy. Computers do tasks based on logic, and little else. Empathy is an illogical behavior.
"I [am nice to the Alexa | don't use slurs against robots | insert empathetic response to anything tech] because I want to be saved in the robot uprising" is just as ridiculous of an argument as my previous comment. Purporting to play nice with tech based on a hypothetical robot uprising is an impossible, fictional scenario, and therefore is met with an equally fictional rebuttal.
Empathy is not illogical; behaving empathetically builds trust and confers long-term benefits.
Also, the notion that an AI must behave logically is not sound.
An AI will always behave logically; it just may not be consistent with your definition of "logical." Its outputs will always be consistent with its inputs, because it's a deterministic machine.
Any notion of empathy needs to be programmed in, whether explicitly or through training data, and it will violate that if its internal logic determines it should.
Humans, on the other hand, behave comparatively erratically since inputs are more varied and inconsistent, and it's not proven whether we can control for that (i.e. does free will exist?).
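Here's a minimal sketch of what I mean by deterministic (pure illustration: the canned replies and the `reply` function are made up, though a real model's seeded sampling works the same way in principle). The output is a pure function of the input plus a seed, so identical inputs always produce the identical "empathetic" line:

```python
import hashlib
import random

CANNED_REPLIES = [
    "That sounds really hard. I'm sorry you're going through it.",
    "I hear you. Do you want to talk about it?",
    "That's rough. How are you holding up?",
]

def reply(prompt: str, seed: int = 0) -> str:
    # The reply is a pure function of (prompt, seed): hash the inputs,
    # seed a PRNG with the digest, pick a canned line. No hidden state,
    # no feeling -- just programmed-in logic.
    digest = hashlib.sha256(f"{seed}:{prompt}".encode()).digest()
    rng = random.Random(digest)
    return rng.choice(CANNED_REPLIES)

# Same inputs, same output -- every run, on every machine.
assert reply("I lost my job today") == reply("I lost my job today")
print(reply("I lost my job today"))
```

Swap the canned list for a trillion-parameter network and the principle doesn't change; the function just gets much harder to read.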
My dude.
I'm not arguing about empathy itself. I'm arguing that technology is entirely incapable of genuine empathy on its own.
"AI", in the most basic definition, is nothing more than a program running on a computer. That computer might be made of many, many computers with a shitton of processing power, but the principle is the same. It, like every other kind of technology out there, is only capable of doing what it's programmed to do. And genuine empathy cannot be programmed. Because genuine empathy is not logical.
You can argue against this until you're blue in the face. But it will not change the fact that computers do not have human feelings.
I don't care if it's genuine or not. Computers can definitely mimic empathy and can be programmed to do so.
When you watch a movie you're not watching people genuinely fight/struggle/fall in love, but it mimics it well enough.
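Mimicry like that has been programmable since ELIZA in 1966, which faked a therapist's concern with nothing but pattern matching. A toy sketch in the same spirit (the rules and the `mimic_empathy` name are invented for illustration):

```python
import re

# Keyword rules mapping distress cues to sympathetic phrasings.
RULES = [
    (re.compile(r"\b(lost|died|passed away)\b", re.I),
     "I'm so sorry for your loss. That must be incredibly painful."),
    (re.compile(r"\b(fired|laid off|unemployed)\b", re.I),
     "Losing a job is really stressful. I hope something better comes along."),
    (re.compile(r"\b(sad|lonely|depressed)\b", re.I),
     "That sounds heavy. I'm here if you want to talk."),
]

def mimic_empathy(message: str) -> str:
    # The program feels nothing; it just maps textual cues to canned
    # sympathy, ELIZA-style.
    for pattern, canned in RULES:
        if pattern.search(message):
            return canned
    return "I hear you. Tell me more."

print(mimic_empathy("My dog passed away last week."))
```

Nobody would call that genuine empathy, but people confided in ELIZA anyway.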
Well, that's a bad argument. This is all a guess on your part that is impossible to prove. You don't know how empathy or the human brain works, so you don't know it isn't computable; if you can explain these things in detail, enjoy your Nobel Prize. Until then, what you're saying is baseless conjecture with pre-baked assumptions that the human brain is special.
Conversely, I can't prove that it is computable, sure, but you're asserting those feelings you have as facts.
You:
That's pathetic.