859 points · Help. (mander.xyz)
[-] ZkhqrD5o@lemmy.world 43 points 1 day ago* (last edited 1 day ago)

One thing that comes to mind is that prostitution, no matter how you spin it, is still a social job. If a problematic person like that ends up with a prostitute, there's a good chance the prostitute could talk the customer out of doing some nonsense. If not out of empathy, then for the simple fact that there would be legal consequences for not doing so.

Do you think a glorified spreadsheet that people call their husband would behave the same? I don't know if it has happened yet, but one of these days an LLM will talk someone into doing something very nasty, and then it'll be no one's fault again, certainly not the fault of whoever hosts the LLM. We really live in a boring dystopia.

Edit: Also, there's this one good movie, whose name I forget, about a person who treats one of these LLMs as a girlfriend. It has a bizarre, funny and simultaneously creepy and disturbing scene where the main character, who is in love with the LLM, hires a woman who puts a camera on her forehead to have sex with his LLM "girlfriend".

Also, my quite human husband also voices his thoughts without a prompt. Lol. You only need to feed him to function, no internet required.

[-] bitjunkie@lemmy.world 17 points 1 day ago

The movie you're thinking of is "Her", with Joaquin Phoenix and Scarlett Johansson, and in the story she's a true artificial general intelligence.

[-] humanspiral@lemmy.ca 15 points 1 day ago

A problem with LLM relationships is the monetization model for the LLM. Its "owner" either collects a monthly fee from the user or monetizes the user's data by selling them stuff. So the LLM is deeply dependent on the user and is motivated to manipulate the user into a reciprocal codependency to maintain its income stream. This is not significantly different from the therapy model, but the user may fail to see through the manipulation, compared to "friends/people who don't actually GAF" about maintaining a strong relationship with you.

[-] kazerniel@lemmy.world 2 points 1 hour ago

This is not significantly different from the therapy model, but the user may fail to see through the manipulation, compared to "friends/people who don't actually GAF" about maintaining a strong relationship with you.

That's why therapists have ethical guidelines and supervision. (Also they are typically people who are driven to help, not exploit the vulnerable.) None of these are really present with those glorified autocompletes.

[-] humanspiral@lemmy.ca 2 points 29 minutes ago

One big difference between an AI friend and therapy is that therapy requires an effort per visit, even if insurance provides unlimited access. Even setting aside the power of ethical guidelines as guard rails, the LLM is motivated to sustain the subscription and data-collection stream.

Also, my quite human husband also voices his thoughts without a prompt. Lol. You only need to feed him to function, no internet required.

Sometimes, with humans, I'd say the problem is quite the opposite: they voice their thoughts without a prompt far more often than would be desirable.

On a less serious note, that quoted part made me chuckle.

[-] ZkhqrD5o@lemmy.world 4 points 1 day ago

They shouldn't be so harsh to LLMs. They have something in common with humans after all. If you stick a patch cable with internet up theirs, they will become very talkative very quickly.

[-] shalafi@lemmy.world 5 points 1 day ago
[-] ZkhqrD5o@lemmy.world 9 points 1 day ago
[-] kazerniel@lemmy.world 1 points 1 hour ago

holy shit, that movie is 12 years old 👴

[-] jaemo@sh.itjust.works 2 points 1 day ago

Yeah, but as husbands, we have waste heat to manage, plus just normal waste.
