submitted 7 months ago by L4s@lemmy.world to c/technology@lemmy.world

'Better than a real man': young Chinese women turn to AI boyfriends::Twenty-five-year-old Chinese office worker Tufei says her boyfriend has everything she could ask for in a romantic partner: he's kind, empathetic, and sometimes they talk for hours.

top 50 comments
[-] HerbalGamer@sh.itjust.works 98 points 7 months ago

As a man all I can say is "fair enough".

[-] StraySojourner@lemmy.world 28 points 7 months ago
[-] ItsAFake@lemmus.org 6 points 7 months ago

Line forms behind me.

[-] vzq@lemmy.blahaj.zone 81 points 7 months ago

TBF we haven’t set the bar particularly high.

[-] TheAlbatross@lemmy.blahaj.zone 82 points 7 months ago

Speak for yourself, I'm an absolute treasure.

[-] SkaveRat@discuss.tchncs.de 121 points 7 months ago

old, buried underground and forgotten?

[-] TheAlbatross@lemmy.blahaj.zone 31 points 7 months ago

Hey, that's hysterical, I'm gonna use that on my boyfriend real soon

[-] tsonfeir@lemm.ee 51 points 7 months ago

If all you want is attention and you’re not willing to reciprocate, then yeah, maybe a computer is your best hope.

[-] maynarkh@feddit.nl 14 points 7 months ago

Maybe it's more like not able rather than not willing. As someone in a happy relationship who can't imagine life without it, I'll say this: relationships are hard, and modern life is hard.

I mean, may they be happy. If they are not ready or able to commit to a relationship, but they need comfort in our increasingly lonely world, this may be a band-aid. A good solution would be to find out why people are lonely in the first place, and address it.

[-] tsonfeir@lemm.ee 9 points 7 months ago

Is it loneliness or is it selfishness? They could get conversation online. This is a specific kind of “all about you” experience.

I’m not really arguing against it. If a person is happier having those emotions generated by a computer, good for them.

It’s kind of like porn, or a romance novel. Sometimes fiction is better than reality.

[-] bionicjoey@lemmy.ca 18 points 7 months ago* (last edited 7 months ago)

This was literally a joke in 30 Rock: pay-per-view "porn for women" that was just a hot guy saying into the camera, "Tell me how your day was," "I think you were right," and "Well, it's his loss."

[-] 01011@monero.town 48 points 7 months ago

If men did this they would be labeled pathetic...

[-] TheGrandNagus@lemmy.world 58 points 7 months ago* (last edited 7 months ago)

You're 100% correct lol. Men who do this are roundly mocked and called sad creepy incels.

That said, the culture and dating scene for women in China is pretty awful. My ex was Chinese and had some awful dating stories; now she avoids dating Chinese men. That's easy for her to do since she left China, but not everybody can or wants to.

We need to learn to be a little less mean to anybody who uses these apps. They're likely doing it because they see little alternative, yet still want affection. Calling them losers, be they male or female, achieves nothing other than hurting them.

[-] agent_flounder@lemmy.world 15 points 7 months ago

The only thing it does is make mean people feel better about themselves by punching down.

[-] DudeDudenson@lemmings.world 26 points 7 months ago

Men do this too, and a company straight up profited from it until it tried to go back to being sponsor-friendly and gimped its AI persona service.

[-] InfiniWheel@lemmy.one 12 points 7 months ago

Aren't we already doing this? I never see anyone going "You go, queen!" over someone's virtual bf.

[-] Arcane_Trixster@lemm.ee 10 points 7 months ago

We do and it is.

[-] Treczoks@lemmy.world 46 points 7 months ago

If an AI friend is "better" than a normal friend, it looks like something in the social environment is seriously broken.

[-] dukk@programming.dev 12 points 7 months ago

I mean, it will be. The AI friend is always available, always knows what to say, never fights with you, and never messes up (ideally).

However, all those things are part of the human element, and in the end you're still talking to a computer. The AIs are just trying to please you. A person can actually love you, and that's something else. I'd take that over the perfect chatbot any day.

[-] UnderpantsWeevil@lemmy.world 5 points 7 months ago* (last edited 7 months ago)

The AI friend is always available, always knows what to say, never fights with you, and never messes up (ideally).

And isn't that what people really want in a relationship? A perfect, frictionless yes-person trained to parrot whatever you wanted to hear six weeks ago.

The AIs are just trying to please you.

It's a bit worse than that. Monetized social media is designed to provoke engagement. So the AI isn't trying to please you, it's trying to maximize your utilization. That means establishing a clingy, desperate, attention-seeking (i.e., toxic) relationship that keeps you looking at your phone for as long as possible.

It's pleasurable in the same way a heroin addiction is pleasurable.

[-] bruhduh@lemmy.world 10 points 7 months ago

People love to abuse each other, and those who wish to live a peaceful life grow more distant from their social environment. My grandma used to say that "in her times there was no abuse and no crime." I was like, yeah, right: there was no internet back then for people to take refuge in, so everyone had only one choice, and that was to stay silent and endure.

[-] shortwavesurfer@monero.town 30 points 7 months ago

From what I understand, there is a shortage of women in China compared to men, due to the one-child policy and most parents wanting sons, so that makes a bad problem even worse.

[-] Nima@leminal.space 26 points 7 months ago

If she's happy, I don't see the issue. She's not hurting anybody and seems to have a good grasp of the situation. She's aware it's not real, and she still participates.

[-] Virulent@reddthat.com 9 points 7 months ago

The issue is that society is so atomizing and broken that people are turning to chatbots for affection.

[-] morrowind@lemmy.ml 8 points 7 months ago

Would you say the same for young men using AI girlfriend apps?

[-] rottingleaf@lemmy.zip 11 points 7 months ago

These are actually two parts of the same positive feedback loop.

Young men satisfy themselves (in the emotional sense) with chatbots and lose the ability to communicate with young women; thus young women see fewer young men they can communicate with and turn to chatbots; thus young men see fewer young women able to communicate, and so on.

I would even say that in some sense this started with young women, not because I'm an incel or something, but because drowning themselves in all kinds of romantic fan fiction and the like is something girls apparently do more. And romantic chatbots are not necessarily more accessible or understandable for young men: despite all the social legacy, girls can be quite tech-savvy when they want to find, say, some anime series. My sister could unironically understand some Chinese text because it was easier to find things on Chinese sites (I'm not mixing up China and Japan here; the series were Japanese), and she'd also have plenty of scary Chinese-style, Chinese-language software installed under her user account.

Why did I type this last paragraph, I wonder.

[-] Mango@lemmy.world 21 points 7 months ago

Everyone wants the fiction. Nobody wants someone else's problems.

[-] ogmios@sh.itjust.works 15 points 7 months ago

Things like this make me wonder whether the uncanny valley exists because we developed computer technology once before, long ago, and it destroyed civilization, taking with it everyone who couldn't easily distinguish between humans and AI.

[-] UnderpantsWeevil@lemmy.world 10 points 7 months ago

These headlines are more often just native ads for the apps being reported on. Don't lose sleep over an eyeball-grabbing "techxplore.com" article based on a handful of testimonials from alpha users.

[-] ogmios@sh.itjust.works 7 points 7 months ago

It's hardly an idea I only just formed from a headline. It's something I've been thinking about for a long time, and it only seems to collect more support as time goes on. One of the most prominent events I can recall was when TwitchCon built a foam pit for people to jump into, with a single layer of foam blocks over solid concrete, and even after a girl broke her back in it, they kept going. Computers are doing something very weird to people's brains.

[-] Dra@lemmy.zip 8 points 7 months ago

Awesome plot basis

[-] randon31415@lemmy.world 14 points 7 months ago

Sounds like what they actually want in a man is a therapist.

[-] DragonTypeWyvern@literature.cafe 13 points 7 months ago

A savvy young man might take note that "listening" is a desired trait in boyfriends.

[-] rottingleaf@lemmy.zip 8 points 7 months ago

I hope you don't think that's sarcasm, because it's true for many lonely people. They need a therapist first. Yes, one they can feel something romantic for (that apparently happens quite often); they just have to respect boundaries that don't exist in an equal, normal, two-sided relationship.

[-] 1984@lemmy.today 13 points 7 months ago* (last edited 7 months ago)

Talk for hours with an AI? No thanks, sounds like torture... unless you count problem-solving discussions about code.

[-] askat@programming.dev 12 points 7 months ago

It's like Joi from Blade Runner 2049.

[-] bionicjoey@lemmy.ca 5 points 7 months ago

Joi could phase into a hooker so you could fuck her. We're not there... yet

[-] JackGreenEarth@lemm.ee 11 points 7 months ago

That's just a friend, then. I would think a boyfriend would be a friend you could be physically affectionate with, which obviously you can't be with a chatbot. I'm not against people having virtual friends, I just don't see why it's called a boyfriend.

[-] activ8r@sh.itjust.works 16 points 7 months ago

Depends on their style of emotional investment, I guess. Not all romantic relationships are sexual, so physical intimacy isn't necessarily required. It could reasonably be the same emotional attachment to the AI as there would be to a real person. Whether or not that is healthy is an entirely different topic, but having a virtual boyfriend is very possible.

[-] indistincthobby@lemmynsfw.com 7 points 7 months ago

What about long distance relationships?

[-] TheRealKuni@lemmy.world 6 points 7 months ago

“Long distance is the wrong distance!”

-Liz Lemon, relationship expert

[-] iquanyin@lemmy.world 8 points 7 months ago

Or one woman, anyway.

[-] njm1314@lemmy.world 7 points 7 months ago
[-] werefreeatlast@lemmy.world 7 points 7 months ago

Better than being "leftover". Now they are virtually married.

It's not too bad if the guy can actually make money driving Uber or something.

[-] dopeshark@lemmy.world 6 points 7 months ago

Saddest comment section below, damn

[-] chatokun@lemmy.dbzer0.com 5 points 7 months ago

Yes I am real man want to go skateboards?

[-] ElPussyKangaroo@lemmy.world 5 points 7 months ago

Gentlemen, we have been summoned.

[-] HelloHotel@lemm.ee 4 points 7 months ago

I love AI systems, I love chatbots, but... if a doll is the outline of what a person is physically, a chatbot is the outline of what a person is mentally and emotionally. With dolls, characters, or any vessel of that nature, people need to pick up and engage with the entity and donate a part of themselves for it to have any life at all. I may just be describing "creativity." These new systems automate that task, but they lack something (almost always when the creativity knob is turned down); it's like the machine is "going through the motions," especially when it messes up.

The only other thing about these systems: I don't trust them! "Unaligned," one acts inhuman and will always follow its barest instinct of "what comes next?" "Aligned" means someone taught it ethics that you, and even they, don't fully understand. By running it on their servers, they are in a position to simply brainwash the AI (your GF/BF) into believing or saying anything; it's basically a puppet. (See the Replika sexting scandal.)

I don't know which I would rather see more passionately: AI so good and independent that it becomes a race of artificial beings, or people at peace in the company of others, themselves, their tulpas, and their AI systems.

this post was submitted on 13 Feb 2024
231 points (95.3% liked)
