submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

Uncharted territory: do AI girlfriend apps promote unhealthy expectations for human relationships?::Chatbots such as Eva AI are getting better at mimicking human interaction but some fear they feed into unhealthy beliefs around gender-based control and violence

top 50 comments
[-] luthis@lemmy.nz 75 points 1 year ago

Isn't there a rule that when a headline asks a question, the answer is 'no'?

[-] RagingNerdoholic@lemmy.ca 20 points 1 year ago
[-] krab@lemmy.world 3 points 1 year ago

It's interesting to note that in the three studies cited in the Wikipedia article, the plurality of answers to the headline questions studied were "yes".

[-] Hamartiogonic@sopuli.xyz 13 points 1 year ago

Very often the author wants to say something in order to attract more clicks, but they know they can’t get away with it without being called out or sued. That’s when question headlines come in, because this way they always leave the back door open. It’s very rare for the question to be there for any other reason.

[-] Taako_Tuesday@lemmy.ca 6 points 1 year ago

I remember being taught this in my high school journalism class, definitely one of the most valuable things I learned in high school

[-] jocanib@lemmy.world 3 points 1 year ago

There are exceptions to the rule, and this is one of them.

The rule works so well because journalists who can make a statement of fact make a statement of fact. When they can't stand the story up, they use a question mark for cover, e.g. China is in default on a trillion dollars in debt to US bondholders. Will the US force repayment?

This is an opinion piece which is asking a philosophical question. The rule does not apply.

[-] HobbitFoot@thelemmy.club 6 points 1 year ago
[-] Cethin@lemmy.zip 3 points 1 year ago* (last edited 1 year ago)

I would guess for most people it's no. However, I would also expect this to appeal to the people for whom the answer is more likely to be yes. Those people are also the most vulnerable to incel messaging, though, which absolutely promotes unhealthy expectations for relationships, so is this a net positive or negative? Idk.

[-] Madrigal@lemmy.world 4 points 1 year ago
[-] bionicjoey@lemmy.ca 10 points 1 year ago

I'm pretty sure Cunningham's Law says that energy is conserved in a closed physical system

[-] Bakkoda@sh.itjust.works -3 points 1 year ago

No FAP November is right around the corner.

[-] rikonium@discuss.tchncs.de 5 points 1 year ago* (last edited 1 year ago)

That’s Cumminghand’s Law!

[-] gapbetweenus@feddit.de 43 points 1 year ago* (last edited 1 year ago)

To me the concept of an app optimised to create deep emotional attachment (far beyond social media, or even parasocial relationships with online personalities) for monetary gain is sketchy at best, and gives heavily dystopian vibes at worst.

[-] brsrklf@compuverse.uk 18 points 1 year ago

It is.

Sarah Z made a video where she gets into some of the darker parts of Replika's concept and evolution. It's a fairly stinky business model.

https://youtu.be/3WSKKolgL2U

[-] PipedLinkBot@feddit.rocks 9 points 1 year ago

Here is an alternative Piped link(s): https://piped.video/3WSKKolgL2U

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source, check me out at GitHub.

[-] gapbetweenus@feddit.de 0 points 1 year ago

Too long and not my style. But I think you just need a bit of imagination to see the problems that will arise - especially with how many frustrated young guys seem to be out there.

[-] whenever8186@feddit.uk 2 points 1 year ago
[-] gapbetweenus@feddit.de 2 points 1 year ago

But it will also make a shit ton of money.

[-] kowanatsi@lemmy.ml 19 points 1 year ago

I think at that point you've kinda given up on human-to-human relationships, so it's moot

[-] balder1991@lemmy.world 9 points 1 year ago

If it prevents school mass shootings, I’m all for it.

[-] shiroininja@lemmy.world 2 points 1 year ago

Yeah, who cares what they do as long as they're getting whatever they need out of it. Not my farm, not my heifers. As long as that heifer doesn't come trampling the kids.

[-] Hazdaz@lemmy.world 19 points 1 year ago

Hollywood romantic comedies have been promoting unhealthy expectations for human relationships for decades now, so why would AI be all that worse?

[-] paddirn@lemmy.world 7 points 1 year ago

I would also worry about the privacy aspects, as people tend to reveal pretty personal information to each other inside of relationships. What happens when somebody reveals something illegal to an AI chatbot partner? Suddenly your partner is ratting you out to the cops, which admittedly could happen in real life anyway, but in general, how much privacy do you really have?

It's kind of a niche audience for now, I guess, but I suspect when this function gets merged with RealDoll form factors is when this whole artificial girlfriend thing will really take off. At that point, when the choice becomes whether you go hunting for a real human girl who is difficult to please, unpredictable, doesn't always do what you want, doesn't share all your likes/fetishes, etc., vs. just getting an AI girlfriend that can be anything you want her to be and won't say no to anything, I think it's easy to see the route that many will go.

<insert Futurama ‘Don’t have sex with robots’ video>

[-] UberMentch@lemmy.world 1 points 1 year ago

Brought to you by the space pope

[-] rjs001@lemmygrad.ml -1 points 1 year ago

Actually the thing about it being used to catch nefarious actors could be a good thing. We could use this to catch pedos and drug traffickers.

[-] Ryantific_theory@lemmy.world 1 points 1 year ago

One, I don't think AI RealDolls are gonna be catching drug traffickers lol, and two, there's probably a rather uncomfortable question to be seriously discussed about whether it's wrong for pedos to have an AI relationship doll.

Even if we find it gross, is it wrong if they aren't hurting anyone? That said, it's still secondary to the whole "ignoring all privacy to scan for possible crime" and the debate of whether we should even be treating drugs as a criminal issue instead of a medical one. You're basically arguing that we should secretly put cameras in everyone's homes so we can catch all the nefarious actors. Cameras that are watching all of us every time we have sex.

[-] rjs001@lemmygrad.ml 1 points 1 year ago

No, not cameras in people's homes, but just the website reporting this. The issue with the pedo is that we could use it to look for a pedo who seemed to want the doll to be more like a child (which of course the doll should refuse), but it could use algorithms to predict who is a pedo. If they hadn't committed a crime yet, we could still keep an eye on them so that if they did (via the internet, for example) we could catch them more easily.

And I don't think that drug trafficking is ever treated as a medical issue. I'm not suggesting we use the dolls to catch people just doing drugs (even if we think it ought to be illegal), but specifically traffickers. Often people will admit to selling drugs rather openly, and there are many low-level drug traffickers who aren't doing much besides selling them at a street corner. I mean, we could use dolls to figure out who those people were.

[-] Ryantific_theory@lemmy.world 1 points 1 year ago

If they hadn't committed a crime yet, we could still keep an eye on them so that if they did (via the internet, for example) we could catch them more easily.

There are so many dystopian stories based on this concept. You're literally advocating for a police state level of monitoring, so that the government knows so much about you that they can suspect you of crimes that haven't even been committed yet. What happens when investigations start with people flagged for "suspicious" data, as determined by black-box algorithms that nobody really understands, trained on data nobody can audit?

And drug use is treated as a medical issue in a handful of countries, where instead of making them illegal and pushing them underground, they let people get their heroin tested for purity, get clean needles for free, and shoot up at clinics. It prevents overdoses, ensures vulnerable users are regularly in contact with clinical staff, and makes it easier to help people struggling with addiction. I also fail to see how sex dolls will help catch drug traffickers in a way that would be different from just having everyone's phones or computers spy on them.

[-] rjs001@lemmygrad.ml 1 points 1 year ago

Because having the computers spy on them would be illegal. This is willingly giving your data to them rather than something spying on you. The servers store the data, and it would be used to catch the people. But if you did it from your phone, then it would be very hard to do because of legality. If your data is flagged for suspicious activity, then you can be investigated. What does a person have to fear if they haven't committed a wrongdoing?

[-] Ryantific_theory@lemmy.world 1 points 1 year ago

Bro, how do you think the things would spy on people if they don't have computers in them? You're just splitting hairs because you think it's moral to spy on everyone through a sex toy with an internet connection instead of any other computing device with an internet connection, just because it isn't illegal yet.

The whole "those with nothing to fear have nothing to hide" bullshit falls apart when used on you. You may not have anything to hide, but I doubt you'd be happy to let law enforcement watch you masturbate, or sleep, or shower, just because you don't have anything to hide. Invite them in to record you talking to friends and family to ensure you aren't communicating about crimes. Sure, you might not have anything to "hide" right now, but there are plenty of things you don't want to share, and you never know what the government is going to be like in the future. Imagine an extremist party gets into power and you hold freely recorded views antithetical to their beliefs. Or you or a family member jokes about speeding or shoplifting, and now you're flagged as under suspicion for criminal activity, a preferential suspect for any unsolved crimes geographically near you, because breaking the law once makes you more likely to break it again.

Silently watching everything people do isn't some zero-cost activity. It's people watching you, your kids, your friends, your family, at every moment of your lives, and if you don't think it'll be abused, then you're out of your mind.

[-] rjs001@lemmygrad.ml 0 points 1 year ago

Because the bot would act the same way as a real girlfriend. Good people would turn in a pedo or drug trafficker, and so a good bot ought to do the same. This isn't spying on someone, but emulating that. None of this is about spying on people all the time, just about what they tell their girlfriend.

[-] x4740N@lemmy.world 4 points 1 year ago

Oh, Replika, the app that suddenly paywalled any words deemed horny to exploit the horniness of their audience.

[-] stagen@feddit.dk 3 points 1 year ago

I've tried a few of these and they quickly lose their appeal. It's definitely not for me and I don't understand how anyone could be fooled.

[-] xc2215x@lemmy.world 3 points 1 year ago

Most likely not a great idea.

[-] tabular@lemmy.world 2 points 1 year ago

Where are they getting the training data from? If it's Twitter posts, then no one will date the "AI" anyway.

The types of people I'd personally want to date probably don't give out their data so easy.

this post was submitted on 23 Jul 2023
98 points (90.8% liked)
