177 points · submitted 1 year ago by MicroWave@lemmy.world to c/news@lemmy.world

Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, according to researchers.

In September alone, 24 million people visited undressing websites, according to the social network analysis company Graphika.

Many of these undressing, or “nudify,” services use popular social networks for marketing, according to Graphika. For instance, since the beginning of this year, the number of links advertising undressing apps increased more than 2,400% on social media, including on X and Reddit, the researchers said. The services use AI to recreate an image so that the person is nude. Many of the services only work on women.

These apps are part of a worrying trend of non-consensual pornography being developed and distributed because of advances in artificial intelligence — a type of fabricated media known as deepfake pornography. Its proliferation runs into serious legal and ethical hurdles, as the images are often taken from social media and distributed without the consent, control or knowledge of the subject.

top 50 comments
[-] NounsAndWords@lemmy.world 104 points 1 year ago

This is going to be such a culture shift. Kids today are going to grow up in a world where anyone can make any image or video of any person at any time. I have no idea how this will impact sexual health and development, or whether it will really even be a bad thing. It could lead to more openness and freedom, kind of like growing up with the Internet did. It could also have some really harmful unexpected consequences...kinda like how growing up with the Internet did...

[-] themeatbridge@lemmy.world 58 points 1 year ago

It's teaching kids to keep all of their photos private, like we always should have done from the beginning. Your face, your likeness, your privacy, there are malicious and greedy people out there that want what you have.

Why did we ever start publicly sharing anything?

[-] Boozilla@lemmy.world 33 points 1 year ago

I feel the same way about biometrics. When I tell friends, family, and coworkers about this, they look at me like I'm crazy. You can change your password. You can't change your retinal pattern, fingerprint, etc.

And I don't care how much someone tries to convince me on how securely it's stored inside the phone hardware or the cloud. You are trusting every single coder and engineer who has a hand in designing and maintaining these things. Not to mention hackers who always find a way to breach.

So far I've avoided using any of mine anywhere.

[-] Fleur__@lemmy.world 6 points 1 year ago

I've never thought about this before. One thing I'm curious to hear from your perspective is the idea of a single password being irrelevant. I've had my credit card info stolen before, and I've never worried about being unable to convince my bank it wasn't me; the amount of corroborating evidence is just overwhelming. In that kind of world, a single point of failure such as a biometric becomes irrelevant, simply because gathering enough information to convincingly impersonate someone is more trouble than it's worth. Additionally, I've never had a biometric be the last line of defense; somewhere along the line a password is always required.

Idk, I'd love to hear thoughts on this. Admittedly I'm low-key drunk, so maybe I'm saying dumb stuff.

[-] uranibaba@lemmy.world 5 points 1 year ago

I have to agree with you. My biggest concern is not whether they have my fingerprint, because there isn't really much they can do with it. What I am concerned about is how much data they gather (and not just about me, but about whole groups) and what they can do with it to alter my world view.

[-] NounsAndWords@lemmy.world 19 points 1 year ago

I think even that is ultimately a lost cause. There are cameras everywhere and there's getting to be more. Facebook is trying to get us all used to having cameras right on our face at all times. Even if you could completely block your face from every camera, how long until the AI equivalent of a police sketch artist just remakes your face from someone's description?

[-] themeatbridge@lemmy.world 7 points 1 year ago

The difference is the source. If someone pulls a screen grab of you from a Target security camera and uses AI to make porn, the employee has a problem because Target has a problem. If you take photos of yourself at the beach and share them via Facebook, then anyone anywhere can end up with that photo and use it for almost anything with no problem because you took it and shared it publicly.

Hopefully this gets people to realize how much they are giving up by sharing via Facebook, and Instagram, and Xitter, and TicTac or whatever. If you want to be social media famous, be famous. But fame comes at a price.

[-] leraje@lemmy.blahaj.zone 12 points 1 year ago
[-] themeatbridge@lemmy.world 9 points 1 year ago

I'm gonna be straight with you, I don't even want to click that link.

[-] leraje@lemmy.blahaj.zone 4 points 1 year ago

It's a BBC News story :)

[-] bioemerl@kbin.social 17 points 1 year ago* (last edited 1 year ago)

They are already doing that shit, back in 2013 it was done with something called bubbling.

https://gizmodo.com/get-anyone-naked-with-the-pic-bubbler-iphone-app-5656093

I remember being in high school and I remember people sharing nude images of each other on those old ass snap a picture cell phones. Or a couple of my cousins going on facebook and rating all the women 1 to 10.

And before people were doing it with computers, they were doing it with their imagination.

Same new shit same old story, it's all going to be fine.

[-] Touching_Grass@lemmy.world 11 points 1 year ago* (last edited 1 year ago)

We need to go back to the days of bubble porn and Photoshop. Oooo the humanity

[-] ultranaut@lemmy.world 6 points 1 year ago

I would imagine it's likely to empower all kinds of unhealthy and concerning behaviors. People are fucking weird and if you can use AI to create a kind of virtualized sex slave out of the publicly available data on any person you know, that could be a real problem for society.

[-] NounsAndWords@lemmy.world 8 points 1 year ago

I honestly think something like that is going to be as much or more of a harm for the person making/using it. When the world is just flooded with fake images I think we're gonna get jaded by them very quickly and it will be just another form of (really fucked up) cyberbullying.

For the people making them the harm is more insidious. They're likely to be doing it in secret (because weird/immoral sex stuff) and isolating themselves, and we end up in a weird Futurama Lucy Liu-Bot situation with even more social isolation and further declining mental health.

[-] SPRUNT@lemmy.world 37 points 1 year ago

By 2030 we'll have AR glasses with cameras, microphones, speakers, and a built in AI assistant that will digitally remove everyone's clothing in real time.

[-] NounsAndWords@lemmy.world 34 points 1 year ago

We're finally going to learn if the "imagine everyone in their underwear" trick for public speaking really works.

[-] hemko@lemmy.dbzer0.com 7 points 1 year ago

Great tip. Now you're nervous with a boner

[-] bioemerl@kbin.social 13 points 1 year ago

Hello yes I don't actually know what you look like because my VR glasses are converting everyone into the living image of Danny DeVito

[-] daredevil@kbin.social 32 points 1 year ago

Pandora's Box is already opened, unfortunately. The Streisand Effect is only going to make this worse.

[-] QuarterSwede@lemmy.world 17 points 1 year ago

The only real options for a celebrity or public figure are to 1) say nothing, or 2) make light of it by saying something like, “I’m flattered, they made me look better than I do!”

[-] daredevil@kbin.social 17 points 1 year ago

I'd imagine this will also be very problematic for non-celebrities from all sorts of backgrounds as well. The harassment potential is very concerning.

[-] QuarterSwede@lemmy.world 11 points 1 year ago

Agreed. Just need to teach our kids that it literally isn’t the end of the world and how to deal with it.

[-] guyrocket@kbin.social 26 points 1 year ago

This article seems to imply that this cat can go back in the bag. I'm not at all sure about that.

[-] FaceDeer@kbin.social 7 points 1 year ago

If they put the cat in the bag I can just use one of these handy apps to make the bag disappear again.

[-] SamsonSeinfelder@feddit.de 21 points 1 year ago

This will get bloody. Some Western women and teenagers will run a gauntlet of harassment and embarrassment. Some will even die by suicide because of those fake photos. In other parts of the world, where fathers and brothers kill family members over shame, it might even be a bloodbath. Imagine living as a woman under a patriarchy where you get in trouble if a curl of hair is visible. Now imagine there are fake nudes of you while you are surrounded by religious fanatics with a deep-rooted history of shame and family honour. Look at Mia Khalifa. Women will die because of this and because of the backward, archaic, anachronistic worldview of some insecure, weak men.

[-] FaceDeer@kbin.social 11 points 1 year ago

On the plus side, this will be the death knell for those sorts of extreme cultures. It'll be impossible for them to survive and still have these kinds of hangups in a world where such mortal offences are present everywhere you look.

It'll be a nasty path getting there, unfortunately.

[-] Fleur__@lemmy.world 20 points 1 year ago

They'll just make it illegal to do that lol. Realistically tho, what else are we supposed to do? The technology exists. If you were dedicated enough you could already do this with Photoshop. I wouldn't be surprised if there was an example of an artist ruining someone's reputation by painting them in a compromising way centuries ago.

[-] Meowoem@sh.itjust.works 19 points 1 year ago

Another day another puritan panic

Yes we should ban open source and consumer gfx cards and everything else to stop the possibility that someone might have a sexual thought -in fact all humans should be blinded at birth to avoid this!

Won't someone think of the children!!!!!

[-] GregorGizeh@lemmy.zip 19 points 1 year ago

This panic over fake porn is the wrong response. In fact, encourage it, make it so ubiquitous that there is always fake porn of everyone, everywhere and nobody gives two shits about nude leaks or revenge porn any more.

[-] JGrffn@lemmy.world 7 points 1 year ago

Surely you can see how this also isn't a fitting solution. Just... Go down the age brackets and see how increasingly uncomfortable it all becomes to tolerate this. There's already been cases of AI porn of highschoolers made by highschoolers. We can keep going down the victim age line, or up the perpetrator age line. It gets bad pretty fast regardless of how ubiquitous this might become in the future.

[-] GregorGizeh@lemmy.zip 9 points 1 year ago

There isn’t another answer though. The tech is there, it will only get worse to the point we can’t recognize a fake any more. It makes much more sense to lean into it and make sure before that point is reached that it doesn’t ruin peoples lives any more to have that stuff circulated.

[-] andrewta@lemmy.world 12 points 1 year ago

Naaaahhhh we didn’t see this coming at all. Golly gee this is a total shock that this is happening.

[-] queermunist@lemmy.ml 11 points 1 year ago

Okay, so we need to get unique codes tattooed onto our genitals. That way if your nudes show up, you can always know for sure if they're real or fake (and, importantly, who tf leaked your nudes)

[-] perviouslyiner@lemm.ee 24 points 1 year ago

Public key in a pubic place?

[-] queermunist@lemmy.ml 12 points 1 year ago

Pubic key, clearly.

Or who knows, maybe nudity will cease to be taboo when everyone can see everyone else naked anyway.

[-] snooggums@kbin.social 6 points 1 year ago
[-] themeatbridge@lemmy.world 7 points 1 year ago

I'd be able to tell because there's no way an AI has been trained on my lumpy potato body covered in hair in weird places.

[-] Hyperreality@kbin.social 5 points 1 year ago

Why would you willingly forfeit plausible deniability?

[-] hh93@lemm.ee 4 points 1 year ago

How would you know who did it with such a code?

Also it only works once since you need to reveal the true code to dispute the picture
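
(Aside: that "you have to reveal the code, so it only works once" problem is exactly what asymmetric signatures avoid; you publish a verification key and never expose the signing key. A minimal, hypothetical Python sketch of the idea, assuming the third-party `cryptography` package; none of this comes from the thread itself:)

```python
# Hypothetical sketch: vouching for an image without burning a secret.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Generate the key pair once. The private key stays with you;
# only the public key ever needs to be shown to anyone.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

photo = b"...raw bytes of the original photo..."  # placeholder image data

# Signing uses the private key, so disputing a picture never reveals it,
# and you can keep signing new photos forever (no "works only once" problem).
signature = private_key.sign(photo)

def looks_authentic(image_bytes: bytes, sig: bytes) -> bool:
    """Anyone holding the public key can check a claimed original."""
    try:
        public_key.verify(sig, image_bytes)
        return True
    except InvalidSignature:
        return False

print(looks_authentic(photo, signature))                  # True: keyholder vouched for it
print(looks_authentic(b"ai-generated fake", signature))   # False: no valid signature
```

(Of course this only proves which images the keyholder vouched for; it can't brand a fake as fake or tell you who leaked anything, which is the other half of the objection above.)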

[-] tegs_terry@feddit.uk 10 points 1 year ago

It'll work in their favour eventually. There won't be any more revenge porn because it can just be dismissed as fake immediately.

[-] paddirn@lemmy.world 10 points 1 year ago

Combining this with AR goggles, you really will be able to see everyone else naked when you’re giving a speech.

[-] Yoz@lemmy.world 9 points 1 year ago

Disgusting! Where are people downloading this app from so that I can avoid the website or that app store?

[-] CleoTheWizard@lemmy.world 6 points 1 year ago

I’ve made this comment other places, I’ll make it here. This tech changes much less than people think. If anything, this will protect people from leaking of nudes because people will assume it’s probably fake.

All this is is an advanced form of fantasy that has existed ever since Photoshop has existed. And importantly, we will deal with these fake nude photos the exact same way we deal with real nudes.

Meaning, if you catch people distributing fake photos of their classmates, the punishment should be the same as if they were real. And they need to be severe.

The reason this is a problem and I’m concerned for young women is that protections for sexual harassment online have already been abysmal. So this will make things worse and since we don’t protect women very well in the US, I expect major issues. Basically, the problems aren’t new, but our lack of action will make this awful. Treated correctly, this is a non-issue and these photos should be kept in private.
