
A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

[-] eatthecake@lemmy.world 21 points 7 months ago

The analogy given is that it's like watching a video, the next day, of yourself having sex without consent, as if you'd been drugged.

You want a world where people just desensitise themselves to things that make them want to die through repeated exposure. I think you'll get a whole lot of complex PTSD instead.

[-] stephen01king@lemmy.zip 21 points 7 months ago

People used to think their lives were over if they were caught alone with someone of the opposite sex they weren't married to. That is no longer the case in Western countries due to normalisation.

The thing that makes them want to die is societal pressure, not the act itself. In this case, if societal pressure from having fake nudes of yourself spread is removed, most of the harm done to people should be neutralised.

[-] too_much_too_soon@lemmy.world 5 points 7 months ago* (last edited 7 months ago)

Agreed.

I've been in HR since '95, so yeah, I'm old, lol. Noticed a shift in how we view old social media posts? Those wild nights you don't remember but that got posted? If they're at least a decade old, they're not as big a deal now. But if it was super illegal, immoral, or harmful, you're still in trouble.

As for nudes, they can be both the problem and the solution.

To sum it up, as in the animated movie 'The Incredibles': 'If everyone's special, then no one is.' If no image can be trusted, no excuse can be doubted. 'It wasn't me' becomes the go-to, and nobody needs to feel ashamed or suicidal over something fake that happens to many.

Of course, this is oversimplifying things in the real world, but society will adjust. People won't kill themselves over this. It might even be a good thing for those on the cusp of AI and improper real-world behaviours: 'It's not me. It's clearly AI; I would never behave so outrageously.'

[-] eatthecake@lemmy.world -4 points 7 months ago

The thing that makes them want to die is societal pressure, not the act itself.

That's an assumption that you have no evidence for. You are deciding what feelings people should have by your own personal rules and completely ignoring the people who are saying this is a violation. What gives you the right to tell people how they are allowed to feel?

this post was submitted on 29 Mar 2024
341 points (93.4% liked)

Technology
