submitted 1 week ago by Pro@reddthat.com to c/technology@lemmy.world

Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[-] some_guy@lemmy.sdf.org 30 points 1 week ago

For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

That's just stupid on its face. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's ok for a thirteen year old to generate said content: I'm saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen year old version of me would absolutely have gotten himself into a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, because I'd have produced child pornography. God, some states have stupid laws.

[-] AA5B@lemmy.world 7 points 1 week ago

In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?

[-] lka1988@lemmy.dbzer0.com 4 points 1 week ago* (last edited 1 week ago)

As a father of teenage girls, I don't necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

[-] some_guy@lemmy.sdf.org 9 points 1 week ago

Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

[-] seralth@lemmy.world 9 points 1 week ago

There is a difference between ruining the life of a 13 year old boy for the rest of his life, with no recourse and no exceptions,

versus scaring the shit out of him and making him work his ass off doing an ass load of community service for a summer.

[-] lka1988@lemmy.dbzer0.com 7 points 1 week ago

ruining the life of a 13 year old boy for the rest of his life with no recourse

And what about the life of the girl this boy would have ruined?

This is not "boys will be boys" shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

I don't think it's unreasonable to expect an equivalent punishment that has the potential to ruin his life.

[-] Vinstaal0@feddit.nl 5 points 1 week ago

It is not abnormal to see different punishments for people under the age of 18. What's needed is good education about sex and about what sexual assault does to its victims (same with guns, and drugs including alcohol, etc.).

You can still course correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it etc.

The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.

[-] DancingBear@midwest.social 4 points 1 week ago

Fake pictures do not ruin your life… sorry…

Our puritanical / 100% sex culture is the problem, not fake pictures…

[-] Agent641@lemmy.world 3 points 1 week ago

Punishment for an adult man doing this: Prison

Punishment for a 13 year old doing this: Publish his browsing and search history in the school newsletter.

[-] GraniteM@lemmy.world 5 points 1 week ago

13 year old: “I'll just take the death penalty, thanks."

[-] dinckelman@lemmy.world 25 points 1 week ago

Lawmakers are grappling with how to address ...

Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

[-] Jax@sh.itjust.works 8 points 1 week ago

Oh I just assumed that every Conservative jerks off to kids

[-] shalafi@lemmy.world 4 points 1 week ago

A 99-1 vote to drop the anti-AI-regulation provision is hardly the government voting against. The Senate smashed that shit hard and fast.

[-] LovableSidekick@lemmy.world 3 points 1 week ago* (last edited 1 week ago)

Expecting people to know about that 99-1 vote might be misplaced optimism, since it hasn't been made into a meme yet.

[-] 2ugly2live@lemmy.world 13 points 1 week ago

God I'm glad I'm not a kid now. I never would have survived.

[-] vane@lemmy.world 10 points 1 week ago

Maybe let's assume all digital images are fake and go back to painting. Wait... what if children start painting deepfakes ?

[-] Daft_ish@lemmy.dbzer0.com 10 points 1 week ago

Welp, if I had kids they would have one of those scramble suits like in a scanner darkly.

It would of course be their choice to wear them, but I'd definitely look for ways to limit their time in areas with cameras present.


probably because there's a rapist in the white house.

[-] aceshigh@lemmy.world 3 points 1 week ago

To add to that: I live in a red area, and since the election I've been catcalled much more. And it's weird too, because I'm middle aged…. I thought I'd finally disappear…

the toxic manosphere/blogosphere/whatever it's called has done so much lifelong damage

[-] JohnEdwa@sopuli.xyz 6 points 1 week ago
[-] argl@feddit.org 3 points 1 week ago

Can't afford this much cheese today to find just the right slice for every bikini photo...

[-] wewbull@feddit.uk 5 points 1 week ago

Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

[-] lath@lemmy.world 8 points 1 week ago

A school setting generally means underage individuals are involved, which makes any such content CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

[-] wewbull@feddit.uk 3 points 1 week ago

Disagree. Not CSAM when no abuse has taken place.

That's my point.

[-] Zak@lemmy.world 3 points 1 week ago

I think generating and sharing sexually explicit images of a person without their consent is abuse.

That's distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I'm morally uncomfortable criminalizing an act that has no victim.

[-] lath@lemmy.world 2 points 1 week ago

There's a thing that was happening in the past. Not sure it's still happening, given the lack of news about it. It was something called "glamour modeling," I think, or an extension of it.

Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions and sold them to interested parties.

Nothing untoward directly happened to the children. They weren't physically abused. They were treated as regular fashion models. And yet, it's still CSAM. Why? Because of the intention behind making those pictures.

The intention to exploit.

[-] LostXOR@fedia.io 3 points 1 week ago

Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

[-] LadyAutumn@lemmy.blahaj.zone 5 points 1 week ago* (last edited 1 week ago)

Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives, with groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly generate photorealistic images of.

If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this then it should be classified as sexual exploitation.

Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

[-] FishFace@lemmy.world 3 points 1 week ago

It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.

This is different because it's easier. It's not really different because it (can be) more realistic, because it was never about being realistic, otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.

[-] LadyAutumn@lemmy.blahaj.zone 3 points 1 week ago

It's sexually objectifying the bodies of girls and turning them into shared sexual fantasies that their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. The number of teenage boys cutting pictures out and photoshopping them was nowhere near as high as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.

Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don't understand the kinds of long-term psychological harm that are caused by being exploited in this way. It was also exploitative and also fucked up when it was in Photoshop; this is many orders of magnitude more sophisticated and accessible.

You're also wrong that this is about bullying. It's an introduction to girls being tools for male sexual gratification. It's LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It's criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.

[-] Walk_blesseD@piefed.blahaj.zone 4 points 1 week ago

Jfc, the replies here are fucking rancid. Lemmy is full of sweaty middle-aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they're here encouraging the next generation of misogynist scum by defending this shit too.
And men (pretend to) wonder why we distrust them.

Ngl, I'm only leaving reply notifs on for this one to work on my blocklist.

[-] atomicorange@lemmy.world 3 points 1 week ago

Yeah there’s some nasty shit here. Big yikes, Lemmy.

[-] MagicShel@lemmy.zip 3 points 1 week ago* (last edited 1 week ago)

The only defense is to train AI to draw guys with micropenises. As long as kids being kids is a defense for this shit (and to be fair, kids are pretty fucking stupid and need the freedom to grow out of that) rule makers have no power here. At least insofar as the AI to do this can be run locally on a potato.

[-] RememberTheApollo_@lemmy.world 2 points 1 week ago* (last edited 1 week ago)

I’m sure the laws will focus on protecting IP - specifically that of AI companies or megacorps, the famous, and the powerful - but not the small creators of content or the rabble negatively affected by AI abuse.

The rest of us will have to suffer through presenting whatever damaging and humiliating video to a court. If you can even afford a lawyer to do so. Then be offered a judgement that probably won’t be paid or won’t cover the damage done by an image that will never be able to be erased from the internet. Those damages could include the suicide of young people bullied and humiliated by such deepfakes.

this post was submitted on 02 Jul 2025
99 points (98.1% liked)