submitted 3 weeks ago by db0@lemmy.dbzer0.com to c/news@lemmy.world

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while he was working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the TV station captured the arrest, which made for dramatic footage as officers led McCorkle, still in his work uniform, out of the theater in handcuffs.

top 50 comments
[-] ReallyActuallyFrankenstein@lemmynsfw.com 107 points 3 weeks ago* (last edited 3 weeks ago)

It's hard to have a nuanced discussion because the article is so vague. It's not clear what he's specifically been charged with (beyond "obscenity," rather than a specific child abuse statute?), especially since any simulated-CSAM laws have, to my knowledge, all been struck down when challenged.

I completely get the "lock them all up and throw away the key" visceral reaction - I feel that too, for sure - but this is a much more difficult question. There are porn actors over 18 who look younger; do the laws bar them from work that would be legal for others who merely look older? If an AI were trained exclusively on those over-18 people, would its outputs then not be CSAM even if the images produced features that look under 18?

I'm at least all for a "fruit of the poisoned tree" theory - if AI model training data sets include actual CSAM, then those models can and should be made illegal. Intentionally deepfaking real people under 18 is also not black and white (looking again to the harm factor), but I think it can be justifiably prohibited. I also think distribution of completely fake CSAM can arguably be outlawed (the situation here), since it's going to be impossible to tell AI from real imagery soon, and allowing it would undermine enforcement of vital anti-real-CSAM laws.

The real hard case is producing and retaining images of fully fake people, without real CSAM in the training data, solely locally (possession crimes). That's really tough. Not only does its creation not directly hurt anyone, there's a possible benefit in that it diminishes the market for real CSAM (potentially saving unrelated children from the abuse flowing from that demand), and it could also divert the producer's impulse away from preying on children around them due to unfulfilled desire.

"Could," because I don't think there are studies that answer whether those things are true.

[-] mpa92643@lemmy.world 28 points 3 weeks ago

I mostly agree with you, but a counterpoint:

Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them. I've read many articles over the years about men getting arrested for trying to meet up with minors, and one thing that shows up pretty often in these articles is the perpetrator admitting to downloading CSAM for years until deciding the fantasy wasn't enough anymore. They become comfortable enough with it that it loses its taboo and they feel emboldened to take the next step.

CSAM possession is illegal because possession directly supports creation, and creation is inherently abusive and exploitative of real people. Generating it from a model trained on non-abusive content probably isn't exploitative, but there's a legitimate question as to whether we as a society decide it's associated closely enough with real-world harms that it should be banned.

Not an easy question for sure, and it's one that deserves to be answered using empirical data, but I imagine the vast majority of Americans would flatly reject a nuanced view on this issue.

[-] MagicShel@programming.dev 30 points 3 weeks ago* (last edited 3 weeks ago)

The problem is that the empirical data cannot be gathered ethically. You can't show a bunch of people porn and then make a statistical observation of whether those shown child porn are more likely to assault children. So we have to go forward without that data.

I will anecdotally observe that anal sex, oral sex, and facials have become more common between partners as their prevalence in porn has gone up. That suggests, but does not prove, that even "ethically produced" CSAM could cause direct statistical harm.

[-] usualsuspect191@lemmy.ca 27 points 3 weeks ago

I will anecdotally observe that anal sex, oral sex, and facials have become more common between partners as their prevalence in porn has gone up. That suggests, but does not prove, that even "ethically produced" CSAM could cause direct statistical harm.

Can we look at trends between consenting adults (who are likely watching porn of real people by the way) as an indicator of what pedophiles will do? I'm not so sure. It's not like step sibling sex is suddenly through the roof now with it being the "trend" in porn.

Looking specifically at fake rape porn, and seeing whether it increases rates of rape in the real world, might be a better indicator.

[-] HelixDab2@lemm.ee 17 points 3 weeks ago

CSAM possession is illegal because possession directly supports creation

To expound on this: up to this point, the creation of CSAM required that children be sexually exploited. You could not have CSAM without children being harmed. But what about when no direct harm has occurred? Is lolicon hentai 'obscene'? Well, according to the law and case law, yes, but it's not usually enforced. If we agree that drawings of children engaged in sexual acts aren't causing direct harm--that is, children are not being sexually abused in order to create the drawings--then how different is a computer-generated image that isn't based on any specific person or event? It seems to me that whether or not a pedophile might eventually decide they want more than AI-generated images is not relevant. Treating a future possibility as a foregone conclusion is exactly the rationale behind Reefer Madness and the idea of 'gateway' drugs.

Allow me to float a second possibility that will certainly be less popular.

Start with two premises: first, pedophilia is a characteristic that appears to be an orientation. That is, a true pedophile--a person exclusively sexually attracted to pre-pubescent children--does not choose to be a pedophile, any more than a person chooses to be gay. (My understanding is that very few pedophiles are exclusively pedophilic, though, and that many child molesters are opportunistic sexual predators rather than pedophiles.) Second, rates of sexual assault appear to have decreased as pornography availability has increased. So the question I would have is, would wide availability of AI-generated CSAM--CSAM that didn't cause any real, direct harm to children--actually decrease rates of child sexual assault?

[-] snooggums@midwest.social 20 points 3 weeks ago

Even worse, you don't need CSAM to start with. If a learning model has regular porn and the nude reference photography of people under 18 that is used for teaching anatomy drawing, then it has enough information to combine the two. Hell, it probably doesn't even need the people under 18 to actually be nude.

Hell, society tends to assume any nudity under 18 to be CSAM anyway, because someone could see it that way.

[-] Nollij@sopuli.xyz 85 points 3 weeks ago

This creates a significant legal issue - AI generated images have no age, nor is there consent.

The difference in appearance between age 16 and 18 is minimal, but the legal difference is immense. This is based entirely on a concept that cannot apply.

How do you define what depicts a fictional child, especially without including real adults? I've met people who believe that preferring a shaved pubic area is pedophilia, even though the vast majority of adult women shave. On the flip side, teenagers from the 70s and 80s would be mistaken for 40+ today.

Even the extremes aren't clear. Adult star "Little Lupe", who was 18+ in every single appearance, lacked most secondary sex characteristics. Experts testified in court that she could not possibly be an adult. Except she was, and there's full documentation to prove it. Would AI trained exclusively on her work be producing CSAM?

[-] DmMacniel@feddit.org 77 points 3 weeks ago

I don't see how children were abused in this case. It's just AI imagery.

It's the same as saying that people get killed when you play first person shooter games.

Or that you commit crimes when you play GTA.

[-] timestatic@feddit.org 34 points 3 weeks ago

Then every artist creating loli porn would also have to be jailed for child pornography.

[-] jaggedrobotpubes@lemmy.world 63 points 2 weeks ago

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go "ok, itch scratched", and tank the demand for the real stuff.

Depending on which way it goes, it could be massively helpful for protecting kids. I just don't have a sense for what the effect would be, and I've never seen any experts weigh in.

[-] damnedfurry@lemmy.world 33 points 2 weeks ago

Do we know that AI child porn is bad? I could believe it would get them in the mood for the real thing and make them do it more, and I could believe it would make them go “ok, itch scratched”, and tank the demand for the real stuff.

From bits/articles I've seen here and there over the years about other things that are kind of in the same category (porn comics with child characters in them, child-shaped sex dolls), the latter seems to be more the case.

I'm reminded of when people were arguing that when Internet porn became widespread, the incidence of rape would go through the roof. And then literally the opposite happened. So...that pushes me toward hypothesizing that the latter is more likely to be the case, as well.

[-] PhilMcGraw@lemmy.world 22 points 2 weeks ago

In Australia, cartoon child porn is treated under the law the same way as actual child porn. Not that it answers your question, but it's interesting.

I'd imagine the answer to your question is "it depends": some people who would have acted on their urges may get their jollies from AI child porn instead, while others who had never considered themselves pedophiles might find AI child porn (assuming it were legal) and realise it's something they're into.

I guess it may lower the production of real child porn which feels like a good thing. I'd hazard a guess that there are way more child porn viewers than child abusers.

[-] Thespiralsong@lemmy.world 16 points 2 weeks ago

I seem to remember Sweden did a study on this, but I don't really want to google around to find it for you. Good luck!

[-] BonesOfTheMoon@lemmy.world 56 points 3 weeks ago

Could this be considered a harm reduction strategy?

Not that I think CSAM is good in any way, but if it saves a child would it be worthwhile? Like if these pedos were to use AI images instead of actual CSAM would that be any better?

I've read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it's such a problem.

[-] RandomlyNice@lemmy.world 38 points 2 weeks ago

Many years ago (about 25), I read an article in a newspaper (idk the name, but it may have been The Computer Paper, which is archived online someplace). This article noted that a study had been commissioned to show that CP access increases child abuse. The study seemed to show the opposite.

Here's the problem with even AI-generated CP: it might lower abuse in the beginning, but with increased access it would 'normalise' the perception of such conduct. This would likely increase abuse over time, even involving persons who may not have been so inclined otherwise.

This is all very complex, and a solution isn't simple. Shunning things in any way won't help, though, and that currently seems to be the most popular way to deal with the issue.

[-] Facebones@reddthat.com 25 points 2 weeks ago

Actual pedophiles (a lot of CSA is abuse of power, not pedophilia - though to be clear, fuck abusers either way) have a high rate of suicidal ideation because they think it's as fucked up as everyone else does. Of course we can't just say "sure, AI material is legal now," but I could imagine a regulated system accessed via doctors, akin to how controlled substances work.

People take this firm "kill em all" stance, but these people just feel the way they do, the same as I do towards women or a gay man feels toward men. It just is what it is - we all generally agree being gay isn't a choice, and this is no different. As long as they don't act on it, I think we should be sympathetic and open to helping them live a less tortured life.

I'm not 100% saying this is how we do it, but we should be open to exploring the issue instead of full stop demonization.

[-] Cryophilia@lemmy.world 18 points 2 weeks ago

"Normalized" violent media doesn't seem to have increased the prevalence of real world violence.

[-] Stern@lemmy.world 39 points 3 weeks ago

Lolicon fans in absolute shambles.

[-] RangerJosie@lemmy.world 38 points 2 weeks ago

Hey, remember that terrible thing everyone said would happen?

It's happening.

[-] hexdream@lemmy.world 26 points 2 weeks ago

If this thread (and others like it) have taught me anything, it's that facts be damned, people are opinionated either way. Nuance means nothing, and it's basically impossible to have a proper discussion when it comes to wedge issues or anything that can be used to divide people. Even if every study 100% said AI-generated CSAM always led to a reduction in actual child harm, reduced recidivism, and never needed any actual real children to be used as training material, the comments would still look pretty much the same. If the studies showed the exact opposite, the comments would also be the same. Welcome to the internet. I hope you brought aspirin.

[-] recapitated@lemmy.world 24 points 3 weeks ago* (last edited 3 weeks ago)

To be clear, I am happy to see a pedo contained and isolated from society.

At the same time, this direction of law is something that I don't feel I have the sophistication to truly weigh in on, even though it invokes so many thoughts for me.

I hope we as a society get this one right.

[-] Mubelotix@jlai.lu 19 points 2 weeks ago

It's not really children in these pics. We can't condemn people for things that are not illegal yet.

this post was submitted on 26 Aug 2024
366 points (97.2% liked)
