Child Sexual Abuse Material is abhorrent because children were literally abused to create it.
AI generated content, though disgusting, is not even remotely on the same level.
The moral panic around AI that leads people to imply these two things are the same is absurd.
Go after the people filming themselves literally gang raping toddlers, not the people typing forbidden words into an image generator.
Don't dilute the horror of the production of CSAM by equating it to fake pictures.
Yes, at a cursory glance that's true. AI-generated images don't involve the abuse of children, and that's great. The problem is the follow-on effects. What's to stop actual child abusers from just photoshopping a sixth finger onto their images and then claiming they're AI generated?
AI image generation is getting absurdly good now, nearly indistinguishable from actual pictures. By the end of the year I suspect they will be truly indistinguishable. When that happens, how do you tell which images are AI generated and which are real? How do you know who is peddling real CP and who isn't if AI-generated CP is legal?
What's the follow-on effect of making generated images illegal?
Do you want your freedom to be at stake when the question before the jury is "How old is this image of a person who doesn't exist?" or "Is this fake person TOO child-like?"
You won't be able to tell; we can assume that's a given.
So the real question is:
Who are you trying to arrest and put in jail and how are you going to write that difference into law so that innocent people are not harmed by the justice system?
To me, the evil people are the ones harming actual children. Trying to blur the line between them and people who generate images is a morally confused position.
There's a clear distinction between the two groups, and that distinction is that one group is harming people.
Although that's true, such material can easily be used to groom children, which is where I think the real danger lies.
I really wish they had excluded children from the datasets.
You can't really put a stop to it anymore, but I don't think it should be normalized and accepted just because there isn't a direct victim anymore. We're also talking about distribution here, not something being done in private at home.
This literally makes no sense.
Kids will do things if they see other children doing them in pictures and videos. It's easier to normalize sexual behavior with CP than without.
This sounds like you're searching really hard for a reason to justify banning it. That's a pretty tenuous "what if."
Like, a dildo could hypothetically be used to sexualize a child. Should we ban dildos?
It's so vague it could apply to anything.