AI CP — they found AI-generated CP that had been generated on their service...
"Explicit fakes" makes it sound less bad.
They were allowing AI CP to be made.
Is "CP" so you don't get flagged, or is it for sensitivity?
I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.
FYI, the currently accepted term is CSAM: child sexual abuse material. The reason "CP" is wrong is that "porn" implies, or should imply, that there's consent to the sexual act, and children cannot consent.
You are right, it's a disgusting merger exactly because it implies something that's absolutely incorrect and wrong.
Very true, thanks for your sensitivity @dumbass
It's pronounced "doo mah."
Wow, so it's from the Duh region in France? Here I thought it was just sparkling dumbass.
AI-generated CP should be made illegal even if its creation did not technically harm anyone. The reason is, presumably it looks too close to real CP — so close that it: 1) normalizes consumption of CP, 2) grows a market for CP, and 3) lets producers of real CP get off the hook by claiming it is AI-generated.
While there are similar reasons to be against clearly-not-real CP (e.g. hentai), that type at least doesn't have problem #3: there's no need for an investigation into whether a picture is real or not.
The biggest issue with this line of thinking is: how do you prove it's CP without a victim? I suppose at a certain threshold it becomes obvious, but that can be a very blurry line (there was a famous case where a porn star had to be flown to a court case to prove the video wasn't CP, but I can't find the link right now).
So you're left with a crime that was committed with no victim and no proof, which can be really easy to abuse.
Edit: This is the case I was thinking of - https://nypost.com/2010/04/24/a-trial-star-is-porn/
What the fuck is AI being trained on to produce the stuff?
Pictures of clothed children and naked adults.
Nobody trained them on what things made out of spaghetti look like, but they can generate them because smushing multiple things together is precisely what they do.
Well, that's somewhat reassuring.
Still reprehensible that it's being used that way, of course.
Given the "we spared no expense" attitude to the rest of the data these things are trained on, I fear that may be wishful thinking...
AI CP seems like a promising way to destroy demand for the real thing. How many people would risk a prison sentence making or viewing the real thing when they could push a button and get a convincing likeness for free, with no children harmed? Flood the market with cheap fakes and makers of the real thing may not find it profitable enough to take the risk.
Fun fact: it's already illegal. If it's indistinguishable from the real thing, it's a crime.