Unpopular Opinion
Welcome to the Unpopular Opinion community!
How voting works:
Vote the opposite of the norm.
If you agree that the opinion is unpopular, give it an upvote. If it's something that's widely accepted, give it a downvote.
Guidelines:
Tag your post, if possible (not required)
- If your post is a "General" unpopular opinion, start the subject with [GENERAL].
- If it is a Lemmy-specific unpopular opinion, start it with [LEMMY].
Rules:
1. NO POLITICS
Politics is everywhere. Let's keep this about [GENERAL]- and [LEMMY]-specific topics, and keep politics out of it.
2. Be civil.
Disagreements happen, but that doesn't give anyone the right to personally attack others. No racism, sexism, or bigotry. Please also refrain from gatekeeping others' opinions.
3. No bots, spam or self-promotion.
Only approved bots, which follow the guidelines for bots set by the instance, are allowed.
4. Shitposts and memes are allowed but...
Only until they prove to be a problem. They can and will be removed at moderator discretion.
5. No trolling.
This shouldn't need an explanation. If your post or comment is made just to get a rise out of people, with no real value, it will be removed. If you do this too often, you will get a vacation away from this community for one or more days to touch grass. Repeat offenses will result in a permanent ban.
Instance-wide rules always apply. https://legal.lemmy.world/tos/
You make a very similar argument to @Surdon's, and my answer is the same (the short version; my reply to the other comment is longer):
Yes, giving everyone access would be a bad idea. I'd compare it to controlled substance access, where regulated availability reduces black-market drug sales.
You do have some interesting details though:
This has been mentioned a few times, mostly with the idea of mixing "normal" photos of children with adult porn to generate CSAM. Is that what you are suggesting too? And do you know if this actually works? I am not familiar with the extent to which generative AI is able to combine these sorts of concepts.
This is more or less my expectation too, but I wouldn't count on the research coming out within a few years. There isn't much incentive to do actual research on the topic, afaik. There is little to be gained, given the probable reaction of the regulators, and much to lose with such a hot topic.
[This comment has been deleted by an automated system]
I didn't know this was a thing, tbh. I knew that you could get them to generate adult porn or combine faces with adult porn; I didn't know they could already create realistic CSAM. I assumed they would have used the original material to train one of the open models. Well, that's even more horrifying.
I hadn't even thought about that. Exchanging these models will be significantly less risky than exchanging the actual material. Images are scanned by cloud storage providers, and apparently archives with weak passwords are too. But no one is going to execute an AI model just to see whether it can produce CSAM.
[This comment has been deleted by an automated system]