this post was submitted on 18 Jan 2024
429 points (95.2% liked)
Technology
Creating fake child porn of real people using things like Photoshop is already illegal in the US, I don't see why new laws are required?
Well, those laws clearly don't work. So we should make new laws! Ones that DEFINITELY WILL work! And if they don't, well, I guess we just need more laws until we find ones that do.
Since we need a rule explicitly for AI-related cases, even though it's already covered by others, let's also make a 100-page law for material explicitly made in Photoshop, and another 80 pages if it was made in GIMP. If you use MS Paint to do it, we need a special 200-page law that makes the punishment even harsher, because damn, you got skillz and need to be punished more.
deleted
No, I'm not criticizing the bill's content. If you don't enforce existing laws, new ones won't work either. At best, the new ones are an opportunity for people to huff and puff and pat themselves on the back at the cost of actual victims. At worst, they're smoke and mirrors for what the new law actually does.
deleted
This is not at all about protecting children. That's just manipulation. In truth, kids are more likely to be prosecuted than protected by this bill.
There are already laws that could be used against teen bullies, but it's rarely done. (IMHO it would create more harm than good, anyway.)
This is part of an effort to turn the likenesses of people into intellectual property. Basically, it is about more money for the rich and famous.
This bill would even apply to anyone who shares a movie with a sex scene in it. It's enough that the "depiction" is "realistic" and "created or altered using digital manipulation". Pretty much any photo nowadays, and certainly any movie, can be said to be "altered using digital manipulation". There's no mention of age, deception, AI, or anything that the PR bullshit suggests.
Regulatory capture. OpenAI wants to kick down the ladder behind it.