That's a very naive perspective, though. We're not blaming the guns themselves for gun violence, it's the people; but restricting access to guns is still the proven way to reduce gun incidents. One day, when everyone is enlightened enough not to need such restrictions, we can lift them, but we're very far from that point, and the same goes for tools like "AI".
you’re gonna have a bad time restricting software
Very easy time if it's about commercial use (well, at least outside of China). Companies need to have licenses for the software they use; they have to obey copyright law and trademarks, and have contracts and permissions for anything they use in their day-to-day work. It's the same reason no serious company wants to even touch a competitor's leaked source code when it appears online.
Just because AI tech bros live in a bubble of their own, thinking they can just take and repurpose anything they need, doesn't mean it should be like that. For the most part it isn't, and in this case the law just hasn't caught up with the tech yet.
An actual example, please, not like your Luddite friend in the other comment.
It'd be dead easy, actually. You don't even have to ban it: for image-generating models, every artist whose work is included in the training data becomes entitled to 5 cents per training image every time the model generates an image, so an artist with 20 works in the model is owed a dollar per generated image. Companies offering image-generating neural networks would near-instantly incur such huge liabilities that it simply wouldn't be worth it anymore. The same thing could apply to text- and voice-generating models, just per word instead of per image.
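A rough sketch of that arithmetic, just to make the scale obvious. The 5-cent rate comes from the comment above; the artist names, work counts, and daily generation volume are invented for illustration only:

```python
# Hypothetical sketch of the per-image royalty scheme described above.
# The 5-cent rate matches the proposal; the artist counts and daily volume
# are made-up numbers, purely for illustration.

ROYALTY_PER_TRAINING_IMAGE = 0.05  # 5 cents owed per training image, per generated image

def liability_per_generated_image(training_works_per_artist: dict[str, int]) -> float:
    """Total payout owed for one generated image, summed over every artist in the training data."""
    return sum(n_works * ROYALTY_PER_TRAINING_IMAGE
               for n_works in training_works_per_artist.values())

# An artist with 20 works in the training data is owed 20 * $0.05 = $1.00 per generated image.
artists = {"artist_a": 20, "artist_b": 350}
per_image = liability_per_generated_image(artists)
print(f"Owed per generated image: ${per_image:.2f}")       # $18.50 with the numbers above

# Scaled to a large service's output, the liability explodes almost immediately.
images_per_day = 1_000_000  # assumed daily generation volume
print(f"Owed per day: ${per_image * images_per_day:,.2f}")  # $18,500,000.00
```

With only two artists in the pool and a million generations a day, the service already owes tens of millions per day; a real training set contains work from millions of artists, which is the point of the proposal.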
Disregarding the fact that the model learns and extrapolates from the training data rather than copying it,
have fun figuring out which model made the image in the first place!