I had 85+ images of my art used to train AI. I think the best solution is for the current AI image training sets to be cleared and rebuilt on copyright-free and opt-in-only content, similar to stock photography, where artists can decide for themselves whether they want minimal compensation to contribute their art to the training set. This would be necessary because once a system has been trained on an image, it's in the model's memory. So the only way to respect the rights of artists after the fact is to wipe the systems and start image generation over again, ethically this time.

I have linked on my Mastodon https://www.youtube.com/live/uoCJun7gkbA?feature=share a Senate hearing on the issue in which the lawyer from Universal Music perfectly pointed out, "it'd be hard to opt out if you don't know what has been opted in." Additionally, this isn't just an artist issue: the training sets include photos from medical records, schools, and personal collections. Basically, if you've ever posted a photo on the internet, there's a chance it's in a training set. "Have I Been Trained" is a website where you can see what is included and opt out (though, as mentioned earlier, opting out isn't a good solution).

I spoke to a prominent IP lawyer in Chicago (before the class action lawsuits were public), and he pointed out that they didn't have the right to reproduce my artwork in their training set. Their actions have been likened to a smoothie shop: they have the storefront and the blenders, but they stole all the ingredients. After it's blended you may not ALWAYS be able to recognize the strawberries, BUT we know they didn't pay for the fruit. It was stolen for their profit. Why should I be forced to provide the core product of my business to develop the core product of another (for-profit) business?

The Senate hearing linked above includes many other important and valid points. I'm not against AI, and neither are many other artists I know. I love tech and think it's really fun and can be helpful; it just needs to be done ethically. I have a lot more I could add to this, hahahaha