Screenshot of a tumblr post by hbmmaster:
the framing of generative ai as “theft” in popular discourse has really set us back so far. like, not only should we not consider copyright infringement theft, we shouldn’t even consider generative ai copyright infringement
who do you think benefits from redefining “theft” to include “making something indirectly derivative of something created by someone else”? because I can assure you it’s not artists
okay I’m going to mute this post, I’ll just say,
if your gut reaction to this is that you think this is a pro-ai post, that you think “not theft” means “not bad”, I want you to think very carefully about what exactly “theft” is to you and what it is about ai that you consider “stealing”.
do you also consider other derivative works to be “stealing”? (fanfiction, youtube poops, gifsets) if not, why not? what’s the difference? because if the difference is actually just “well it’s fine when a person does it” then you really should try to find a better way to articulate the problems you have with ai than just saying it’s “stealing from artists”.
I dislike ai too, I’m probably on your side. I just want people to stop shooting themselves in the foot by making anti-ai arguments that have broader anti-art implications. I believe in you. you can come up with a better argument than just calling it “theft”.
I think it's quite literally copyright infringement, assuming the models are fed with work from actual artists who typically don't agree to it. Whether copyright should work this way is another matter.
I meant it more in a "hire an artist to work on art for you VS just ask the AI to do it instead" way
even if you’re emulating an art style in particular, that’s not copyright infringement because you can’t copyright an art style. which is good because if you could, that would be awful for a ton of artists
it’s only copyright infringement if you ask an AI to do, say, a picture of mario. but in this case, it’s also copyright infringement if you commission an artist to do it!
Nah.
Training a statistical model with unlicensed work is not the same as a human learning.
Under copyright/IP laws, using a copyrighted work without a license, with the intent of competing with the copyright holder (which is what virtually all commercial AI models are doing), is not fair use, and there is plenty of case law backing that. Whether something is transformative (and arguably, training models isn't) doesn't even matter if the infringement is done with the intent of causing material harm to the copyright holder through competition. None of the models out there fit the criteria for fair use.
Something being a tool does not magically remove all liability. This is especially true if the tool is built illegally using unlicensed intellectual property and depends on said unlicensed intellectual property to have any value (literally all major models fit this description).
I suppose the issue is whether you want to see training an AI as equivalent to practicing as a human artist. Considering that AIs are generally made specifically as commercial products, I don't think that's really true, but this is definitely something that can be argued one way or the other.
I don't think it would be considered fine under current laws if an AI-free Adobe Photoshop were shipped with tons of copyrighted art scraped off the internet without the artists' approval, even if the users aren't allowed to use it to make commercial works that reproduce Super Mario or w/e 1:1.