submitted 3 months ago by misk@sopuli.xyz to c/technology@lemmy.world
[-] areyouevenreal@lemm.ee 4 points 3 months ago* (last edited 3 months ago)

How intelligent it is or isn't is irrelevant. We talk about much dumber programs than AI as consumers of files and data, including things like compilers. Would it not be personal use for you to view a picture in a photo viewer or to edit it in GIMP?

It's not data theft at all unless the courts and the law say it is. Ranting on Lemmy won't change that fact. Theft is a construct of law.

You can add clauses against use as AI training data to your licence if you wish.

[-] FunnyUsername@lemmy.world -2 points 3 months ago* (last edited 3 months ago)

You can try to equate humans to computers all day, and you can even pass laws that say they're the same thing. That does not make it true. A company using software to profit off data it has not licensed (whether the data is public or not does not matter! That is not how copyright law works!) is committing theft.

Please try to sell DVDs of Markiplier's publicly available YouTube content and tell people you're allowed to because it's publicly available.

[-] areyouevenreal@lemm.ee 6 points 3 months ago

I am not equating humans with computers. These businesses are not selling people's data when doing AI training (unlike actual data brokers). You can't say something AI-generated is a clone of the original any more than you can say parody is.

[-] FunnyUsername@lemmy.world -1 points 3 months ago* (last edited 3 months ago)

I absolutely can. Parody is an art form, and art can only be created by human beings. AI is an art-laundering service, not an artist.

The law should reflect that these companies must first be granted permission by the rights holders to use datasets, and Creative Commons licensors need an opportunity to opt out of being crawled for those datasets. Anything else is wrong. Machines are not humans. Creative Commons licensing was not written with the concept of machines as "consumers". These companies took advantage of the sudden emergence of these models, and of the law's delay in holding their hunger for data in check. They need to be held accountable for their theft.

[-] areyouevenreal@lemm.ee 4 points 3 months ago* (last edited 3 months ago)

There are already anti-AI licenses out there. If you didn't license your work with that in mind, that's on you. Deep learning models have been around far longer than GPT-3 or anything in the current news cycle, and they have needed training data that whole time. It was predictable that something like this would happen eventually, and if you didn't notice in time, it's because you haven't been paying attention.

You don't get to dictate what's right and wrong. As far as I am concerned all copyright is wrong and dumb, but the law is what the law is. Obviously not everyone shares my opinion and not everyone shares yours.

Whether an artist is involved or not it's still a transformative use.

this post was submitted on 05 Aug 2024