850 points (96.4% liked) · submitted 26 Jul 2023 by L4s@lemmy.world to c/technology@lemmy.world

Thousands of authors demand payment from AI companies for use of copyrighted works

Thousands of published authors are requesting payment from tech companies for the use of their copyrighted works in training artificial intelligence tools, marking the latest intellectual property critique to target AI development.

[-] FontMasterFlex@lemmy.world 19 points 2 years ago

So what's the difference between a person reading their books and using the information within to write something, and an AI doing it?

[-] Saneless@lemmy.world 12 points 2 years ago

Because AIs aren't inspired by anything and they don't learn anything

[-] r1veRRR@feddit.de 13 points 2 years ago

So uninspired writing is illegal?

[-] dan@lemm.ee 8 points 2 years ago

No, but a lazy copy of someone else's work might be copyright infringement.

[-] Odusei@lemmy.world 3 points 2 years ago

So when does Kevin Costner get to sue James Cameron for his lazy copy of Dances With Wolves?

[-] Telodzrum@lemmy.world 4 points 2 years ago

Avatar is not Dances with Wolves. It's Ferngully.

[-] dan@lemm.ee 3 points 2 years ago

Idk, maybe. There are thousands of copyright infringement lawsuits, and sometimes they win.

I don’t necessarily agree with how copyright law works, but that’s a different question. Doesn’t change the fact that sometimes you can successfully sue for copyright infringement if someone copies your stuff to make something new.

[-] tenitchyfingers@lemmy.world 2 points 2 years ago

Why not? Hollywood is full to the brim with people suing for copyright infringement. And sometimes they win. Why should it be different for AI companies?

[-] lily33@lemmy.world 4 points 2 years ago* (last edited 2 years ago)

Language models actually do learn things, in the sense that the information encoded in the trained model isn't usually* taken directly from the training data; instead, it's information that describes the training data, but is new. That's why it can generate text that's never appeared in the data (a toy sketch of this point follows below).

  * The bigger models seem to remember some of the data and can reproduce it verbatim, but that's not really the goal.
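
A purely illustrative sketch of that point in Python (the two training sentences and the word-transition table are assumptions invented for this example; real language models learn neural network parameters, not explicit tables): a model that stores only relations derived from its training data can still produce a sentence that never appears verbatim in that data.

```python
from collections import defaultdict

# Hypothetical tiny training set, invented for illustration.
training_sentences = [
    "the cat sat on the mat",
    "the dog slept on the rug",
]

# Record which words are seen following which -- a crude stand-in for
# "information that describes the training data" rather than the data itself.
transitions = defaultdict(set)
for sentence in training_sentences:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        transitions[prev].add(nxt)

# A sentence built by chaining those learned word-to-word links,
# even though it never appears verbatim in the training set:
novel = ["the", "cat", "sat", "on", "the", "rug"]
assert " ".join(novel) not in training_sentences
assert all(nxt in transitions[prev] for prev, nxt in zip(novel, novel[1:]))
print("generated:", " ".join(novel))
```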
[-] Chailles@lemmy.world 3 points 2 years ago

What does inspiration have to do with anything? And to be honest, humans being inspired has led to far more blatant copyright infringement.

As for learning, they do learn. No different than us, except we learn silly abstractions to make sense of things while AI learns from trial and error. Ask any artist if they've ever looked at someone else's work to figure out how to draw something; even if they're not explicitly looking up a picture, if they've ever seen a depiction of it, they recall and use that. Why is it wrong if an AI does the same?

[-] vrighter@discuss.tchncs.de 2 points 2 years ago

the person bought the book before reading it

[-] FontMasterFlex@lemmy.world -2 points 2 years ago

Not if I checked it out from a library. A WORLD of knowledge at your fingertips, and it's all free to me, the consumer. So who's to say the people training the AI didn't check it out from a library, or even buy the books they're using to train the AI with? Would you feel better about it had they purchased their copy?

[-] charonn0@startrek.website -1 points 2 years ago* (last edited 2 years ago)

Large language models can only calculate the probability that words should go together based on existing texts.

[-] mayo@lemmy.world 3 points 2 years ago

Isn't this correct? What's missing?

Let's ask chatGPT3.5:

Mostly accurate. Large language models like me can generate text based on patterns learned from existing texts, but we don't "calculate probabilities" in the traditional sense. Instead, we use statistical methods to predict the likelihood of certain word sequences based on the training data.
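
As a rough illustration of "predicting the likelihood of certain word sequences" from counts over existing text, here is a toy bigram sketch in Python (the corpus, function name, and raw-count approach are assumptions made up for this example; real LLMs learn neural network weights rather than counting bigrams):

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for "existing texts".
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count how often each word follows each other word (bigram counts).
follower_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follower_counts[prev][nxt] += 1

def next_word_probabilities(prev_word):
    """Estimate P(next word | previous word) from the counts."""
    counts = follower_counts[prev_word]
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

# "the" is followed by cat, mat, dog, and rug once each in the toy corpus,
# so each gets probability 0.25.
print(next_word_probabilities("the"))
```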

[-] charonn0@startrek.website 1 points 2 years ago

"Mostly accurate" is pretty good for an anonymous internet post.

[-] BakonGuy@lemmy.world 1 points 2 years ago

I don't see how "calculate the probability" and "predict the likelihood" are different. Seems perfectly accurate to me.

[-] mayo@lemmy.world 1 points 2 years ago

I thought so too, so I'm still confused about the votes. Oh well

[-] tenitchyfingers@lemmy.world -1 points 2 years ago

A person is human and capable of artistry and creativity; computers aren't. Even questioning this just means dehumanizing artists and art in general.
