submitted 8 months ago by 1984@lemmy.today to c/technology@lemmy.world
[-] Hestia@lemmy.world 43 points 8 months ago* (last edited 8 months ago)

Read a bit of the court filing, though not the whole thing, since you get the gist pretty early on. Journos put spin on everything, so here's my understanding of the argument:

  1. Musk, who has given money to OpenAI in the past, and thus can legally file a complaint, states that
  2. OpenAI, which is registered as an LLC, which is legally a nonprofit, and which has the stated goal of benefitting all of humanity, has
  3. Been operating outside of its legally allowed purpose, and in effect
  4. Used its donors, resources, tax status, and expertise to create closed-source algorithms and models that currently exclusively benefit for-profit concerns (Musk's attorney points out that Microsoft Bing's AI is just ChatGPT), and thus
  5. OpenAI has committed a civil tort (a legally recognized civil wrong) wherein
  6. Money given by contributors would not have been given had the contributors been made aware of this deviation from OpenAI's mission statement, and
  7. The public at large has not benefited from any of OpenAI's research, and thus OpenAI has abused its preferential tax status and harmed the public

It's honestly not the worst argument.

[-] Shelena@feddit.nl 16 points 8 months ago* (last edited 8 months ago)

I actually agree with this. This technology should be open. I know that there are arguments to keep it closed, like that it could be misused, etc. However, I think that all the scary stories about AI are also a way to keep attention away from the fact that if you have a monopoly on it, you have enormous power. This power will grow as the tech is used more and more. If all this power is in the hands of a commercial business (even though they say they aren't one), then you know AI is going to be misused to make money. We do not have clear insight into what they are doing, and we have no reason to trust them.

You also know that bad actors, like dictatorial governments, will eventually get or develop the technology themselves. So, keeping it closed is not a good way to protect against that happening. At the same time, you are also keeping it from researchers who could investigate how to use and develop it further, responsibly and to the benefit of humanity.

Also, they relied on data generated by people in society who never got any payment or anything for it. So, it is immoral not to share the results openly with those same people, and instead keep it closed. I know they used some of my papers. However, I am not allowed to study their model. Seems unfair.

The dangers of AI should be kept at bay using regulation and enforcement by democratically chosen governments, not by commercial businesses or other non-democratic organisations.

[-] 1984@lemmy.today 15 points 8 months ago

I don't want Musk to be right, but I have to admit, it sounds legit.

[-] conciselyverbose@sh.itjust.works 9 points 8 months ago

Yeah, fuck "it's not in the terms of a contract". It's fraud.

You can't advertise yourself as a nonprofit organization for the public good, collect donations under that pretense, then just privatize anything you learn for profit.

People don't donate to for-profit companies.

this post was submitted on 02 Mar 2024
204 points (89.2% liked)