GitHub Copilot is worth the money. I've had it finish out functions for me after just a few lines. There's usually an error or two, but the consistency with which it can predict what I'm doing, or trying to do, is pretty impressive.
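For a rough, made-up illustration of what that looks like (not actual Copilot output, just the kind of completion I mean): you type a signature and a docstring, and it suggests the rest of the body, something like this.

```python
# Hypothetical example: after typing the signature and docstring,
# Copilot will often suggest a body along these lines.
def normalize_scores(scores: list[float]) -> list[float]:
    """Scale a list of scores so they sum to 1.0."""
    total = sum(scores)
    if total == 0:
        # Avoid division by zero for an all-zero input.
        return [0.0 for _ in scores]
    return [s / total for s in scores]


if __name__ == "__main__":
    print(normalize_scores([2.0, 3.0, 5.0]))  # [0.2, 0.3, 0.5]
```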
Copilot was trained on copylefted code while itself remaining closed source. What @ralC@lemmy.fmhy.ml brought to attention isn't efficacy, but Microsoft's lack of ethics and social responsibility when it comes to their bottom line.
I honestly don't have a problem with that. Everything it was trained on is publicly available, open-source code, and I'm not aware of any license that requires you to distribute your modifications if you don't make modified binaries publicly available, not even GPL. And even then, you're only required to make available the code that was modified, not related code. And I don't even think that situation applies here, since nothing was modified; it was just ingested as training data. Copilot read a book; it didn't steal a book from the library and sell it with its name pasted over the original author's.
This isn't really a different situation from a closed-source Android app using OpenSSL or libcurl or whatever. Just because those open-source libraries were employed in making the app doesn't mean the developer must release the app's source, and it doesn't make them a bad person for trying to make money by selling that app. Even Stallman is on board with selling software.
And even if you take all that off the table, you're free to do the exact same thing and make a competitor. Microsoft didn't build their own language model; they're using a commercially available model developed by OpenAI. There's literally nothing stopping anyone else from doing this as well, making a competing service called "Programming Pal", and releasing its code as open source. In fact, it's already been done with FauxPilot and CodeGeeX and the like.
So yeah, I really don't have a problem with it. This ended up a lot longer than I had originally thought it would, sorry for the novel.
I'm not going to reinvent the wheel here when people more invested in the topic than myself, including the Software Freedom Conservancy, have written detailed papers showcasing different perspectives on the legal and moral implications of Copilot and its business model. There's also currently a class-action lawsuit against GitHub over the service.
Yep. I'm not making a proclamation, just stating an opinion. I don't have a problem with what they're doing, and if other people do, that's fine. Some people like their cucumbers pickled; let them have their pickle.
I actually wouldn't be surprised to see it go open source in the future. Microsoft has been doing that a lot recently with VS Code, the whole of .NET, and friends like PowerShell. Pretty much the only things worthwhile from Microsoft are already open source, except Copilot.