I remember when gitlab.com was the most accessible alternative to GitHub out there, but it seems they're only interested in internal enterprise usage now. Their main page was already completely unreadable to someone not versed in enterprise tech marketing lingo, and now this.
Thankfully Gitea and Forgejo have gotten better in the meantime, with Codeberg as a flagship instance of the latter.
On a tangent, why are all of these companies pushing AI programming? This shit isn't nearly as functional as they make it seem, and all the beginners who try it are constantly asking why their generated code doesn't work.
We're in the hype cycle, so everyone is going bananas and there's money to be made before the trough of disillusionment.
Haha so true.
I tried to use ChatGPT to convert a monstrosity of a SQL query into a SQLAlchemy query, and it failed horribly.
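For a sense of the kind of conversion I mean, here's a toy version (made-up tables and a simple query, SQLAlchemy 2.0 style; the real thing was far messier):

```python
# Hypothetical illustration: converting raw SQL to a SQLAlchemy 2.0 query.
# Table and column names are made up for the example.
from sqlalchemy import ForeignKey, func, select
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Customer(Base):
    __tablename__ = "customers"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]


class Order(Base):
    __tablename__ = "orders"
    id: Mapped[int] = mapped_column(primary_key=True)
    customer_id: Mapped[int] = mapped_column(ForeignKey("customers.id"))


# Raw SQL being converted:
#   SELECT c.name, COUNT(o.id) AS order_count
#   FROM customers c JOIN orders o ON o.customer_id = c.id
#   GROUP BY c.name
#   HAVING COUNT(o.id) > 5;
stmt = (
    select(Customer.name, func.count(Order.id).label("order_count"))
    .join(Order, Order.customer_id == Customer.id)
    .group_by(Customer.name)
    .having(func.count(Order.id) > 5)
)
```

Even at this size the mapping isn't mechanical (joins, grouping, and labels all move around), which is exactly where the generated code tends to fall apart.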
It's their wet dream. Making software without programmers.
Execs have never cared about the technology or the engineering side of it. If you could make software by banging on a pot while dancing naked around the fire, they'd have been ok with that.
And now that AI has come along that's basically what it looks like to them.
VC's and companies like OpenAI have done a really good job of propagandizing AI (LLMs). People think it's magical and the future, so there's money in saying you have it.
Because it brings in mad VC funding
Because it ain't here to generate all their code for them. It's a glorified autocomplete and suggestion engine. When are people gonna get this? (not you, just in general)
I use Copilot myself, but if you have absolutely no idea what you're doing, you and Copilot will both quickly hit a dead end together. It doesn't actually understand what you want the code to do; it only knows what's similar to what you have already written or prompted for, which may be some garbage picked up from a noob on the web somewhere. Books and research using your meatbrain are still very much needed.
It's not in the interest of the techbros to sell the new-age AI shit as something lesser that can only do small things. They need to hype the shit out of it to get money from all the crazy investors who understand nothing about it but see AI buzzwords everywhere and need to go for it now because of FOMO.
It's only gonna get much worse before it is toned down to appropriate usage.
Don't even need to make it about code. I once asked what a term meant on a certain well-known FOSS application's benchmarks page. It gave me a lot of unrelated garbage because it made an assumption about the term, exactly the assumption I was trying to avoid. I tried to steer it away from that, but it failed to say anything coherent, then looped back and gave its initial attempt as the answer again. I couldn't stop it from hallucinating.
How? Why?
Basically, it was information you could only find by looking at the GitHub code, and it was pretty straightforward, but the LLM sees "benchmark" and must therefore make a bajillion assumptions.
Even if asked not to.
I have a conclusion to make. It does do the code thing too, and it is directly related. I once asked it about a library, and it found a post where someone was ASKING if XYZ was what a piece of code was for, and it gave that out as if it were the answer. It wasn't. And this is the root of the problem:
AIs never say "I don't know".
It must ALWAYS know. It must ALWAYS assume something, anything, because not knowing is a crime and it won't commit it.
And that makes them shit.
Because greedy investors are gullible and want to make money from the jobs they think AI will displace. They don't know that this shit doesn't work like they've been promised. The C-levels at GitLab want their money (gotta love publicly traded companies), and nobody is listening to the devs who are shouting that AI is great at writing security vulnerabilities or just, like, totally nonfunctional code.
I'm hyped about AI-assisted programming and even agent-driven projects (writing their own code, submitting pull requests, etc.), but I also agree that it seems just too early to actually put money behind it.
It's all still so marginal: the UI/HMI has too much friction, and the output without skilled programming assistance is too limited.
For my private repos, hosted on my home server, I moved from GitLab to Forgejo (Git, artifacts, and container images) and Woodpecker for CI builds. Woodpecker is not as powerful and feature-complete as GitLab, but for simpler needs it gets the job done.
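For anyone curious, a Woodpecker pipeline is just a `.woodpecker.yml` in the repo root. A minimal sketch (not my actual pipeline; the image and commands are placeholders for whatever your project needs) looks something like this:

```yaml
# .woodpecker.yml - minimal sketch of a Woodpecker pipeline.
# Assumes a Go project; swap the image and commands for your stack.
steps:
  build:
    image: golang:1.22
    commands:
      - go build ./...
      - go test ./...
```

Each step runs its commands inside the named container image, which covers most simple build-and-test needs without GitLab's complexity.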