Shouting into the void
(lemmy.ml)
Agreed, you absolutely can find similar complaints about search engines, and there were similar fools back then who relied wholesale on search results and nothing further.
I'm looking for people who can problem-solve, not just click-next their way through tools. When search engines made life easier, the folks who never tried anything beyond searching Google just didn't advance in tech fields. The people I'm talking about now are walking into jobs that require thinking while literally proclaiming that they let something else do the thinking for them.
What am I supposed to do with a tech who can't get past an ansible deployment because he couldn't figure out how to find and use the ansible wiki? As I plainly said, it's not the technology, it's the culture.
Your "boomer" take on this isn't valid because I'm also getting the AI-bro talk from idiots my age as well.
Last, I'd like to point out that you don't know what gatekeeping means. Maybe chatgpt can help you.
I do find it amusing that “AI Bros” seem to have very little literacy about how the instruct LLM tools actually work, even at the executive level of companies. It reflects the same sort of “confident ignorance” you see in people who can't look up documentation.
I find this frustrating because I consider myself a local LLM tinkerer and find it… isolating. I love the problem-solving loop: hammering out gritty details to coax decent tokens out quickly, getting the model to use documentation or forced syntax effectively, or assembling a bizarre dataset for some weird style or dialect. And that's, like, antithetical to the culture.
I guess what I’m saying is there are problem solvers who use LLMs. Maybe they’re pink elephants.
Barring terrible company policy, those people who can't do more than allow the AI to think for them will find themselves out of a job pretty quickly if they can't do the work themselves.
Bad employees and stupid people will always be stupid; the newest tech isn't the cause of that. Maybe it lowers the barrier to entry a bit?
Lastly, you're clearly gatekeeping "the ability to critically think" based on some arbitrary conditions you made up based on "how things used to be", so to speak. Maybe you could have used your superior boomer wisdom to figure that one out.