
And the really maddening part is that search engines have been so enshittified to make way for AI that's wrong like 9 times out of 10, so you're forced to rely on it for answers, because if you try Google, the snake wraps around and eats its own tail by giving you an AI answer! stalin-stressed

23 comments
[-] tamagotchicowboy@hexbear.net 5 points 2 weeks ago

Only little bits and pieces for projects; I have so many backups I'd laugh if the LLM fucked it up. I've noticed they're heavily trained on Python but near nothing on Pascal. I use GLM (DeepSeek, Kimi, etc.) mostly for coding; I get banned just looking at ChatGPT. I've abandoned Google like a one-way time capsule to 1997.

[-] ProletarianDictator@hexbear.net 3 points 2 weeks ago

Quality is noticeably worse for less-used languages and frameworks.

[-] iByteABit@hexbear.net 4 points 2 weeks ago* (last edited 2 weeks ago)

I do use it as a better way to search for things that have too much context to fit into search engine keywords, but using it for any real engineering is always extremely underwhelming and infuriating.

I am convinced that the AI-hyping bros either weren't doing very hard engineering to begin with, or they are lying out of their ass about how useful this bullshit is.

What I hate most about it, though, is the effect it has on your own brain when you get too used to it; it really makes you worse at thinking in general and at being creative. I really fear the long-term societal effect it will have as it becomes more widely used to replace all the thinking people are too lazy to do.

[-] buckykat@hexbear.net 3 points 2 weeks ago

Critical support to the slop generators in telling Windows users to break their installs

[-] homhom9000@hexbear.net 3 points 2 weeks ago

So many helper functions. All I said was: use a JSON file to create SQL insert statements and have the date in timestamp format, expecting it to use to_timestamp. It created a helper function for parsing each date part, then another to cast the result into to_timestamp.
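For contrast, here is a minimal sketch of the direct approach the commenter expected: read the JSON, emit one INSERT per record, and let Postgres's built-in to_timestamp parse the date string, with no per-datepart helpers. The table name, column names, and JSON field names here are assumptions for illustration, not from the original post.

```python
import json

# Hypothetical input: a list of records with a date string field.
# Field names ("id", "created") and format are assumptions.
records = json.loads('[{"id": 1, "created": "2026-04-04 12:30:00"}]')

statements = [
    # Pass the raw string straight to Postgres's to_timestamp;
    # no helper function needed to parse each date part.
    f"INSERT INTO events (id, created_at) "
    f"VALUES ({r['id']}, to_timestamp('{r['created']}', 'YYYY-MM-DD HH24:MI:SS'));"
    for r in records
]

print("\n".join(statements))
```

(For real use you'd want parameterized queries rather than string interpolation, but the point is that the whole task fits in a few lines.)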

[-] ProletarianDictator@hexbear.net 2 points 2 weeks ago

I have found LLMs quite disappointing when writing code.

LLMs are useful for learning new libraries, scaffolding starter projects, and maybe filling in a simple function body. But I rarely get purely generative output I would consider close to production-ready, even when it compiles or runs without error. To get anything that isn't garbage, you have to be very precise and ask it to "implement [insert some formal data structure / algorithm / pattern] to do [specific task]" rather than asking it to produce code that does your thing. Even then, I find it more useful to ask for general strategies, related concepts, and some example code that would be useful for implementing what I want.

All of this requires a pretty substantial skepticism of the output that people hyping up AI tools completely lack. Most people use these tools to avoid the difficult thinking necessary to solve a problem, so why would they put in that same level of thinking required to vet the output? And if you don't have enough knowledge of a framework, language, library, etc. to use it effectively or read / write the code yourself, you don't have the knowledge required to vet and maintain code produced by LLMs, let alone put it in production. I've had so many instances of LLMs writing code where it would take a computer science education to understand why it is a bad idea. Anyone with that knowledge is better off implementing the thing directly instead of figuring out how to massage their prompt or torture the output into something good.

LLMs repeatedly producing output you cannot or do not fully understand reinforces the view that your abilities are enhanced by the LLM. This, combined with the impostor syndrome that is rampant among devs, is going to result in a lot of devs deferring to and uncritically accepting bad code from LLMs.

Soon tons of mediocre devs will be producing mass quantities of code they're not capable or diligent enough to understand, resulting in huge, lumbering codebases full of bugs and bad design choices. In my career, the most common barrier to implementing anything or moving a project forward has been technical debt. LLMs are going to greatly increase the rate at which technical debt is produced and reduce the ability of people to tackle that technical debt, since they are no longer familiar with the codebase.

This phenomenon is why I think LLM code gen is going to be a net productivity drain.

As always, the core problem with LLMs is not that they are frequently incorrect; it is that their being correct often enough lulls humans into forgoing their due diligence, typically in favor of having a proprietary product serve as a substitute for their critical thinking.

This is not unique to programmers, as I now see tons of people citing ChatGPT or Gemini as if they were authoritative sources on anything. We will see the effects of this in all aspects of society.

[-] YiddishMcSquidish@lemmy.today 2 points 2 weeks ago

Here comes a highly controversial opinion.

Let me preface this with: I'm anti-AI. I wish Iran had kept its mouth shut about destroying OpenAI's big facility and just done it. Seeing tech bros get the French Revolution treatment would bring a smile to my face. And I avoid using it as best I can.

But I hit a breaking point yesterday with a not-very-popular Metroidvania I got on Humble Bundle called "Kingdom Shell". Great game with a glorious atmosphere, but some very poor pacing and a few confusing puzzles. I got through most of them, but one of the puzzles had me pulling my nonexistent hair out.

I tried normal searches and found one fairly comprehensive guide that was no help on this part specifically. I asked Gemini, and I'll be damned if it didn't actually come up with a good answer.

I know a sample size of n=1 does not a p-value of ≤ .05 make, and I'm not changing my mind about using it more now. But in my one very specific instance it was a little help.

this post was submitted on 04 Apr 2026
112 points (99.1% liked)

technology


On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020

founded 5 years ago