[-] kescusay@lemmy.world 13 points 1 week ago* (last edited 1 week ago)

Experienced software developer, here. "AI" is useful to me in some contexts. Specifically when I want to scaffold out a completely new application (so I'm not worried about clobbering existing code) and I don't want to do it by hand, it saves me time.

And... that's about it. It sucks at code review, and will break shit in your repo if you let it.

[-] billwashere@lemmy.world 6 points 1 week ago

Not a developer per se (mostly virtualization, architecture, and hardware), but AI can get me to 80-90% of a script in no time. The last 10% takes a while, but that was going to take a while regardless. So the time savings on that first 90% are awesome. It does send me down a really bad path at times, though; being experienced enough to recognize that is very helpful, in that I just start over.

In my opinion AI shouldn’t replace coders but it can definitely enhance them if used properly. It’s a tool like everything. I can put a screw in with a hammer but I probably shouldn’t.

[-] kescusay@lemmy.world 4 points 1 week ago

Like I said, I do find it useful at times. But not only shouldn't it replace coders, it fundamentally can't. At least, not without a fundamental re-architecting of how these models work.

The reason it goes down a "really bad path" is that it's basically glorified autocomplete. It doesn't know anything.

On top of that, spoken and written language are very imprecise, and there's no way for an LLM to derive what you really wanted from context clues such as your tone of voice.

Take the phrase "fruit flies like a banana." Am I saying that a piece of fruit might fly in a manner akin to how another piece of fruit, a banana, flies if thrown? Or am I saying that the insect called the fruit fly might like to consume a banana?

It's a humorous line, but my point is serious: We unintentionally speak in ambiguous ways like that all the time. And while we've got brains that can interpret unspoken signals to parse intended meaning from a word or phrase, LLMs don't.

[-] MangoCats@feddit.it 3 points 1 week ago

I have limited AI experience, but so far that's what it means to me as well: helpful in very limited circumstances.

Mostly, I find it useful for "speaking new languages" - if I try to use AI to "help" with the stuff I have been doing daily for the past 20 years? Yeah, it's just slowing me down.

[-] balder1991@lemmy.world 3 points 1 week ago* (last edited 1 week ago)

I like the saying that LLMs are “good” at stuff you don’t know. That’s about it.

When you know the subject, it stops being very useful, because you'll already know the obvious stuff the LLM could help you with.

[-] FreedomAdvocate@lemmy.net.au 5 points 1 week ago

They're also bad at that though, because if you don't know that stuff then you don't know if what it's telling you is right or wrong.

[-] fafferlicious@lemmy.world 1 points 1 week ago

I... think that's their point. The only reason it seems good is because you're bad too, and can't spot that it's bad.

[-] MangoCats@feddit.it 1 points 1 week ago

To be fair, when you're in Gambukistan and you don't even know what languages are spoken, a smartphone can bail you out and get you communicating basic needs much faster and better than waving your hands and speaking English LOUDLY AND S L O W L Y. A good human translator you can trust should be better, depending on their grasp of English. But there's another point: who do you choose to pick your hotel for you? Google, or a local kid who spotted you from across the street and ran over to "help you out"? That's a tossup. Both are out to make a profit off you, but which one is likely to hurt you more?

[-] Zetta@mander.xyz 2 points 1 week ago

FreedomAdvocate is right. IMO the best use case for AI is things you have an understanding of but need some assistance with. You need to understand enough to catch at least the impactful errors the LLM makes.

[-] CabbageRelish@midwest.social 3 points 1 week ago* (last edited 1 week ago)

On that last note: an important thing left out here, this being general news reporting on tech, is that the study was specifically about bug-fixing tasks. An LLM can typically only provide the broadest of advice on those, and it's largely incapable of tackling problems holistically, even though you often need to be thinking big picture while chasing a bug.

Interesting that the AI devs thought they were being quicker though.

[-] FreedomAdvocate@lemmy.net.au 2 points 1 week ago* (last edited 1 week ago)

I've found it to be great at writing unit tests too.

I use github copilot in VS and it's fantastic. It just throws up suggestions for code completions and entire functions etc, and is easily ignored if you just want to do it yourself, but in my experience it's very good.

Like you said, using it to get the meat and bones of an application from scratch is fantastic. I've used it to make some awesome little command line programs for some of my less technical co-workers to use for frequent tasks, and then even got it to make a nice GUI over the top of it. Takes like 10% of the time it would have taken me to do it - you just need to know how to use it, like with any other tool.

[-] neclimdul@lemmy.world 11 points 1 week ago

Explain this to me, AI. It reads back exactly what's on the screen, including the comments, somehow with more words but less information. Ok....

Ok, this is tricky. AI, can you do this refactoring so I don't have to keep track of everything? No... That's all wrong... Yeah, I know it's complicated, that's why I wanted it refactored. No, you can't do that... Fuck. Now I can either toss all your changes and do it myself, or spend the next 3 hours rewriting it.

Yeah I struggle to find how anyone finds this garbage useful.

[-] SpaceCowboy@lemmy.ca 6 points 1 week ago

You shouldn't think of "AI" as intelligent and ask it to do something tricky. The boring stuff that's mostly just typing, that's what you get the LLMs to do. "Make a DTO for this table." "An interface for this JSON."

I just have a bunch of conversations going where I can paste stuff into and it will generate basic code. Then it's just connecting things up, but that's the fun part anyway.
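For a concrete sense of that kind of request, here's a hypothetical sketch in Python: a DTO for an invented `users` table. The table and field names are made up for illustration; the point is that it's typing-heavy, logic-free boilerplate, which is exactly what an LLM handles well.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical DTO for an invented "users" table: mechanical field
# mapping with no business logic, the kind of code that's mostly typing.
@dataclass
class UserDTO:
    id: int
    email: str
    display_name: str
    created_at: datetime

    @classmethod
    def from_row(cls, row: dict) -> "UserDTO":
        # Map a raw DB row (string-keyed dict) onto typed fields.
        return cls(
            id=int(row["id"]),
            email=row["email"],
            display_name=row["display_name"],
            created_at=datetime.fromisoformat(row["created_at"]),
        )
```

Checking a generated class like this takes seconds, because every line is an obvious field mapping.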

[-] neclimdul@lemmy.world 3 points 1 week ago

Most IDEs have done the boring stuff with templates and code generation for like a decade, so that's not so helpful to me either. But if it works for you, fair enough.

[-] Damaskox@lemmy.world 5 points 1 week ago

I have asked questions, had conversations for company and generated images for role playing with AI.

I've been happy with it, so far.

[-] Sl00k@programming.dev 1 points 1 week ago

This was the case a year or two ago, but now, with an MCP server for your docs and your project and goals outlined properly, it's pretty good.

[-] turtlesareneat@discuss.online 2 points 1 week ago* (last edited 1 week ago)

Not to sound like one of the ads or articles, but I vibe coded an iOS app in like 6 hours. It's not so complex that I don't understand it, it's multi-featured, I learned a LOT, and I got a useful thing instead of doing a tutorial with a sample project. I don't regret having that tool. I do regret the lack of any control, oversight, and public ownership of this technology, but that's the timeline we're on. Let's not pretend it's gay space communism (sigh), but since AI is probably driving my medical care decisions at the insurance company level, I might as well get something to play with.

[-] KairuByte@lemmy.dbzer0.com 1 points 1 week ago

If you give it the right task, it’s super helpful. But you can’t ask it to write anything with any real complexity.

Where it thrives is being given pseudo code for something simple and asking for the specific language code for it. Or translate between two languages.

That’s… about it. And even that it fucks up.
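That pseudocode-to-code workflow really is mechanical, which is why it's the case that works. A hypothetical example (the pseudocode and function name are invented here):

```python
# Pseudocode handed to the model (invented for illustration):
#   for each order in orders:
#       if order total >= threshold, add it to the sum
# The translation into the target language is a one-liner:
def sum_large_orders(orders: list[dict], threshold: float) -> float:
    return sum(o["total"] for o in orders if o["total"] >= threshold)
```

Because the spec is already precise, there's almost no room for the model to go off the rails.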

[-] ptz@dubvee.org 6 points 1 week ago
[-] resipsaloquitur@lemmy.world 4 points 1 week ago

Writing code is the easiest part of my job. Why are you taking that away?

[-] Feyd@programming.dev 3 points 1 week ago

Fun how the article concludes that AI tools are still good anyway, actually.

This AI hype is a sickness

[-] OpenPassageways@programming.dev 3 points 1 week ago

Yeah... It's useful for summarizing searches but I'm tempted to disable it in VSCode because it's been getting in the way more than helping lately.

[-] FreedomAdvocate@lemmy.net.au 3 points 1 week ago

"Using something that you're not experienced with and haven't yet worked out how to best integrate into your workflow slows some people down"

Wow, what an insight! More at 8!

As I said on this article when it was posted to another instance:

AI is a tool to use. Like with all tools, there are right ways and wrong ways and inefficient ways and all other ways to use them. You can’t say that they slow people down as a whole just because some people get slowed down.

[-] xep@fedia.io 2 points 1 week ago

Code reviews take up a lot of time, and if I know a lot of code in a review is AI generated I feel like I'm obliged to go through it with greater rigour, making it take up more time. LLM code is unaware of fundamental things such as quirks due to tech debt and existing conventions. It's not great.

[-] worldistracist@lemmy.cafe 2 points 1 week ago

Great! Less productivity = more jobs, more work security.

[-] FancyPantsFIRE@lemmy.world 2 points 1 week ago

I’ve used cursor quite a bit recently in large part because it’s an organization wide push at my employer, so I’ve taken the opportunity to experiment.

My best analogy is that it’s like micro managing a hyper productive junior developer that somehow already “knows” how to do stuff in most languages and frameworks, but also completely lacks common sense, a concept of good practices, or a big picture view of what’s being accomplished. Which means a ton of course correction. I even had it spit out code attempting to hardcode credentials.

I can accomplish some things “faster” with it, but mostly in comparison to my professional reality: I rarely have the contiguous chunks of time I’d need to dedicate to properly ingest and do something entirely new to me. I save a significant amount of the onboarding, but lose a bunch of time navigating to a reasonable solution. Critically that navigation is more “interrupt” tolerant, and I get a lot of interrupts.

That said, this year’s crop of interns at work seem to be thin wrappers on top of LLMs and I worry about the future of critical thinking for society at large.

[-] astronaut_sloth@mander.xyz 2 points 1 week ago

I study AI, and have developed plenty of software. LLMs are great for using unfamiliar libraries (with the docs open to validate), getting outlines of projects, and bouncing ideas for strategies. They aren't detail oriented enough to write full applications or complicated scripts. In general, I like to think of an LLM as a junior developer to my senior developer. I will give it small, atomized tasks, and I'll give its output a once over to check it with an eye to the details of implementation. It's nice to get the boilerplate out of the way quickly.

Don't get me wrong, LLMs are a huge advancement and unbelievably awesome for what they are. I think that they are one of the most important AI breakthroughs in the past five to ten years. But the AI hype train is misusing them, not understanding their capabilities and limitations, and casting their own wishes and desires onto a pile of linear algebra. Too often a tool (which is one of many) is being conflated with the one and only solution--a silver bullet--and it's not.

This leads to my biggest fear for the AI field of Computer Science: reality won't live up to the hype. When this inevitably happens, companies, CEOs, and normal people will sour on the entire field (which is already happening to some extent among workers). Even good uses of LLMs and other AI/ML use cases will be stopped, and real academic research will dry up.

[-] 5too@lemmy.world 2 points 1 week ago

My fear for the software industry is that we'll end up replacing junior devs with AI assistance, and then in a decade or two, we'll see a lack of mid-level and senior devs, because they never had a chance to enter the industry.

[-] squaresinger@lemmy.world 1 points 1 week ago

That's happening right now. I have a few friends who are looking for entry-level jobs and they find none.

It really sucks.

That said, the future lack of developers is a corporate problem, not a problem for developers. For us it just means that we'll earn a lot more in a few years.

[-] 5too@lemmy.world 1 points 1 week ago

You're not wrong, and I feel like it was a developing problem even before AI - everybody wanted someone with experience, even if the technology was brand new.

That said, even if you and I will be fine, it's still bad for the industry. And even if we weren't the ones pulling up the ladder behind us, I'd still like to find a way to start throwing ropes back down for the newbies...

[-] bassomitron@lemmy.world 2 points 1 week ago

Couldn't have said it better myself. The amount of pure hatred for AI that's already spreading is pretty unnerving when we consider future/continued research. Rather than direct the anger towards the companies misusing and/or irresponsibly hyping the tech, they direct it at the tech itself. And the C Suites will of course never accept the blame for their poor judgment so they, too, will blame the tech.

Ultimately, I think there are still lots of folks with money that understand the reality and hope to continue investing in further research. I just hope that workers across all spectrums use this as a wake up call to advocate for protections. If we have another leap like this in another 10 years, then lots of jobs really will be in trouble without proper social safety nets in place.

[-] stsquad@lemmy.ml 1 points 1 week ago

They can be helpful when using a new library or development environment which you are not familiar with. I've noticed a tendency to make up functions that arguably should exist but often don't.

Excellent take. I agree with everything. If I give Claude a function signature, types and a description of what it has to do, 90% of the time it will get it right. 10% of the time it will need some edits or efficiency improvements but still saves a lot of time. Small scoped tasks with correct context is the right way to use these tools.
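A small scoped task in that sense is just a stub like the following (sketched in Python with invented names): the signature, types, and one-line spec are the entire prompt, and the body is the sort of completion you'd then look over.

```python
def dedupe_keep_order(items: list[str]) -> list[str]:
    """Return items with duplicates removed, preserving first-seen order."""
    # The signature and docstring above are the whole prompt;
    # a plausible model completion follows, and it's short enough
    # to verify at a glance.
    seen: set[str] = set()
    out: list[str] = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out
```

The narrow contract is what makes the 90% hit rate plausible: there's exactly one reasonable behavior to implement.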

this post was submitted on 11 Jul 2025
43 points (97.8% liked)

Technology
