submitted 1 month ago* (last edited 1 month ago) by Dot@feddit.org to c/technology@lemmy.world
[-] chakan2@lemmy.world 3 points 1 month ago

What's really ugly is it makes really good-looking code with fucking terrible bugs. My last job, for all of six weeks, was trying to fix an integrations wrapper of an integrations wrapper on a 3rd party library of integrations.

It looked like really good code, but the architecture was fucked beyond repair. I was supposed to support it for a Fortune 50 company. I quit before they could put me in the on-call rotation.

Using an LLM to code is like using a block of wood as a hammer. It's fast, and it'll do a very specific thing quickly, but eventually it'll splinter to the point of uselessness.

I can't wait until all the copy/code is created by LLMs, and they run out of things to train the LLMs on that won't result in immediate hallucinations.

Maybe they'll start working on a real AI to replace the LLMs, instead of just marketing LLMs as AI.

(probably not)

[-] antihumanitarian@lemmy.world 2 points 1 month ago

I recently removed in-editor AI because I noticed I was building muscle memory for the tool instead of my brain: I'd stop thinking a snippet through past the point where the LLM would autocomplete the rest. I'm still using LLMs, particularly for languages and libraries I'm not familiar with, but through the artifact editors in ChatGPT and Claude instead.

[-] the_toast_is_gone@lemmy.world 0 points 1 month ago

Preface: If all you want is to get a simple script/program going that will more or less work for your purposes, then I understand using AI to make it. But doing much more than this with it will not help you.

If you want to actually learn to code, then using AI to write code for you is a crutch. It's like trying to learn how to write an essay by having ChatGPT write the essays for you. If you want to use an API in your code, then you're setting yourself up for greater failure the more you depend on AI.

Case in point: if you want to make a module or script for Foundry VTT, then they explicitly tell you not to use AI, partly because the models available online have outdated information. In fact, training AI on their documentation is explicitly against the terms of service.

Even if you do this and avoid losing your license, you run a significant risk of getting unusable code because the AI hallucinated a function or method that doesn't actually exist. You will likely wind up spending more time scouring the documentation for what you actually want to do than if you'd just done it yourself to begin with.
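To illustrate the hallucination problem with a toy example (this is a made-up `Actor` class, not the real Foundry VTT API): an AI might confidently call a method that simply isn't there, and a quick attribute check exposes it before it blows up at runtime.

```python
# Hypothetical sketch: suppose the AI suggested `actor.applyDamage(5)`,
# but the real (toy) API only has `take_damage`. Checking the attribute
# catches the hallucinated method before you ship it.
class Actor:
    def take_damage(self, amount):
        # the method that actually exists in this toy API
        self.hp = getattr(self, "hp", 10) - amount

actor = Actor()
print(hasattr(actor, "applyDamage"))  # False — the AI made it up
print(hasattr(actor, "take_damage"))  # True — this one is documented
```

The only reliable fix, as the comment says, is reading the actual docs rather than trusting the model's recollection of them.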

And if the code works perfectly now, there's no guarantee that it will work forever, or even in the medium term. The software and API receive updates regularly. If you don't know how to read the docs and write the code you need, you're screwed when something inevitably gets deprecated and removed. The more you depend on AI to write it for you, the less capable you'll be of debugging it down the line.

This raises the question: why would you do any of this if you wanted to make something using an API?

[-] half_built_pyramids@lemmy.world 0 points 1 month ago

AI and the discussion around it doesn't live in a vacuum.

Occasionally you'll get shit opinions like this. Easy slutty Greek frat bro strawmen that'll sleep with anything that moves and then dodge child support payments.

We all have to remember the true Chad argument against AI is that it's built on degenerate theft and soulless corporate shills. AI is the Shkreli of creativity.

[-] ContrarianTrail@lemm.ee 0 points 1 month ago

Chad argument against AI

Generative AI*

[-] iopq@lemmy.world -2 points 1 month ago

I'd rather be a bad programmer that gets stuff done than a good programmer who's just jerking off about proper design

t. good programmer

[-] pHr34kY@lemmy.world -3 points 1 month ago* (last edited 1 month ago)

The same logic can explain why Teslas crash so often. You turn on all the assists, and eventually forget how to change gears.

[-] ContrarianTrail@lemm.ee 2 points 1 month ago

I haven't seen any statistics suggesting Teslas crash more often than other vehicles.

[-] pHr34kY@lemmy.world 2 points 1 month ago* (last edited 1 month ago)

Tesla Drivers Have the Highest Crash Rate of Any Brand

Tesla also had the lowest DUI rate. They're not intoxicated, they're just crap drivers.

[-] higgsboson@dubvee.org 1 points 1 month ago

23.54 crashes per 1,000 drivers in the past year.

If someone cared about actually comparing this, they'd normalize for the number of miles driven, instead of cherry-picking something so it sounds bad.

This is disingenuous.
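The normalization point is easy to show with made-up numbers (these are toy figures, not the article's data): a "crashes per 1,000 drivers" figure ignores exposure, and dividing by miles driven can shrink or widen the gap between brands.

```python
# Toy comparison: crashes per 1,000 drivers vs. crashes per million miles.
fleets = {
    # brand: (crashes per 1,000 drivers, avg miles per driver per year)
    "A": (23.5, 8_000),   # mostly city driving, fewer miles
    "B": (18.0, 15_000),  # mostly highway driving, far more miles
}

for brand, (crashes_per_1k_drivers, miles_per_driver) in fleets.items():
    # crashes per driver, divided by miles per driver, scaled to 1M miles
    per_million_miles = crashes_per_1k_drivers / 1000 / miles_per_driver * 1_000_000
    print(brand, round(per_million_miles, 2))
```

With these invented numbers brand A still comes out worse per mile (about 2.94 vs. 1.2 crashes per million miles), but the gap is a different size than the per-driver stat suggests, which is the commenter's point about cherry-picking the denominator.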

[-] pHr34kY@lemmy.world 2 points 1 month ago

Teslas aren't exactly long-range rural cars. Correcting for mileage would only increase the crash rates.

However, being city cars, they probably have high operational hours.

[-] higgsboson@dubvee.org 2 points 1 month ago

Yeah good point, the type of miles probably matters a lot.

this post was submitted on 22 Oct 2024
229 points (86.8% liked)