The End of Coding? Wrong Question (www.architecture-weekly.com)

What LLMs revealed is how many people in our industry don't like to code.

It's intriguing that people now claim and showcase what they "built with Claude", when usually that means they generated a PoC.

It's funny: people still focus on how they're building, so it's all about the code. And if that's the message sent outward, together with the claim that LLMs are already better than the "average coder Joe", then the logical follow-up question is: why do we need those humans in the loop?

[-] moto@programming.dev 6 points 10 hours ago

I generally agree with what the post is saying, but this part:

> I think that we'll still be coding, but with some other layer, as LLMs are good with structured input, like programming languages. So we might need other programming languages than we have atm. Might we need different tools to evaluate LLMs' output to make it deterministic? Might we need a different approach for engineering to make it scalable? Might we need more?

I just don't see this happening, to be honest. It's the same thing people keep claiming about "prompts replacing code".

Let's say you do make it deterministic. Then why do you need the LLM at all? You can just build a plain old compiler for it. Why add Anthropic or OpenAI as an expensive middleman to your operations? There are already plenty of admin plugins that will scaffold entire routes and pages from a db model. The reason people don't work purely off of those is that the world isn't modeled on simple CRUD: there are so many edge cases and requirements that don't fit a sweeping generalization, so you need some way of fine-tuning the output.
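To make that point concrete: deterministic, model-driven scaffolding needs no LLM at all. A minimal sketch, assuming a hypothetical declarative model (the `user` model, its fields, and the route shapes here are all made up for illustration), that emits CRUD stubs with plain templating:

```python
# Deterministic scaffolding sketch: the same model spec always produces
# byte-identical output -- no LLM, no sampling, no middleman.
# The model and route shapes below are hypothetical examples.

MODEL = {"name": "user", "fields": ["id", "email", "created_at"]}

ROUTE_TEMPLATE = """\
def {verb}_{name}({args}):
    # {verb} a {name} record (fields: {fields})
    ...
"""

def generate_crud(model):
    name = model["name"]
    fields = ", ".join(model["fields"])
    routes = []
    for verb, args in [("create", "payload"), ("read", "record_id"),
                       ("update", "record_id, payload"), ("delete", "record_id")]:
        routes.append(ROUTE_TEMPLATE.format(verb=verb, name=name,
                                            args=args, fields=fields))
    return "\n".join(routes)

print(generate_crud(MODEL))
```

Running this twice on the same model yields exactly the same stubs, which is the whole argument: once the behavior is deterministic, plain codegen already does the job.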

So if you scrap that you're back to "prompts as code". Which also sucks.

If you have a PR that's breaking production, and the only change is to a prompt:

> Make the popup background ~~red~~ blue

How the hell do you triage what went wrong? Do you revert and roll the dice that the LLM will get it right this time? No one in their right mind would think that's okay in a production setting.
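The triage problem comes down to reproducibility. A toy sketch contrasting the two cases (the `fake_llm` sampler is a stand-in for sampled generation, not a real API; the color values are invented for illustration):

```python
import random

def fake_llm(prompt, rng):
    # Stand-in for sampled LLM generation: the output depends on sampler
    # state, so the same prompt need not reproduce the same artifact.
    shades = ["#0000ff", "#1e90ff", "#4169e1"]  # all "blue", all different
    return f"background: {rng.choice(shades)};"

def compiler(source):
    # Deterministic build step: same source always maps to the same output.
    return f"background: {source};"

prompt = "Make the popup background blue"

run1 = fake_llm(prompt, random.Random())
run2 = fake_llm(prompt, random.Random())
# run1 and run2 may differ even though the prompt is identical, which is
# why reverting a prompt change does not guarantee the old behavior back.

assert compiler("#0000ff") == compiler("#0000ff")  # always reproducible
```

With code, `git revert` restores the old behavior by construction; with a prompt, it only restores the old *input* to a stochastic process.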

I don't want to say we'll never have a higher-level abstraction, but I don't think it'll come from LLMs.

this post was submitted on 09 Mar 2026
63 points (92.0% liked)

Programming
