[-] talkingpumpkin@lemmy.world 2 points 14 hours ago

This isn’t a rant about AI.

I feared this would be about AI, but... this might actually be interesting! I'm glad I started reading.

This time is different [...] Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that.

Well f*ck you and give me back the time I wasted on that article.

Guys, can we add a rule that all posts that deal with using LLM bots to code must be marked? I am sick of this topic.

[-] codeinabox@programming.dev 1 points 13 hours ago

Guys, can we add a rule that all posts that deal with using LLM bots to code must be marked? I am sick of this topic.

How would you like them to be marked? AFAIK Lemmy doesn't support post tags

[-] ProdigalFrog@slrpnk.net 1 points 11 hours ago* (last edited 9 hours ago)

Could put [AI Subject] in the title.

[-] talkingpumpkin@lemmy.world 1 points 9 hours ago

Actual technical articles about LLM/diffusion would be interesting to read (I think?)... maybe something like [vibecoding]?

Actually, let's make that generic and use [futurology], so that it may apply regardless of whether the incumbent revolution/menace is LLMs, low code tools, or stack overflow.

[-] ICastFist@programming.dev 4 points 18 hours ago

They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.

Yeah, I noticed something was off around the time every new app was essentially "the fucking website in a self-contained Chrome browser", aka Electron. Sure, it was sold as "write once, run everywhere", but a significant number of said programs and games were still either Windows or Android exclusive because ha ha fuck you (anyone who's dealt with RPG Maker MV/MZ knows it).

Some layers of abstraction are incredibly welcome - not having to worry about cycles and RAM addressing, having immediate access to graphics without having to reinvent pixel-pushing functions - but (imo) everything on top of a browser is just an endless series of "why?"

[-] fubarx@lemmy.world 14 points 1 day ago* (last edited 1 day ago)

Wasn't "low-code" a BIG thing a few years ago... the one that would destroy programming and make every PM a developer? Whatever happened to that? 🤔

Edit: read the HN comments. If I ever go back to consulting, I'm 10x-ing my rate to work on cleaning up this slop. I'm not anti-AI coding and use it for my own projects, but if you just give it a prompt and walk away, you will be very sad later.

There's a BIG difference between prototypes and something others have to use. As the low-code folks found out the hard way.

[-] wer2@lemmy.zip 3 points 7 hours ago
  1. Programming language invented: Everyone can code now! Programming is dead!
  2. COBOL invented: Everyone can code now! Programming is dead!
  3. BASIC invented: Everyone can code now! Programming is dead!
  4. UML to Code/Executable UML: Everyone can code now! Programming is dead!
  5. Low code: Everyone can code now! Programming is dead!
  6. AI: Everyone can code now! Programming is dead!

Yep, any day now...

[-] Cryxtalix@programming.dev 1 points 20 hours ago

Aren't professional tools using node-based compositing, which ends up being just as complicated as code in big projects? They'll do anything to hide the code, because code is scary, even if it's the same.

[-] sping@lemmy.sdf.org 2 points 23 hours ago

Thing is, cleaning up slop, AI or otherwise, is miserable, slow, difficult work. It's why we get caught in rewrite traps.

[-] ieGod@lemmy.zip 18 points 1 day ago* (last edited 1 day ago)

I feel like a lot of concepts are being conflated.

Plug and Play arrived. Windows abstracted everything. The Wild West closed. Computers stopped being fascinating, cantankerous machines that demanded respect and understanding, and became appliances. The craft became invisible.

I disagree with this. The craft is still alive and well, it's just specialized. As complexities in approaches grow, it's not possible for a single person to know every register of every subunit all the way up to high level application software in any reasonable manner. You can totally write your own bootloader for your current hardware. Nothing is stopping you. Is the argument that the financial utility is lowered? Is it that he chose voluntarily to focus on application layer development?

This is like someone who built their own bicycle lamenting they can't do the same for a modern EV.

[-] groucho@lemmy.sdf.org 12 points 1 day ago

I wonder what the venn diagram of people that started coding as kids and people that enjoy vibe coding looks like. Informally, the degens on my squad that started on their parents' computer loathe AI, and the people that stumbled into it in college are all about the vibe code.

[-] Buddahriffic@lemmy.world 2 points 14 hours ago

I'm one of those that fits in both categories. I've been blown away by what these AI agents are capable of. I've "written" a bunch of scripts that involve parsing and generating code for another tool to consume, and it's been able to take over the tedious parts, like writing a function to parse the parameters out of this code, then following the code it goes into, extracting the relationships between the parameters, and recreating them another way. It's something I could write the code for, but that code would be mostly undocumented, would contain "quick version that I'll come back later and fix up" hacks (which I never get to, because if it works, there are more productive things to do), plus some debug code that I'm not sure I'll need again, kept around so I can uncomment it instead of writing it again. Not to mention all the typos and sloppy errors along the way that may or may not be easy to find later during compilation and testing.
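
To make that concrete, the kind of tedious extraction work described above can look roughly like this Python sketch using the standard `ast` module (the sample functions and all names are invented for illustration):

```python
import ast

# Invented sample input: source code whose function parameters we want to pull out.
source = """
def resize(width, height, keep_aspect=True):
    pass

def render(scene, camera):
    pass
"""

def extract_params(src):
    """Map each function name to the names of its parameters."""
    tree = ast.parse(src)
    return {
        node.name: [arg.arg for arg in node.args.args]
        for node in ast.walk(tree)
        if isinstance(node, ast.FunctionDef)
    }

print(extract_params(source))
# {'resize': ['width', 'height', 'keep_aspect'], 'render': ['scene', 'camera']}
```

Boilerplate like this is exactly what the commenter means by "tedious parts": mechanical, easy to specify, and easy to verify once written.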

I consider myself a competent coder. AI makes me better, more focused and less sloppy. But that said, my prompts reflect that. I understand that these models aren't really programmers but just correlation engines that have been trained on a ton of programming material. It can tell you the traveling salesman problem is NP-hard, but it won't necessarily realize that the problem you've asked it to solve is equivalent to the traveling salesman problem. It will happily spit out an identical function to one it did before, just with name differences specific to the current thing it is doing, rather than just calling the same function. It will pick the least efficient way to do some things. It's not a problem solver, it's a solution predictor, which sounds better but isn't.

So I consider them more like force multipliers than adders. If you have the skills, I believe you could use an LLM to make anything (as a development cycle, not "spits out a perfect implementation first try"), but if you don't have the skills, you'll struggle a lot even on fairly basic shit, simply because you don't know how to direct the LLM properly.

But I still watch it produce code with a mixture of awe and fear. I don't think the above will be true forever. Maybe not even for the rest of the 20s.

[-] zod000@lemmy.dbzer0.com 4 points 1 day ago

Probably two distinct circles that don't touch if I base it on people I know and myself.

[-] entwine@programming.dev 9 points 1 day ago

I had a realization recently. All the pro-AI people pushing vibe coding or "coding assistants" are completely missing the point.

These tools aren't helping you write code, you are helping the tool write code, because it can't do it on its own yet. The more they improve, the less you're needed.

Idk if they'll ever reach the point where you can actually give it a prompt and it'll provide a fully functional implementation on its own with no human intervention required. If it does, I can't imagine that tech would be as available as it is now. Your peasant ass isn't going to be vibing the next big thing, that's for sure.

[-] codeinabox@programming.dev 112 points 2 days ago

This quote on the abstraction tower really stood out for me:

I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they “didn’t really know what was going on anymore.” And I thought: mate, you were already so far up the abstraction chain you didn’t even realise you were teetering on top of a wobbly Jenga tower.

They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.

But sure. AI is the moment they lost track of what’s happening.

The abstraction ship sailed decades ago. We just didn’t notice because each layer arrived gradually enough that we could pretend we still understood the whole stack. AI is just the layer that made the pretence impossible to maintain.

[-] Feyd@programming.dev 58 points 2 days ago

LLMs don't add an abstraction layer. You can't competently produce software without understanding what they're outputting.

[-] chicken@lemmy.dbzer0.com 29 points 2 days ago

The author's point is that people already don't understand what the programs they write do, because of all the layered abstraction. That's still true whether or not you want to object to the semantics of calling the use of LLMs an abstraction layer.

[-] Feyd@programming.dev 33 points 2 days ago

Not knowing what CPU instructions your code compiles to and not understanding the code you are compiling are completely different things. This is yet another article talking up the (not real) capability of LLM coding assistants, though in a more roundabout way. In fact, this garbage blogspam should go on the AI coding community that was made specifically because the subscribers of the programming community didn't want it here, yet we keep getting these trying to skirt the line.

[-] codeinabox@programming.dev 14 points 2 days ago

In fact, this garbage blogspam should go on the AI coding community that was made specifically because the subscribers of the programming community didn't want it here.

This article may mention AI coding but I made a very considered decision to post it in here because the primary focus is the author's relationship to programming, and hence worth sharing with the wider programming community.

Considering how many people have voted this up, I would take that as a sign I posted it in the appropriate community. If you don't feel this post is appropriate in this community, I'm happy to discuss that.

[-] idunnololz@lemmy.world 12 points 2 days ago* (last edited 2 days ago)

I've had this problem with abstractions for the longest time. Of course whenever I say anything negative about abstractions I just get dog piled so I don't usually like to discuss the topic.

I think abstractions as a tool is fine. My problem with abstractions is that most developers I meet seem to only talk about the upsides of abstractions and they never take into account the downsides seriously.

More often than not, I think people treat abstractions as this magical tool you can't overuse. In reality, overuse of abstractions can increase complexity and reduce readability. They can greatly reduce the number of assumptions you can make about code, which has many additional downsides.

Of course, I'm not saying we shouldn't use abstractions. Not having any abstractions can be just as bad as having too many. You end up with similar issues, such as increased complexity and reduced readability.

The hard part is finding the balance, the sweet spot where complexity is minimized and readability is maximized while using the fewest abstractions possible.

I think too often developers err on the side of caution, add more abstractions than necessary, and call it good enough. Developers really need to question whether every abstraction is absolutely necessary. Is it really worth adding an additional layer of abstraction just because a problem might arise in the future, versus reducing the number of abstractions and waiting for it to become a problem before adding more? I don't think we do the latter enough. Often you can get away with slightly fewer abstractions than you think you need, because you will never touch the code again.
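
A deliberately exaggerated Python sketch of the trade-off (all names invented): the abstracted version buys flexibility nobody asked for, at the cost of indirections the reader must chase.

```python
# Over-abstracted: a strategy hierarchy and a factory just to add two numbers.
class Operation:
    def apply(self, a, b):
        raise NotImplementedError

class Addition(Operation):
    def apply(self, a, b):
        return a + b

class OperationFactory:
    def create(self, name):
        return {"add": Addition}[name]()

abstracted = OperationFactory().create("add").apply(2, 3)

# Direct: same behaviour, one line to read, and the reader can safely
# assume nothing ever swaps the implementation out from under them.
def add(a, b):
    return a + b

assert abstracted == add(2, 3) == 5
```

The abstracted version only pays off if a second `Operation` ever actually arrives; until then, every reader pays the indirection tax.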

[-] queerlilhayseed@piefed.blahaj.zone 7 points 2 days ago* (last edited 2 days ago)

I think the design of interfaces is a difficult and subtle art. You have to have a very refined understanding of both of the layers being interfaced, so you can understand where the boundary should go and what shape it should have so concepts don't get split across layers. You also need to have a keen understanding of how the humans using the interface will behave in the future, which is really hard and often impossible. I think that's why interfaces tend to evolve over time along with the tech, because assumptions built into them were either incorrect, or became incorrect (or just confusing) as the technical landscape shifts around them.

Speaking of shifting landscapes, I think one of the fundamental practices of engineering is prototyping: building a thing that does what you think you want, even if it's janky or unscalable or has an insane cyclomatic complexity or w/e. Sometimes building the janky version can lead to insights into how an improved version can be made; insights that would be very difficult or impossible to predict if one tried to build the perfect version on the first go.

This causes some problems in corporate development, because the chance to learn from a model and iterate on it directly is so rare. The vast majority of the time (IME), as soon as the janky version fulfills the client's list, it moves into production and the improvements are, if not written off entirely, put on the backlog of tasks that will never be more important than building out the next new thing. It's still possible to iterate ideas in future new projects, it happens all the time, but it's different than building one thing and refining it in an iterative development cycle over a long term.

I feel like they kind of lost the thread here, even though I think I agree with the idea that vibe coding is a fundamentally different thing than another layer of abstraction. There's no need to punch down on the web developers. We've all known, for the last several decades at least, that we don't have to understand the entire mechanism completely. No one is out there doping their own silicon and writing JS apps to run on it. The whole point of layered abstractions is that you can build on a set of assumptions without having to know all the implementation details of the layers below. When an abstraction leaks, it can be useful to have some facility with the lower levels, but no person alive is a master of the full stack. The beautiful thing about abstractions is that you don't have to be. That's still true with vibe coding, just with the extra complexity of having a ticker tape spitting out semi-coherent code faster than any human could type it. That moves the dev from a creative role to more of an administrative one, as they mention earlier in the piece, which 1) is not nearly as fun, and crucially 2) doesn't help you build the muscles that make one good at code administration.

[-] sobchak@programming.dev 6 points 1 day ago

Just don't use AI coding tools then? Studies show they make people less productive anyways.

[-] bridgeenjoyer@sh.itjust.works 5 points 1 day ago

I wish I could have started then! I'm not really interested in modern coding, it doesn't seem as interesting... I kind of want to read the whole C64 manual and try programming on it, but I guess there isn't much point to that nowadays.

[-] Valmond@lemmy.dbzer0.com 10 points 1 day ago

Of course there is, it is fun!

What else do you think the "point" of living is?

[-] lurker2718@lemmings.world 5 points 1 day ago

You could also try microcontroller programming. There you have control over everything, and I guess it should be similar to programming in the early days. I started around ten years ago with Arduino, and also directly programming them, and it still interests me. And with a little hardware you can do some interesting projects. I would even say it could be relevant for a career if you want that.

[-] groucho@lemmy.sdf.org 2 points 1 day ago

This guy disagrees: https://www.youtube.com/watch?v=Ws4twUyt-MY&pp=ygUNYzY0IDkgc3ByaXRlcw%3D%3D

There's a thriving C64 scene today. I'm more in the gameboy/DMG side of things but videos like that make me want to check it out further.

[-] TheGiantKorean@lemmy.today 21 points 2 days ago

Man. I so feel this. I'm 51 and started programming when I was 10. It's not anything like it used to be. I miss those days.

[-] sping@lemmy.sdf.org 29 points 2 days ago

It's so much better! Tooling is many orders of magnitude better and so many libraries give you deep power from an easy API. What used to be a team and 18 months is a library install and a day so you're free to do much bigger things.

Christ even version control. The shit I put up with over the years.

[-] Sxan@piefed.zip 4 points 1 day ago

It depends greatly on what you value.

Some changes I really appreciate. Computing would be so much more limited with fixed memory. However, what we lost is also significant. I used to program in C on an Apple ][, and while I appreciated the higher-level language, I also intimately understood the underlying machine. I had the memory layout memorized, because it was memorizable. I could draw pictures by poking values directly into memory, using only a piece of paper and pencil to do the math, if necessary. I knew the ASM opcodes and could fairly easily read and understand the assembly the compiler was producing. There was a vast amount of satisfaction in having such a deep understanding of the entire machine. For the most part, we've lost that.

And I willingly discarded it! I loved Unix, and in a Windows-dominated world I saw Java as a way I could work in software without being forced to use Windows. And now I use Go. Abstractions on abstractions.

Maybe if ReactOS on RISCV becomes a reality I'll feel systems will be comprehensible to me from bottom to top again. RISC always made more sense to me because it hides less complexity; microkernels make more sense to me because the kernels are small, understandable, and unpolluted.

Some complexity and abstraction is necessary. I don't believe any modern general-purpose computing system can practically be as deeply comprehensible to a 15-year-old as an Apple ][ was. But to OP's point, the industry went overboard long ago, and sacrificed too much for quick, short-term gains.

IMHO

[-] sping@lemmy.sdf.org 2 points 1 day ago

I gained a lot of understanding noodling with extremely low-level memory access and the like, but in reality almost all the coding I did early on was in C with the stdlib, which is shaped by the low-level realities of the CPU yet is still full of abstractions. Abstractions that were often opaque to us as well, because this was before Linux and ubiquitous open source.

Sure everything is a few more layers removed from the simple hardware these days, but once it's a black box, it's a black box. A lot of the feeling of being closer to the hardware is pretty meaningless.

Sure, a variable in C is really just a way of referring to a piece of memory, while in Python it's some sort of data structure in a mapping most of us don't really know the exact nature of, but in the end the difference is rarely of any significance, and most of us have only a similarly vague idea of how the compiler works it out for us in C.
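
A quick CPython illustration of that gap (exact sizes are implementation details of 64-bit CPython, not guarantees):

```python
import sys

# In C, `int x = 1;` names roughly 4 bytes of raw memory.
# In CPython, `x = 1` binds the name x in a namespace mapping to a
# heap-allocated object carrying a type pointer and a reference count.
x = 1
print(sys.getsizeof(x))  # object overhead: far more than 4 bytes

# Assignment copies no bytes; it just makes another name refer to the
# same object.
y = x
print(y is x)  # True
```

And yet, as the comment says, for most day-to-day code none of this difference matters.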

[-] Markaos@discuss.tchncs.de 2 points 1 day ago

Maybe if ReactOS on RISCV becomes a reality

Do you mean RedoxOS by chance? AFAIK ReactOS is a clean room implementation of Windows/NT

You can still program on those platforms, if you want to.

[-] NostraDavid@programming.dev 7 points 2 days ago

Creative constraints bred creativity.

That might explain why there's so much crap coming out of the gaming industry. All the old constraints are gone, so everything now very much looks the same.

Just give yourself artificial constraints.

[-] Skullgrid@lemmy.world 8 points 1 day ago

That might explain why there’s so much crap coming out of the gaming industry.

Not really; it's more of an economic situation than a tech one. Indie games are doing just fine, more or less (the choose-1-of-3 fad is a pain).

[-] entwine@programming.dev 3 points 1 day ago

The modern game industry was being run by pedophile billionaires, two of the worst adjectives you can apply to a human being. I'd say that's more of a factor than not having enough "creative constraints"

[-] Feyd@programming.dev 14 points 2 days ago

I say that knowing how often those words have been wrong throughout history.

Yup

Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that. It’s not a new platform or a new language or a new paradigm. It’s a shift in what it means to be good at this.

A swing and a miss

[-] jimmy90@lemmy.world 4 points 2 days ago

high level code generating tools have come and mostly gone

we will see if this one is good if it works and we can maintain the code it makes

simple

[-] Skullgrid@lemmy.world 4 points 1 day ago* (last edited 1 day ago)

high level code generating tools have come and mostly gone

he's talking about languages that don't touch bare metal, not WYSIWYG editors

EDIT: WYSIWYG stuff continues to live, fucking Salesforce

this post was submitted on 10 Feb 2026
205 points (94.4% liked)

Programming (programming.dev)