[-] Ashelyn@lemmy.blahaj.zone 4 points 1 day ago

You always think you remember how to center a div until you try to do it again after a few years
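
(For future me, the flexbox answer, here applied from TypeScript just for the sake of an example; "container" is a made-up element id:)

```typescript
// Centering a div the modern flexbox way, set from script for illustration.
const parent = document.getElementById("container"); // hypothetical id
if (parent) {
  parent.style.display = "flex";
  parent.style.justifyContent = "center"; // center horizontally
  parent.style.alignItems = "center";     // center vertically
}
```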

[-] ChairmanMeow@programming.dev 14 points 2 days ago

Meh, those are just the programmers that are remembered.

They did lots of dumb shit too. Mario 64 was a super-innovative game at the time with its free 3D platforming. There's also tons of weird code in there, and the developers also fucked up by shipping a debug build of the game, costing a not insignificant amount of performance.

[-] silasmariner@programming.dev 3 points 2 days ago

Is that true about the debug build? I had it on the N64 way back when and don't remember it being especially laggy. OTOH I was young, and relatively shit at computer games

[-] ChairmanMeow@programming.dev 6 points 1 day ago

Yup, they shipped a debug build. Here's a video that shows the build side-by-side with one that was compiled with compiler optimizations: https://youtu.be/9_gdOKSTaxM

It was quite laggy in certain areas; the submarine in particular sank the framerate quite considerably.

[-] silasmariner@programming.dev 2 points 1 day ago* (last edited 1 day ago)

Thanks for the video. Yeah, OK, actually ~~Clanker's Cavern~~ (no wait, that was Banjo-Kazooie, I meant the one with all the sea serpents) was hella laggy, you've brought it all back

[-] Piemanding@sh.itjust.works 1 point 1 day ago

I heard somewhere more recently that they probably did it on purpose, because they didn't know if the game would be stable with the optimized settings. Nintendo was known for quality back then, so if the game crashed even a little more often, they thought it would hurt their bottom line.

[-] ChairmanMeow@programming.dev 1 point 1 day ago

True, there were several programming mistakes that caused undefined behaviour. The compiler warns about most of these though, so they could easily have been fixed.

The debug build "masked" these issues, so to speak (even if not fully; the game could still crash). But decompiling the game let modders fix them fairly easily, after which it could be recompiled with the proper optimizations.

[-] admin@sh.itjust.works 34 points 2 days ago

Programmers of the past didn't have to work 2 jobs to be able to buy a house and live comfortably, let alone spend most of their paycheck on inflated prices for gas, food, services, etc. But hey, we got AI and Amazon Prime now.

[-] madame_gaymes@programming.dev 22 points 2 days ago

Not to mention the sense of pioneering something. Like I imagine calculating the Moon landing is something you're happy to spend 80+ hours a week on, especially when basic needs are taken care of.

[-] admin@sh.itjust.works 14 points 2 days ago

> Not to mention the sense of pioneering something.

> something you're happy to spend 80+ hours a week on, especially when basic needs are taken care of.

Vs. writing the same thing that already exists, with a different front end, a bunch of times with different examples, because somebody who has more resources (not just more cash, but also time) decides visuals are more important than real functionality.

Part of the reason I don't like 'coding' or developing software is that it's so dreadful, and it feels overly stupid when an open-source alternative works better than what you'd be able to make before the deadline.

I'd just rather be a CTO / SysAdmin and live happier.

[-] simontherockjohnson@lemmy.ml 7 points 2 days ago* (last edited 2 days ago)

Margaret Hamilton's first job out of undergrad was working for Lorenz. By the time of Apollo she was incredibly accomplished, with several stints in top labs. It's not like opportunities for trailblazing software fell out of the sky onto shlubs who barely passed undergrad data structures and algorithms courses.

[-] admin@sh.itjust.works 9 points 2 days ago

> shlubs who barely passed undergrad data structures and algorithms courses.

And that's the problem with most people getting into IT nowadays: they expect to go to an algorithms course or a development bootcamp and come out knowing everything they need to make a six-figure salary, but they don't even try to learn what a software dependency is or how to fix their dev environment, and expect GPT to shlub it up for them. In reality, many of these old-school software programmers were self-taught nerds who were just trying to solve (a) problem, and spent hours doing so.

[-] simontherockjohnson@lemmy.ml 8 points 2 days ago* (last edited 2 days ago)

> don’t even try to learn what a software dependency

Everyone at my company keeps using the term "dependency hell" to refer to what is literally just dependency management and order of operations with a modern package manager like NPM that tracks versions and dependencies for you.

They've literally never experienced working with dynamically linked libraries. They think it's so hard because they have to understand a tree that exists in data form (e.g. package-lock.json) and can be easily visualized, versus a tangled file system and LD_LIBRARY_PATH, or Windows' standard search order / HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\KnownDLLs.
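
The "tree in data form" point is easy to show. A minimal sketch that prints that tree from an npm v2/v3 lockfile (assuming Node with TypeScript; the lockfile layout is the standard one, everything else is illustrative):

```typescript
// Walk package-lock.json (npm lockfile v2/v3) and print each installed
// package with its declared dependencies: the "hell" is just this tree.
import { readFileSync } from "fs";

interface LockPackage {
  version?: string;
  dependencies?: Record<string, string>; // name -> semver range
}

const lock = JSON.parse(readFileSync("package-lock.json", "utf8"));
// v2/v3 lockfiles keep a flat map of installed paths under "packages";
// the "" key is the root project itself.
const packages: Record<string, LockPackage> = lock.packages ?? {};

for (const [path, pkg] of Object.entries(packages)) {
  if (path === "") continue;
  const name = path.replace(/^.*node_modules\//, "");
  const deps = Object.keys(pkg.dependencies ?? {});
  console.log(`${name}@${pkg.version ?? "?"}` + (deps.length ? ` -> ${deps.join(", ")}` : ""));
}
```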

It's pathetic.

[-] admin@sh.itjust.works 2 points 2 days ago* (last edited 2 days ago)

These guys are living in the glory days. I bet they don't even know that all the info they need is just in a fucking config file, in a damn manual somewhere, or in the stupid docs that people don't seem to bother reading anymore, let alone writing decent ones.

[-] HiddenLayer555@lemmy.ml 16 points 2 days ago* (last edited 2 days ago)

I get what this is saying but on the other hand...

Programmers now:

πŸ’ͺ Can spin up a minimum viable product in a day

πŸ’ͺ Write web applications that handle millions or even billions of requests per second

πŸ’ͺ Remote code execution and memory related vulnerabilities are rarer than ever now

πŸ’ͺ Can send data across the world with sub-second latency

πŸ’ͺ The same PCIe interface is now 32x faster (a 16x PCIe 1.0 link was 8 GB/s, while PCIe 6.0 is 256 GB/s)

πŸ’ͺ The same wireless bands now have more throughput due to better radio protocols and signal processing

πŸ’ͺ Write applications that scale across the 100+ cores of modern top-of-the-line processors

πŸ’ͺ JIT and garbage collection techniques have improved to the point where they have a nearly imperceptible performance impact in the majority of use cases

πŸ’ͺ Most bugs are caught by static analysis and testing frameworks before release

πŸ’ͺ Codebases are worked on by thousands of people at the same time

πŸ’ͺ Functional programming, which is arguably far less bug prone, is rapidly gaining traction as a paradigm

πŸ’ͺ Far more emphasis on immutability to the point where many languages have it as the default

πŸ’ͺ Virtual machines can be seamlessly transferred from one computer to another while they're running

πŸ’ͺ Modern applications can be used by people anywhere in the world regardless of language, including things that were very difficult in the past, like mirroring the entire interface so that an application written for left-to-right languages can support right-to-left ones

πŸ’ͺ Accessibility features allow people who are blind, paralyzed, or have other disabilities to use computers just as well as anyone else

Just wanted to provide some counterexamples, because I'm not a huge fan of the "programmers are worse than they were back in the 80s" rhetoric. While programmers today are more reliant on automated tools, I really disagree that programmers are less capable in general than they were in the past.

[-] yogthos@lemmy.ml 16 points 2 days ago

For sure, it's a lot easier to do a lot of stuff today than before, but the way we build software has become incredibly wasteful as well. Also worth noting that some of the workflows that were available in languages like CL or Smalltalk back in the 80s are superior to what most languages offer today. It hasn't been strictly progress in every regard.

I'd say the issue isn't that programmers are worse today, but that the trends in the industry select for things that work just well enough, and that's how we end up with stuff like Electron.

[-] HiddenLayer555@lemmy.ml 4 points 2 days ago

> Also worth noting that some of the workflows that were available in languages like CL or Smalltalk back in the 80s are superior to what most languages offer today.

In what ways? I don't have any experience with those so I'm curious.

[-] yogthos@lemmy.ml 10 points 2 days ago

Common Lisp and Smalltalk provided a live development environment where you could run any code as you wrote it, in the context of your application. Even the whole Lisp OS was modifiable at runtime: you could open the code for any running application, or even the OS itself, make changes on the fly, and see them reflected. A fun run through the Symbolics Lisp Machine here: https://www.youtube.com/watch?v=o4-YnLpLgtk

Here are some highlights.

The system was fully introspective and self-documenting. The entire OS and development environment was written in Lisp, allowing deep runtime inspection and modification. Every function, variable, or object could be inspected, traced, or redefined at runtime without restarting. Modern IDEs provide some introspection (e.g., via debuggers or REPLs), but not at the same pervasive level.

You had dynamic code editing & debugging. Functions could be redefined while running, even in the middle of execution (e.g., fixing a bug in a running server). You had the ability to attach "before," "after," or "around" hooks to any function dynamically.
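
You can fake a small fraction of that in today's dynamic languages. A toy TypeScript sketch of an "around" hook, with all names made up; the real thing worked on any live function without you having to rebind anything yourself:

```typescript
// A crude approximation of an "around" hook: wrap an existing function
// with code that runs before and after it, without editing its source.
type Fn<A extends unknown[], R> = (...args: A) => R;

function around<A extends unknown[], R>(
  original: Fn<A, R>,
  advice: (proceed: () => R, ...args: A) => R
): Fn<A, R> {
  return (...args: A) => advice(() => original(...args), ...args);
}

let handleRequest = (user: string) => `hello ${user}`;

// "Attach" logging around the live function by rebinding the name.
handleRequest = around(handleRequest, (proceed, user) => {
  console.log(`before: ${user}`);
  const result = proceed();
  console.log(`after: ${result}`);
  return result;
});

console.log(handleRequest("ada")); // logs before/after, then "hello ada"
```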

The condition system in CL provided advanced error handling, where restarts allowed interactive recovery from errors (far beyond modern exception handling).
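
If you've never seen restarts: the low-level code offers named recovery strategies, and the high-level caller picks one, instead of the stack unwinding. A toy approximation (invented names, nowhere near the real thing):

```typescript
// Toy model of a CL restart: the signaling site offers named recovery
// strategies; the caller chooses one, and execution continues at the
// error site instead of unwinding the stack like an exception would.
type Restarts = Record<string, () => number>;

function parseNumber(raw: string, restarts: Restarts): number {
  const n = Number(raw);
  if (!Number.isNaN(n)) return n;
  // "Signal" the condition: hand control to the use-value restart if
  // the caller provided one, then carry on with whatever it returns.
  const useValue = restarts["use-value"];
  if (useValue) return useValue();
  throw new Error(`cannot parse: ${raw}`);
}

// The high-level caller decides the recovery policy, not the parser.
const values = ["3", "oops", "7"].map(raw =>
  parseNumber(raw, { "use-value": () => 0 })
);
console.log(values); // [3, 0, 7]
```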

In the Dynamic Window System, UI elements were live Lisp objects that could be inspected and modified interactively. Objects could be inspected and edited in structured ways (e.g., modifying a list or hash table directly in the inspector). Modern IDEs lack this level of direct interactivity with live objects.

You had persistent image-based development, where the entire system state (including running programs, open files, and debug sessions) could be saved to an image and resumed later. This is similar to Smalltalk images, unlike modern IDEs where state is usually lost on restart.

You had knowledge-level documentation with the Document Examiner, a hypertext-like documentation system where every function, variable, or concept was richly cross-linked. The system could also generate documentation from source code and comments dynamically. Modern tools such as Doxygen are less integrated and interactive.

The Lisp Machine had an ephemeral GC that provided real-time garbage collection with minimal pauses, and its weak references and finalizers were more sophisticated than those in most modern GC implementations. Modern languages (e.g., Java, Go, C#) have good GC but lack the fine-grained control of the Lisp Machines.

Transparent remote procedure calls (RPC) allowed objects to interact seamlessly across machines as if they were local. Meanwhile, an NFS-like but Lisp-native file system allowed files to be accessed and edited remotely, with versioning.

Finally, compilers like Zeta-C could compile Lisp to efficient machine code with deep optimizations.

[-] Natanox@discuss.tchncs.de 6 points 2 days ago

No wonder there are some older developers who defend Lisp so passionately. Sounds like a dream to work with once you got the hang of it.

[-] yogthos@lemmy.ml 2 points 2 days ago

It's really impressive to think what was achieved with such limited hardware compared to today's standards. While languages like Clojure are rediscovering these concepts, it feels like we took a significant detour along the way.

I suspect this has historical roots. In the 1980s, Lisp was primarily used in universities and a small number of companies, due to the then-high hardware demands of features like garbage collection, which we now consider commonplace. Meanwhile, people who could afford personal computers were constrained by very basic hardware, making languages such as C or Fortran the practical choice. Consequently, the vast majority of developers lacked exposure to alternative paradigms. As these devs entered industry and academia, they naturally taught programming based on their own experiences, which is why the syntax and semantics of most mainstream languages can be traced back to C.

[-] HiddenLayer555@lemmy.ml 4 points 2 days ago

Interesting! Thank you!

[-] davel@lemmy.ml 3 points 2 days ago

I had access to a Symbolics machine back in the day, but I was too young & dumb to understand or appreciate what I had my hands on. Wasted opportunity πŸ˜”

[-] yogthos@lemmy.ml 2 points 2 days ago

It's like an artifact from an ancient and more advanced civilization. :)

[-] simontherockjohnson@lemmy.ml 2 points 2 days ago* (last edited 2 days ago)

I love Lisp; that's why I like doing industry work in JS, because it's very Lisp-like.

However, if you gave an average industry programmer Lisp today, they'd fuck up so much worse than the garbage enterprise-grade language code that exists today. I switch jobs probably every 4 years, and on average I teach 3 people a year what a closure is.
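
(For anyone in that boat, the whole idea fits in a few lines; a minimal sketch, with made-up names:)

```typescript
// A closure: the returned function keeps access to `count` even after
// makeCounter has returned, because it closes over that variable.
function makeCounter(): () => number {
  let count = 0;
  return () => ++count;
}

const next = makeCounter();
console.log(next()); // 1
console.log(next()); // 2 -- the state lives in the closure, not in a global
```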

Lisp has a lot of great solutions for a lot of real problems, but these people quite literally fix one bug and create 3. I had a 10+ YOE Tech Lead tell me the other day that they kinda just ignore the error output of the TS compiler and it made me want to tear my eyes out.

[-] NigelFrobisher@aussie.zone 7 points 2 days ago

I’m still trapped in vim 25 years on.

[-] moseschrute@lemmy.ml 3 points 2 days ago

Let’s just say hypothetically I vibe coded my vim config. Where would that put me?

[-] folaht@lemmy.ml 1 point 2 days ago

In the past I'd be forever stuck without Stackoverflow to help me.
I couldn't get out of vim without a miracle.
Pointers were so confusing, I'd go without them.
