“Assume it’s a map, treat it like a map, and then catch the type error if it’s not.” Paraphrased from actual advice by Guido on how you should write Python. Python isn’t a bad language, but the philosophy that comes along with it is so fucked.
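For contrast, here’s a minimal sketch of the check-first alternative, in Go (my example, not Guido’s): a comma-ok type assertion makes “is this actually a map?” an explicit question instead of an exception to catch.

```go
package main

import "fmt"

// describe checks the dynamic type up front instead of assuming
// it and catching a failure later (check-first, not EAFP).
func describe(v any) string {
	if m, ok := v.(map[string]int); ok {
		return fmt.Sprintf("a map with %d entries", len(m))
	}
	return "not a map"
}

func main() {
	fmt.Println(describe(map[string]int{"a": 1, "b": 2})) // a map with 2 entries
	fmt.Println(describe("hello"))                        // not a map
}
```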
The point is that AI stands for “artificial intelligence” and these systems are not intelligent. You can argue that “AI” has come to mean something else, and that’s a fair argument. But LLMs are nothing but a shitload of vector data and matrix math. They are no more intelligent than an insect is. I don’t particularly care about the term “AI”, but I will die on the “LLMs are not intelligent” hill.
I am aware of that, but Java is the most popular language that runs on the JVM. I don’t specifically dislike the other JVM languages, though one of my issues with Java is type erasure, and that’s partly a limitation of the JVM itself.
Better to not have version control!? Dear god I hope I never work on anything with you.
You’re also a programming language design nerd? Like, “compare the features of language A to those of language B”, or nerding out about the underlying mechanics of things like generic types, virtual method dispatch, and pauseless garbage collection? I thought I was the only one. Well, not the only one, but it doesn’t seem to be that popular a thing to nerd out over.
Maybe these days. That definitely was not true when I was growing up, or even a decade ago.
GitLab. You can use their SaaS offering (gitlab.com) or run the open source version on your own server(s).
Most experienced developers already agree with you.
If he’s overworked, he needs to talk to his manager, or whoever the work is coming from, and tell them they need to slow it down.
I understand the principles, how branch prediction works, and why optimizing to help out the predictor can help. My question is more: how often does that actually matter to the average developer? Unless you're a developer on numpy, gonum, cryptography, digital signal processing, etc., how often do you have a hot loop that can be optimized with branchless programming techniques? I think my career has been pretty average in terms of the projects I've worked on, and I can't think of a single time I've been in that situation.
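For concreteness, this is the kind of technique I mean. A hypothetical Go sketch of a branchless min: arithmetic on the sign bit replaces the conditional, so there's nothing for the predictor to mispredict.

```go
package main

import "fmt"

// branchlessMin computes min(a, b) with no conditional jump.
// diff >> 31 smears the sign bit across the word: all ones (-1)
// when a < b, all zeros otherwise, so the mask keeps either
// diff or 0. Caveat: a - b can overflow for extreme inputs,
// which a real implementation would have to account for.
func branchlessMin(a, b int32) int32 {
	diff := a - b
	return b + (diff & (diff >> 31))
}

func main() {
	fmt.Println(branchlessMin(3, 7)) // 3
	fmt.Println(branchlessMin(7, 3)) // 3
}
```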
I'm also generally aggravated at which skills the software industry thinks are important. I would not be surprised to hear about branchless programming questions showing up in interviews, but those skills (and algorithm design in general) are irrelevant to 99% of development and 99% of developers. The skills that actually matter, in my experience, are problem solving, debugging, reading code, and soft skills. And being able to write code, of course, but that almost seems secondary.
I've been working primarily in Go for the past five years, including on some extremely complex projects, and I have never once wished I had dependency injection. It has been wonderful. I have used dependency injection before - I worked on a C# project for years, and that used DI - but I adore Go's simplicity and I never want to use anything else (except JS for UI, via Electron or Wails for desktop).
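For anyone wondering what "no DI framework" looks like in practice, here's a minimal sketch (hypothetical names): dependencies are plain interfaces, constructors take them explicitly, and tests hand in fakes. That's the whole "container".

```go
package main

import (
	"errors"
	"fmt"
)

// Store is the dependency, expressed as a plain interface.
type Store interface {
	Get(key string) (string, error)
}

// Service declares exactly what it needs; nothing is injected by magic.
type Service struct {
	store Store
}

// NewService takes its dependency explicitly. Production code
// passes a real Store; tests pass an in-memory fake.
func NewService(s Store) *Service {
	return &Service{store: s}
}

func (s *Service) Greet(key string) string {
	name, err := s.store.Get(key)
	if err != nil {
		return "hello, stranger"
	}
	return "hello, " + name
}

// fakeStore is the kind of test double a DI container is usually
// sold as enabling; in Go it's just another value you pass in.
type fakeStore struct{ data map[string]string }

func (f fakeStore) Get(key string) (string, error) {
	v, ok := f.data[key]
	if !ok {
		return "", errors.New("not found")
	}
	return v, nil
}

func main() {
	svc := NewService(fakeStore{data: map[string]string{"u1": "Ada"}})
	fmt.Println(svc.Greet("u1")) // hello, Ada
}
```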
And relying on runtime validation is a horrific way to write production code.
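Or, sketched in Go: push the validation to compile time wherever the language lets you, so the bad call never ships at all.

```go
package main

import "fmt"

// countEntries demands a map at compile time; passing anything
// else is a build error, not a runtime TypeError to catch.
func countEntries(m map[string]int) int {
	return len(m)
}

func main() {
	fmt.Println(countEntries(map[string]int{"a": 1, "b": 2})) // 2
	// countEntries("not a map") // refuses to compile
}
```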