[-] firelizzard@programming.dev 6 points 2 months ago

How can you tell?

[-] firelizzard@programming.dev 8 points 2 months ago

And relying on runtime validation is a horrific way to write production code

[-] firelizzard@programming.dev 6 points 6 months ago

Or use a statically typed language that’s actually modern instead of C

[-] firelizzard@programming.dev 7 points 7 months ago

> If the ask is, why was the hardware like that in the first place, the answer is because it can’t be fully validated.

But that's not the question. There are two questions: Who should be responsible for patching hardware vulnerabilities? And if the answer is "the kernel" then should speculative but never demonstrated vulnerabilities be patched? Linus' answer is the hardware manufacturer, and no.

> Is this really the hardware vendor’s problem though? It’s the consumer’s problem.

Maybe we're running into the ambiguity of language. If you mean to say, "Who does it cause a problem for? The consumer," then sure. On the other hand, what I mean, and what I think Linus means, is "Who's responsible for the vulnerability existing? Hardware vendors. Who should fix it? Hardware vendors."

> If the ask is why should a speculative fix go into the Kernel [...]

Depends on what you/we/they mean by "speculative". IMO, we need to do something (microcode, kernel patches, whatever) to patch Spectre and Meltdown. Those have been demonstrated to be real vulnerabilities, even if no one has exploited them yet. But "speculative" can mean something else. I'm not going to read all the LKML emails, so maybe they're talking about something else. But I've seen plenty of "Well, if X, Y, and Z happen, then that could be a vulnerability." For that kind of speculative vulnerability, one that has not been demonstrated to be real, I am sympathetic to Linus' position.

[-] firelizzard@programming.dev 6 points 8 months ago

How are you using it for data crunching? That's an honest question; based on my experiences with AI, I can't imagine how I'd use them to crunch data.

> So I always have to check its work to some degree.

That goes without saying. Every AI I've seen or heard of generates some level of garbage.

[-] firelizzard@programming.dev 7 points 8 months ago

The point is that AI stands for “artificial intelligence” and these systems are not intelligent. You can argue that AI has come to mean something else, and that’s a reasonable argument. But LLMs are nothing but a shitload of vector data and matrix math. They are no more intelligent than an insect is intelligent. I don’t particularly care about the term “AI” but I will die on the “LLMs are not intelligent” hill.
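
To put that concretely, here's a toy sketch in Go of the kind of operation these models spend essentially all of their time on; the matrix, the input, and the sizes are made up purely for illustration:

```go
package main

import "fmt"

// matVec computes y = W * x. Weight matrices like W are the "vector
// data"; inference is chaining multiplies like this across layers.
func matVec(W [][]float32, x []float32) []float32 {
	y := make([]float32, len(W))
	for i, row := range W {
		var sum float32
		for j, w := range row {
			sum += w * x[j]
		}
		y[i] = sum
	}
	return y
}

func main() {
	// Toy 2x3 weight matrix and a 3-element input vector.
	W := [][]float32{{1, 0, -1}, {0.5, 0.5, 0.5}}
	x := []float32{2, 3, 4}
	fmt.Println(matVec(W, x)) // [-2 4.5]
}
```

Chain enough of those together, with some nonlinearities and a softmax, and that's roughly the whole forward pass.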

[-] firelizzard@programming.dev 6 points 1 year ago

I think the word you want is minutiae?

[-] firelizzard@programming.dev 7 points 1 year ago

You’re also a programming language design nerd? Like, “Compare the features of language A to those of language B”, or nerding out about the underlying mechanics of things like generic types, virtual method dispatch, and no-stop garbage collection? I thought I was the only one. Well, not the only one, but it doesn’t seem that popular of a thing to nerd out over.

[-] firelizzard@programming.dev 6 points 1 year ago

I could use Google, but I’m looking for opinions, not just a list of which journals have that kind of content

[-] firelizzard@programming.dev 7 points 2 years ago

If he's overworked, he needs to talk to his manager, or whoever the work is coming from, and tell them they need to slow it down

[-] firelizzard@programming.dev 7 points 2 years ago

I understand the principles, how branch prediction works, and why optimizing to help out the predictor can help. My question is more of, how often does that actually matter to the average developer? Unless you're a developer on numpy, gonum, cryptography, digital signal processing, etc, how often do you have a hot loop that can be optimized with branchless programming techniques? I think my career has been pretty average in terms of the projects I've worked on and I can't think of a single time I've been in that situation.
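
For reference, the textbook example of what I mean by a branchless technique, sketched in Go with arbitrary numbers; it only pays off in a hot loop where the branch predictor can't guess the sign:

```go
package main

import "fmt"

// absBranchy is the obvious version. It's fine when the predictor
// guesses right; it stalls when the sign is unpredictable.
func absBranchy(x int32) int32 {
	if x < 0 {
		return -x
	}
	return x
}

// absBranchless trades the branch for arithmetic: mask is all 1s when
// x is negative and all 0s otherwise, so (x ^ mask) - mask negates
// negative values and leaves non-negative ones untouched.
func absBranchless(x int32) int32 {
	mask := x >> 31
	return (x ^ mask) - mask
}

func main() {
	for _, x := range []int32{5, -7, 0} {
		fmt.Println(absBranchy(x), absBranchless(x)) // identical results
	}
}
```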

I'm also generally aggravated at what skills the software industry thinks are important. I would not be surprised to hear about branchless programming questions showing up in interviews, but those skills (and algorithm design in general) are irrelevant to 99% of development and 99% of developers in my experience. The skills that actually matter (in my experience) are problem solving, debugging, reading code, and soft skills. And being able to write code of course, but that almost seems secondary.

[-] firelizzard@programming.dev 7 points 2 years ago

I've been working primarily in Go for the past five years, including some extremely complex projects, and I have never once wished I had dependency injection. It has been wonderful. I have used dependency injection - previously I worked on a C# project for years, and that used DI - but I adore Go's simplicity and I never want to use anything else (except for JS for UI, via Electron or Wails for desktop).
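
For what it's worth, the Go style I'm describing is just explicit construction: dependencies get passed in by hand, no container or framework involved. A rough sketch with hypothetical Store and Server types:

```go
package main

import "fmt"

// Hypothetical types, purely to illustrate the wiring.
type Store struct{ dsn string }

func NewStore(dsn string) *Store { return &Store{dsn: dsn} }

type Server struct{ store *Store }

func NewServer(store *Store) *Server { return &Server{store: store} }

func (s *Server) Handle() { fmt.Println("querying", s.store.dsn) }

func main() {
	// The wiring happens right here, in plain code: no container,
	// no reflection, no framework configuration to debug.
	store := NewStore("postgres://localhost/app")
	srv := NewServer(store)
	srv.Handle()
}
```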
