The person who uses the shitty tool is a moron. The person who makes the shitty tool is an asshole. At least in this case where the shitty tool is actively promoting shitty PRs.
The con is that it’s not very powerful. I haven’t attempted to code on a gaming handheld, but I’ve had issues with a midrange laptop being underpowered. RAM is probably the biggest issue. My life improved noticeably when I upgraded my main machine to 64 GB, though granted, I was doing particularly heavy work. It really depends on what you’re doing. You could get away with it for some work, but it’s going to be painfully slow for other stuff.
AppArmor is part of the kernel. Why does it require patches?
I'm interpreting that as clickbait - just something they added to the title to drive traffic.
You don't have to be a full-stack dev for that to happen to you.
I find it very hard to believe that AI will ever get to the point of being able to solve novel problems without a fundamental change to the nature of "AI". LLMs are powerful, but ultimately they (and every other kind of "AI") are advanced pattern matching systems. Pattern matching is not capable of solving problems that haven't been solved before.
GitLab already has stellar CI/CD, far superior to GitHub Actions IMO.
The point is that Slack does not take advantage of Electron at all. It’s no better than running it in a browser.
The point isn't whether you use the GUI. The point is whether you are capable of doing your job without it. I'm not going to throw shade, but personally I hate being at someone else's mercy - such as when the GUI breaks and I am forced to wait for someone else to fix it. One reason I stay away from the JavaScript browser/Electron ecosystem is that there are so many opaque, inscrutable tools (namely bundlers and module resolvers); I have no freaking clue how they work under the hood, and they're virtually impossible to debug.
Makes sense. The most programming I've ever done for a GPU was a few simple shaders for a toy project.
If you want your code to run on the GPU, the viability of your code depends entirely on it.
Because of the performance improvements from vectorization, and the fact that GPUs are particularly well suited to that? Or are GPUs particularly bad at branches?
It is only one of the many micro-optimization techniques you can use to shave a few nanoseconds off an inner loop.
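For anyone who hasn't seen the technique, the idea is just to replace a conditional with arithmetic so there's no jump for the hardware to predict. A toy CPU-side sketch in C (not actual GPU code, and in practice a compiler will often emit a conditional move for you anyway):

```c
#include <stdio.h>

/* Branchy version: the hardware has to predict which way the if goes. */
int max_branchy(int a, int b) {
    if (a > b)
        return a;
    return b;
}

/* Branchless version: (a > b) evaluates to 0 or 1, so the result is
   selected with arithmetic instead of a jump. */
int max_branchless(int a, int b) {
    return a * (a > b) + b * (a <= b);
}

int main(void) {
    printf("%d %d\n", max_branchy(3, 7), max_branchless(3, 7)); /* 7 7 */
    printf("%d %d\n", max_branchy(9, 2), max_branchless(9, 2)); /* 9 9 */
    return 0;
}
```

On a GPU the payoff is bigger than on a CPU, because the threads in a warp execute in lockstep: when a branch diverges, both sides get executed and some lanes just sit idle.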
How often do a few nanoseconds in the inner loop matter?
The thing to keep in mind is that there is no such thing as an “average developer”. Computing is way too diverse for it.
Looking at all the software out there, the vast majority of it is games, apps, and websites. Applications where performance is critical, such as control systems, operating systems, databases, numerical analysis, etc., are relatively rare by comparison. So statistically speaking, the majority of developers must be working on games, apps, and websites (which is what I mean by an "average developer"). In my experience working on apps, there are exceedingly few times where micro-optimizations matter (things like assembly and/or branchless programming, as opposed to macro-optimizations such as avoiding unnecessary looping/nesting/etc.).
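To make the distinction concrete, the kind of macro-optimization I mean looks like this (a toy C sketch, function names are just for illustration):

```c
#include <string.h>

/* Accidentally O(n^2): strlen() rescans the whole string on every iteration. */
size_t count_spaces_slow(const char *s) {
    size_t count = 0;
    for (size_t i = 0; i < strlen(s); i++)
        if (s[i] == ' ')
            count++;
    return count;
}

/* O(n): hoist the length computation out of the loop. No assembly or
   branchless tricks, just not repeating work. */
size_t count_spaces_fast(const char *s) {
    size_t count = 0;
    size_t len = strlen(s);
    for (size_t i = 0; i < len; i++)
        if (s[i] == ' ')
            count++;
    return count;
}
```

Fixes like that come up constantly in app code; counting cycles in the generated assembly almost never does.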
Edit: I can imagine it might matter a lot more for games, such as in shaders or physics calculations. I've never worked on a game so my knowledge of that kind of work is rather lacking.
That line is blurring to the point where it barely exists any more. Compiled languages are becoming increasingly dynamic (e.g. JIT compilation, code generation at runtime) and interpreted languages are getting compiled. JavaScript is a great example: V8 JIT-compiles hot functions into optimized machine code at runtime.
IMO the only definition of “real” programming language that makes any sense is a (Turing complete) language you can realistically build production systems with. Anything else is pointlessly pedantic or gatekeeping.