[-] philm@programming.dev 2 points 11 months ago

Nah, it's not: code is modular (IME it should be roughly tree-structured), while a book is linear.

So in your analogy, the API would be the synopsis. And I never said there shouldn't be any comments. E.g. doc comments above functions, explaining the use cases and showing examples, are good practice.
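
A hypothetical Rust example of the kind of doc comment I mean (function name and behavior invented for illustration):

```rust
/// Clamps `value` into the inclusive range `[min, max]`.
///
/// # Examples
///
/// ```
/// assert_eq!(clamp(5, 0, 3), 3);
/// assert_eq!(clamp(-1, 0, 3), 0);
/// ```
pub fn clamp(value: i32, min: i32, max: i32) -> i32 {
    value.max(min).min(max)
}
```

The comment documents intent and usage, not what each line does; the example doubles as a doctest.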

[-] philm@programming.dev 2 points 1 year ago

Behold, Rust is blazingly fast in approaching the most popular language :)

Also, in comparison to Python, you can do pretty much everything throughout the stack with it, which would be my reason to go for Rust (not to mention all the other niceties it has to offer).

Also, learning Rust nowadays is much more approachable than, say, 7+ years back (the ecosystem got way better).

[-] philm@programming.dev 2 points 1 year ago* (last edited 1 year ago)

I don't think it's that bad yet, compared with npm. Usually the dependencies I use are of very high quality. But I'm also very selective with dependencies: I'd rather write a simple part myself than use a barely maintained, low-quality dependency...

Btw. I haven't looked into the Hare language yet (will do that now), but if it's similar to Deno, I won't like it. You want some kind of package management IME...

[-] philm@programming.dev 2 points 1 year ago

Not without a super fancy type system that has yet to be invented. I think the key issue is cyclic data structures (e.g. a doubly-linked list). The language somehow needs strong/weak pointers, and determining them automatically is a very complex research question...
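
In Rust today the programmer has to pick the strong/weak pointers by hand; a sketch of the usual `Rc`/`Weak` pattern for a parent/child cycle (names are illustrative):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// A tree node: strong pointers down to children, a weak pointer back up,
// so the parent <-> child cycle doesn't leak memory.
pub struct Node {
    pub value: i32,
    pub parent: RefCell<Weak<Node>>,
    pub children: RefCell<Vec<Rc<Node>>>,
}

pub fn leaf(value: i32) -> Rc<Node> {
    Rc::new(Node {
        value,
        parent: RefCell::new(Weak::new()),
        children: RefCell::new(vec![]),
    })
}

pub fn link(parent: &Rc<Node>, child: &Rc<Node>) {
    // The back-edge is weak: it can be upgraded while the parent lives,
    // but it doesn't keep the parent alive.
    *child.parent.borrow_mut() = Rc::downgrade(parent);
    parent.children.borrow_mut().push(Rc::clone(child));
}
```

Inferring which edge should be the weak one automatically is exactly the hard part.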

[-] philm@programming.dev 2 points 1 year ago

Yeah, lemmy-ui doesn't escape/sanitize properly...

[-] philm@programming.dev 2 points 1 year ago

OCI images and CI/CD to build the image.

Actually, I just had a similar issue at work: I fought a bit with the traditional Docker pipeline, and then discovered this: https://mitchellh.com/writing/nix-with-dockerfiles. It not only solved my problem much faster, but is also more efficient, since only the actual dependencies of the package end up in the image (and it can be really reproducible). So you can combine the best pieces of each technology (Docker for sandboxing/containerizing, Nix for packaging and configuration).
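
Roughly the pattern from that article, sketched as a multi-stage Dockerfile (the flake output name and binary path are assumptions; adapt to your project):

```dockerfile
# Build stage: use Nix inside Docker to build the flake's default package.
FROM nixos/nix:latest AS builder
WORKDIR /src
COPY . .
RUN nix --extra-experimental-features "nix-command flakes" build .#default
# Collect the package's runtime closure so only real dependencies survive.
RUN mkdir /tmp/nix-store-closure && \
    cp -R $(nix-store -qR result/) /tmp/nix-store-closure

# Final stage: just the closure and the built output, nothing else.
FROM scratch
COPY --from=builder /tmp/nix-store-closure /nix/store
COPY --from=builder /src/result /app
ENTRYPOINT ["/app/bin/myapp"]
```

The final image contains exactly the closure Nix computed, which is why it comes out smaller and reproducible.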

Btw. Nix is rapidly growing (mostly since flakes), so I think a slow shift towards Nix is already happening.

But I agree, migrating traditional dotfiles to Nix + home-manager takes time. I did it incrementally (I used activation scripts to link directly to the old config files, and slowly converted the old config to Nix).
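
A minimal sketch of that incremental approach in a home-manager module (assuming a dotfiles checkout at `~/dotfiles`; `mkOutOfStoreSymlink` points outside the Nix store so the old file stays editable in place):

```nix
{ config, ... }:
{
  # Not-yet-migrated configs stay plain symlinks into the old repo...
  home.file.".config/nvim".source =
    config.lib.file.mkOutOfStoreSymlink
      "${config.home.homeDirectory}/dotfiles/nvim";

  # ...while already-converted pieces are managed declaratively.
  programs.git = {
    enable = true;
    userName = "yourname"; # placeholder
  };
}
```

Each time you convert a config, you delete its symlink entry and add the declarative equivalent.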

[-] philm@programming.dev 2 points 1 year ago

Interesting.

I have settled with NixOS+home-manager. It got quite a bit better over the last 3 1/2 years.

After learning and understanding all the quirks of NixOS, I've got a super nicely configured system that just gets better over time, not comparable to any other distro I know of (they kinda degrade over time IME). I really like the way I can compose all my configuration via functions in modules and reuse the relevant parts on the different servers and desktops I use (via flakes), always with the same up-to-date configuration.

[-] philm@programming.dev 2 points 1 year ago

I'm mostly using GPT-4 via ChatGPT, because I don't use VSCode (I use Helix), and as far as I could tell from colleagues, the current Copilot (X) is not helpful at all...

I describe the problem (context etc.), maybe paste some code, and hope that it gets what I mean. When it doesn't (which seems to be rather often), I try to supply the context it's missing, but it still fails very often, unless the code is rather simple (i.e. boilerplaty). But even when I want GPT-4 to generate a bunch of boilerplate, it inserts something like // repeat this 20 times in between the code it should actually generate. Even if I tell it multiple times to emit the exact code, it fails pretty much every time, also with the increased context size via the API, where it should actually be able to do it in one go. The gpt-4-0314 model (via the API) seems to be a bit better here.

I'm absolutely interested in where this leads, and I'm the first to monitor all the changes, but right now it slows me down rather than really helping me. Copilot may be interesting in the future, but right now it's dumb as fu... I'm not writing boilerplaty code; it's rather complex stuff, and it fails catastrophically there, and I don't see that changing in the near future. GPT-4 got dumber over the course of the last half year; it was certainly better at the beginning. I can remember being rather impressed by it, but now, meh...

It's good for natural-language stuff though, but not really for novel, creative stuff in code (I'm doing most stuff in Rust btw.).

But GPT-5 will be interesting. I doubt I'll really profit from it for code-related stuff (maybe GPT-6 then, or so), but we'll see... All the other developments in that space are also quite interesting. So once it's actually viable to train or constrain your own LLM on your own bigger codebase, so that it really gets the details and gives actually helpful suggestions (e.g. something like the recent CodeLlama release), this stuff may become more interesting for actual coding.

I'm not even letting it generate comments (e.g. above functions), because currently they're kinda like this (figuratively; more fancy but wordy, and not really helpful):

// this variable is of type int
let a = 8;

[-] philm@programming.dev 2 points 1 year ago

I just default to recursive descent parsers (with Pratt parsing): simple, efficient, great error messages, and almighty (CFGs). For quick prototyping I currently really like https://github.com/zesterer/chumsky (Pratt parsing was just added; need to try that out again).

But writing a parser generator is certainly an interesting academic exercise.
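
A toy sketch of the Pratt-parsing core I mean (hand-written; token names made up, and a real parser would build an AST and report errors instead of evaluating and panicking):

```rust
// A minimal Pratt parser/evaluator for +, * and parentheses over integers.
#[derive(Debug, Clone, PartialEq)]
pub enum Token { Num(i64), Plus, Star, LParen, RParen }

// Left/right binding powers encode precedence and left-associativity.
fn binding_power(tok: &Token) -> Option<(u8, u8)> {
    match tok {
        Token::Plus => Some((1, 2)), // low precedence
        Token::Star => Some((3, 4)), // binds tighter than +
        _ => None,
    }
}

pub fn eval(tokens: &[Token]) -> i64 {
    let mut pos = 0;
    expr(tokens, &mut pos, 0)
}

fn expr(tokens: &[Token], pos: &mut usize, min_bp: u8) -> i64 {
    // Parse a primary expression: number or parenthesized subexpression.
    let mut lhs = match tokens[*pos].clone() {
        Token::Num(n) => { *pos += 1; n }
        Token::LParen => {
            *pos += 1;
            let v = expr(tokens, pos, 0);
            assert_eq!(tokens[*pos], Token::RParen, "expected ')'");
            *pos += 1;
            v
        }
        t => panic!("unexpected token {:?}", t),
    };
    // Climb operators while their left binding power beats `min_bp`.
    while *pos < tokens.len() {
        let op = tokens[*pos].clone();
        let Some((l_bp, r_bp)) = binding_power(&op) else { break };
        if l_bp < min_bp { break; }
        *pos += 1;
        let rhs = expr(tokens, pos, r_bp);
        lhs = match op {
            Token::Plus => lhs + rhs,
            Token::Star => lhs * rhs,
            _ => unreachable!(),
        };
    }
    lhs
}
```

The whole precedence table is the `binding_power` function, which is why this style stays simple as the grammar grows.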

[-] philm@programming.dev 2 points 1 year ago

You might want to check out NixOS (or home-manager, if you don't want a cold deep dive into a rabbit hole).

[-] philm@programming.dev 2 points 1 year ago

I agree that having a consistent process and good engineers is definitely most important, but a language itself can definitely guide you in the right direction. Ironically, I think Rust and C++ are good examples of the two extremes (unrelated to their target area, which happens to be the same: systems programming). C++ has a zillion ways to program in it, and finding the right and best way is no easy task that requires massive experience in all kinds of paradigms, while Rust generally pushes you to do things in one, the "right" (IMHO), way; otherwise the borrow checker annoys you all the time.
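
A tiny illustration of that guidance (example mine): the borrow checker flatly rejects the aliased-mutation pattern that C++ would happily compile:

```rust
// The borrow checker enforces "aliasing XOR mutation": while a shared
// borrow is live, the Vec cannot be mutated through another path.
pub fn first_after_bump(scores: &mut Vec<i32>) -> i32 {
    if let Some(first) = scores.first_mut() {
        *first += 1; // exclusive mutable borrow, ends here
    }
    let first = &scores[0]; // shared borrow
    // scores.push(40); // would not compile: error[E0502],
    //                  // cannot borrow `scores` as mutable while `first` lives
    *first
}
```

That rejected `push` is exactly the kind of iterator/reference invalidation bug C++ leaves to convention and experience.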

[-] philm@programming.dev 2 points 1 year ago

I consider myself a relatively experienced Rust programmer (5+ years experience, 2-3 years professional).

I think it excels in most aspects except advanced "type meta-programming" and currying. But even this may be a good thing, because higher-kinded types and lazy evaluation are often either mental overhead or lead to unpredictable performance (looking at you, Haskell), so the programmer isn't tempted to get too fancy when the issue at hand is much simpler. Also, the syntax is often a bit boilerplaty compared to something like Haskell, but this too could be seen as a good thing, as everything is "documented" within it; IMHO it's easier to read that way (as long as nothing is hidden behind a crazy (proc-)macro, which can still be a good thing but should be used carefully).

Nonetheless, I think it could improve with more advanced type meta-programming (and inference), because you often have something like 5 lines of trait bounds that are not really interesting for the end user and should IMHO be inferred (and communicated by the compiler otherwise). But after reading a few articles by compiler devs (like https://smallcultfollowing.com/babysteps/blog/2022/09/22/rust-2024-the-year-of-everywhere/) I think this will indeed improve soon, and it already has improved considerably over the time I've been programming in Rust.
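
The kind of bound noise I mean (function invented for illustration; the `where` clause spells out constraints the body arguably already implies):

```rust
use std::fmt::Debug;
use std::ops::Add;

// Sums an iterator of addable values, logging each one; note how much of
// the signature is trait-bound plumbing rather than actual logic.
pub fn sum_logged<I, T>(items: I) -> T
where
    I: IntoIterator<Item = T>,
    T: Add<Output = T> + Default + Copy + Debug,
{
    let mut acc = T::default();
    for item in items {
        println!("adding {:?}", item);
        acc = acc + item;
    }
    acc
}
```

Four trait bounds for a five-line body; with better inference the compiler could derive most of this from the usage.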

