[-] lysdexic@programming.dev 8 points 9 months ago* (last edited 9 months ago)

Well, auto looks just like var in that regard.

It really isn't. Neither in C# nor in Java. They are just syntactic sugar to avoid redundant type specifications. I mean things like Foo foo = new Foo();. Who gets confused with that?

Why do you think IDEs are able to tell which type a variable is?

C# even takes it a step further and allows developers to omit the type in the constructor call via target-typed new expressions. No one is whining about dynamic typing just because the language lets you instantiate an object with Foo foo = new();.

[-] lysdexic@programming.dev 8 points 1 year ago

There are no hard and fast rules; it depends on what you use the build number for.

Making it a monotonically increasing number helps with versioning because it's trivial to figure out which version is newer. Nevertheless, you can also rely on semantic versioning for that. It's not like every project is like Windows 10, where half a dozen major releases all stayed pinned at 10.0.
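As a rough illustration of that point (nothing here is tied to any particular project or library), a monotonically increasing build number is a single integer comparison, while semantic versions have to be parsed into components before they compare correctly:

```python
# Illustrative sketch: comparing build numbers vs. semantic versions.

def newer_build(a: int, b: int) -> bool:
    # A monotonically increasing build number compares trivially.
    return a > b

def parse_semver(version: str) -> tuple[int, int, int]:
    # Minimal "major.minor.patch" parser; a real project would use a
    # dedicated library that also handles pre-release and build metadata.
    major, minor, patch = (int(part) for part in version.split("."))
    return major, minor, patch

assert newer_build(1042, 1041)
assert parse_semver("1.10.0") > parse_semver("1.9.3")  # tuple comparison works
assert not ("1.10.0" > "1.9.3")                        # naive string comparison gets it wrong
```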

You sound like you're focusing on the wrong problem. You first need to figure out what your versioning strategy is, and from there you need to figure out whether a build number plays any role in it.

[-] lysdexic@programming.dev 8 points 1 year ago

So how fucked am I for starting to learn cpp as my first language, or is this a later down the road thing to worry about?

I don't see why you should be concerned, except that no professional software developer is limited to using one specific programming language.

Even if you pay attention to the disaster prophets in the crowd, who are mainly fanboys implicitly and explicitly promoting their pet languages/frameworks, C++ dominates all aspects of the computing ecosystem, which means that, at the very least, the whole world needs to keep maintaining existing C++ projects for things to continue to work. See COBOL for reference.

[-] lysdexic@programming.dev 9 points 2 years ago

From the whole blog post, the thing that caught my eye was the side remark regarding SPAs vs MPAs. It's one of those things people don't tend to think about, but once someone touches on the subject the problem becomes obvious. It seems that modern JavaScript frameworks focus on SPAs and try to shoehorn the concept everywhere, even when it clearly does not fit. Things have reached a point where rewriting browser history to make an SPA look like an MPA is now a basic feature of many of these apps, and it rarely works well.

Perhaps it's too extreme to claim that MPAs are the future, but indeed there are a ton of webapps that are SPAs piling on complexity just to masquerade as MPAs.

[-] lysdexic@programming.dev 8 points 2 years ago

Perhaps I'm being dense and the coffee hasn't kicked in yet, but I fail to see the new computing paradigm mentioned in the title.

From their inception, computers have been used to plug in sensors, collect their values, and use them to compute stuff and things. For decades, every consumer-grade laptop has had adaptive active cooling, which means spinning up fans and throttling down CPUs when sensors report values over a threshold. One of the most basic aspects of programming is checking whether a memory allocation was successful, and otherwise handling the out-of-memory scenario. Updating application state when network connections go up or down is also a very basic feature. Concepts like retries, jitter, and exponential backoff have become basic features provided by dedicated modules. Docker has supported health checks from the start, which is basically an endpoint designed to be probed periodically. There are also canary tests to check whether services are reachable and usable.
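To make the retries/jitter/exponential-backoff point concrete, here's a minimal sketch of the kind of helper those dedicated modules provide (the function name and parameters are illustrative, not taken from any specific library):

```python
import random
import time

def retry_with_backoff(operation, max_attempts=5, base_delay=0.5, max_delay=30.0):
    """Call operation(), retrying on failure with exponential backoff and jitter."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            # Exponential backoff capped at max_delay, with full jitter so that
            # a fleet of clients doesn't retry in lockstep.
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(random.uniform(0, delay))
```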

These things have existed for decades. This stuff has been done in production software since the 90s.

Where's the novelty?

[-] lysdexic@programming.dev 9 points 2 years ago* (last edited 2 years ago)

you meant that the focus of the change wasn’t GH

They are dropping Mercurial and focusing on Git. Incidentally, they happen to host the Git repository on GitHub, but GitHub is used purely for hosting; they don't even use basic features such as pull requests.

Again, this is really not about GitHub at all.

[-] lysdexic@programming.dev 8 points 2 years ago

Github for organizations becomes rather expensive rather quickly (...)

I'm not sure if that's relevant. GitHub's free plan also supports GitHub organizations, and GitHub's Team plan costs only around $4/(developer*month). You can do the math to check how many developers you'd have to register in a GitHub Team plan to match the operational expense of hiring a person to manage a self-hosted instance from 9-to-5.
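As a back-of-the-envelope sketch (the admin cost below is a made-up placeholder, not a quote from anyone's payroll), the break-even point looks something like this:

```python
# Hypothetical numbers for illustration only.
team_plan_per_dev_month = 4.0   # approximate GitHub Team price per developer per month
admin_cost_per_month = 6000.0   # placeholder fully-loaded cost of a 9-to-5 admin

break_even_seats = admin_cost_per_month / team_plan_per_dev_month
print(f"Break-even at roughly {break_even_seats:.0f} developer seats")  # ~1500 seats
```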

[-] lysdexic@programming.dev 8 points 2 years ago* (last edited 2 years ago)

I think you’re missing the point. It’s exactly cause Microsoft created it that people get worried about it.

I don't think there is any merit to that concern. Not only is TypeScript FLOSS, but Microsoft also has an excellent track record of developing high-quality programming languages and tech stacks. Take C#, for example: it's been around for over two decades and, if anything, it keeps getting better with each release.

I understand the rationale behind the concern, but there is also a factor of mindlessly parroting cliches.

[-] lysdexic@programming.dev 9 points 2 years ago

You’re right that that’s extremely unambiguous, but I still don’t love the idea that users don’t get to decide what’s in $HOME, like, maybe we could call it “$STORAGE_FOR_RANDOM_BULLSHIT” instead?

That's basically what $HOME is used for in UNIX: a place for applications to store user-specific files, including user data and user files.

https://www.linfo.org/home_directory.html

If anything in computing conventions implies “user space” it’s a global variable named HOME. And it makes sense that there should be a $STORAGE_FOR_RANDOM_BULLSHIT location too - but maybe not the same place?

UNIX, and the Unix-like OSes that followed it, were designed as multi-user operating systems that support individual user accounts. Each user needs somewhere to store their data, and there's a convenient place for it: their $HOME directory. That's how things have been designed, and have worked, for close to half a century.

Some newer specs, such as Freedesktop's XDG Base Directory Specification, build upon the UNIX standard and Unix-like tradition, but the truth of the matter is that there aren't many reasons to break away from this practice.
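As a small sketch of how the two conventions typically combine in practice, an application checks the Freedesktop variables first and falls back to the spec's documented defaults under $HOME ("myapp" is just a placeholder name):

```python
import os
from pathlib import Path

home = Path.home()  # resolves to $HOME on Unix-like systems

# XDG Base Directory convention: honor the dedicated variables when set,
# otherwise fall back to the well-known locations under $HOME.
config_dir = Path(os.environ.get("XDG_CONFIG_HOME", home / ".config"))
data_dir = Path(os.environ.get("XDG_DATA_HOME", home / ".local" / "share"))

app_config = config_dir / "myapp" / "settings.toml"
app_data = data_dir / "myapp"
```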

[-] lysdexic@programming.dev 8 points 2 years ago

I was expecting a huge CI/CD pipeline with delivery steps automatically publishing packages for Debian Unstable and Debian Testing, but it turns out it's a relatively simple pipeline with straightforward build and testing stages.

Less is definitely more, and targeting Debian alone does greatly simplify a pipeline, but somehow I was expecting more automation taking care of the whole workflow.

[-] lysdexic@programming.dev 8 points 2 years ago

This was the first time I saw someone refer to Python's type hints as a performance tool. Up until now, I had only seen type hints referred to as a way to help static analysis tools verify that objects and invocations comply with their contracts.

I guess that having additional info at hand about how calls are expected to be made helps gather information to drive optimization steps, but PEP 484 is clear in stating that its goal is to help type checkers, and that code generation using type hints might be limited to some contexts.
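For reference, this is the kind of annotation PEP 484 describes; CPython ignores the hints at runtime, and a static checker such as mypy uses them to flag contract violations, while any performance use is a separate, tool-specific step:

```python
def scale(values: list[float], factor: float) -> list[float]:
    # CPython neither enforces nor optimizes on these hints at runtime;
    # a static checker uses them to reject calls that violate the contract.
    return [value * factor for value in values]

doubled = scale([1.0, 2.5], 2.0)   # fine
broken = scale("abc", 2)           # still runs (strings are iterable), but a type checker flags it
```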

This sounds like yet another example supporting the old law of interfaces, where all it takes for an interface to be abused is for it to exist.

[-] lysdexic@programming.dev 8 points 2 years ago

but we can agree on which of two implementations is shorter.

Shortness for the sake of being short sounds like optimizing for the wrong metric. Code needs to be easy to read, but it's more important that the code is easy to change and easy to test. Inlined code and hard-wired function calls are notorious for rendering code untestable, whereas introducing abstract classes and handles is a well-known technique for stubbing out dependencies.
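As a sketch of that trade-off (all names here are illustrative), the version that calls its dependency inline is shorter, while the slightly longer version depends on an abstraction that a test can stub out:

```python
import time
from abc import ABC, abstractmethod

class Clock(ABC):
    @abstractmethod
    def now(self) -> float: ...

class SystemClock(Clock):
    def now(self) -> float:
        return time.time()

class FakeClock(Clock):
    """Test double: returns whatever timestamp the test sets."""
    def __init__(self, fixed: float) -> None:
        self.fixed = fixed

    def now(self) -> float:
        return self.fixed

def is_expired(deadline: float, clock: Clock) -> bool:
    # Depending on the Clock abstraction, instead of calling time.time()
    # inline, is what makes this function trivially testable.
    return clock.now() > deadline

assert is_expired(deadline=100.0, clock=FakeClock(200.0))
assert not is_expired(deadline=100.0, clock=FakeClock(50.0))
```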
