[-] FizzyOrange@programming.dev 16 points 5 months ago

Yeah, there are pros and cons. Desktop apps are not sandboxed. Mobile apps are often missing features and are annoying to install. Websites often have poor performance or janky UX on mobile, you need to be online, and you have no control over their availability.

I think the best option depends on what the thing is. Ordering food from a random pub? Website. Video editing? App.

[-] FizzyOrange@programming.dev 16 points 5 months ago

Also note that the drop in Chrome OS mirrors the rise in Linux, so I wouldn't rule out this being just user-agent changes.

[-] FizzyOrange@programming.dev 16 points 6 months ago* (last edited 6 months ago)

Presumably because Forgejo didn't have CI support until extremely recently. And because Jenkins is trash.

[-] FizzyOrange@programming.dev 16 points 8 months ago

Be thankful we got JavaScript. We might have had Tcl! 😱

> Interesting footnote: the founding of Netscape occurred at the same time I was deciding where to go in industry when I left Berkeley in 1994. Jim Clarke and Marc Andreessen approached me about the possibility of my joining Netscape as a founder, but I eventually decided against it (they hadn't yet decided to do Web stuff when I talked with them). This is one of the biggest "what if" moments of my career. If I had gone to Netscape, I think there's a good chance that Tcl would have become the browser language instead of JavaScript and the world would be a different place! However, in retrospect I'm not sure that Tcl would actually be a better language for the Web than JavaScript, so maybe the right thing happened.

Definitely dodged a bullet there. Although, on the other hand, if it had been Tcl there's pretty much zero chance people would have tolerated it the way they have JavaScript, so it might have been replaced with something better than both. Who knows...

[-] FizzyOrange@programming.dev 17 points 10 months ago

At the time it made way more sense, because the "traditional" smart watches were way worse. Not even one day of battery life. I would say Pebble still wins on size though - actually normal watch sized.

As for why they didn't catch on... Probably a little bit ahead of their time, and also less shiny.

[-] FizzyOrange@programming.dev 16 points 11 months ago

Yep, whenever they fix a bug it's added in a new flag that nobody knows about.

`git --enable-sane-behaviour`

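There's no such flag, of course, but the joke isn't far off: many of git's behaviour fixes really do ship as opt-in config keys. A few real examples (safe to set globally):

```shell
# Opt-in config keys that fix surprising git defaults:
git config --global push.default simple         # push only the current branch
git config --global push.autoSetupRemote true   # no more --set-upstream (git >= 2.37)
git config --global fetch.prune true            # drop refs to deleted remote branches
git config --global rerere.enabled true         # reuse recorded conflict resolutions
```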
[-] FizzyOrange@programming.dev 16 points 1 year ago

Where's the code that doesn't quote this properly? I'm guessing it's Bash.
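For what it's worth, the classic Bash version of the bug looks like this - a sketch, not the actual offending code:

```shell
# Minimal sketch of the classic Bash quoting bug: an unquoted expansion
# is word-split by the shell before the command sees it.
file="My Document.txt"

printf '<%s>\n' $file    # unquoted: split into two words
printf '<%s>\n' "$file"  # quoted: passed as a single word
```

The unquoted form is also glob-expanded, so a `*` in the value would silently match files in the current directory.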

[-] FizzyOrange@programming.dev 17 points 1 year ago

Totally depends on what you end up working on as a programmer. If it's web apps, you'll be totally fine - all you need is basic arithmetic. Writing a game engine? You'll need to know some basic-to-moderate matrix maths...

If you're doing formal verification using unbounded model checking... good fucking luck.

On average I would say most programming tasks need very little maths. If you can add and multiply you'll be fine. Definitely sounds like you'll be OK.

[-] FizzyOrange@programming.dev 16 points 1 year ago

I use this big expensive simulator called Questa, and if there's an error during the simulation it prints `Errors: 1, Warnings: 0` and then exits with `EXIT_SUCCESS` (0)! I tried to convince them that this is wrong but they're like "but it successfully simulated the error". 🤦🏻‍♂️

We end up parsing the output, which is very dumb but also seems to be standard practice in the silicon industry, unfortunately (hardware people are not very good at software engineering).
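A hedged sketch of that workaround - the `run_sim` stub and the exact summary format are assumptions based on the output quoted above, not a verified Questa interface:

```shell
# Wrap a tool that exits 0 even on failure: grep its log for the
# "Errors: N" summary line and synthesise a real exit status from it.

run_sim() {
  # Stand-in for the real simulator invocation (assumption for this sketch).
  printf 'Errors: 1, Warnings: 0\n'
}

log=$(run_sim)
# Pull the error count out of the summary line.
errors=$(printf '%s\n' "$log" | sed -n 's/.*Errors: \([0-9][0-9]*\).*/\1/p')

if [ "${errors:-0}" -gt 0 ]; then
  echo "simulation failed: $errors error(s)"
  status=1
else
  status=0
fi
```

In a real wrapper you'd end with `exit "$status"` so CI actually sees the failure.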

[-] FizzyOrange@programming.dev 16 points 2 years ago

Yeah I kind of agree but I also think when it gets to that point we'll have much bigger problems than programmers losing their jobs. Like, most of society losing their jobs.

[-] FizzyOrange@programming.dev 16 points 2 years ago

Yeah, I think it's trauma from C/C++'s awful warning system: you need a gazillion warnings to cover all the flaws in the language, but because there are a gazillion of them, and some are noisy and prone to false positives, it's extremely common to ignore them. Even worse, even the deadly no-brainer ones (e.g. not returning a value from a function that says it will) tend to be off by default, which means it's common to release code that triggers some warnings.

Finally, C/C++ doesn't have a good packaging story, so you'll pretty much always see warnings from third-party code in your compilations, leading you to ignore warnings even more.

Based on that, it's very easy to see why the Go people said "no warnings!". An unused variable should definitely be flagged, and with no warning level available they had no choice but to make it an error.

I think Rust has proven that it was the wrong decision though. When you have proper packaging support (as Go does), it's trivial to suppress warnings in third-party code, and so people don't ignore warnings. Also, a modern language doesn't need warnings for the mistakes the old languages made (like case fall-through or octal literals), because hopefully it didn't repeat them (or at least repeated fewer).

[-] FizzyOrange@programming.dev 17 points 2 years ago

Are there any videos of this sort of editing? Because honestly, every single person I've watched use Vim has just been like "oh wait, that's the wrong thing... hold on" constantly. You're going to say "they aren't competent", but that's kind of the point - approximately nobody is competent in Vim because it isn't worth learning.

Even so, I'd be interested if there are any videos of pros doing real editing (not "look what I can do") on YouTube. Anyone know of any?

