[-] FizzyOrange@programming.dev 7 points 1 week ago

Krita is quite far ahead of GIMP at this point. I'm not a pro Photoshop user but if you are and you're looking at alternatives, that's the place to look.

[-] FizzyOrange@programming.dev 7 points 1 week ago

I don't know why you're being downvoted. It literally starts with the word OPINION in bold red caps.

[-] FizzyOrange@programming.dev 7 points 3 weeks ago

They seem exactly the same to me: when a variable is assigned a value, it’s equal to that value now.

Yeah it's confusing because in maths they are the same and use the same symbol, but in programming they are 100% not the same, yet the same symbol got reused anyway. In fact they even used the mathematical equality symbol (=) for the thing that is least like equality (i.e. assignment).

To be fair not all languages made that mistake. There are a fair few where assignment is like

x := 20

Or

x <- 20

which is probably the most logical option because it really conveys the "store 20 in x" meaning.

Anyway, on to your actual question... They definitely aren't the same in programming. Probably the simplest way to think of it is that assignment is a command ("make these things equal!") and equality is a question ("are these things equal?").

So for example equality will never mutate its arguments. x == y will never change x or y because you're just asking "are they equal?". The value of that equality expression is a bool (true or false), so you can do something like:

a = (x == y)

x == y asks if they are equal and becomes a bool with the answer, and then the = stores that answer inside a.

In contrast = always mutates something. You can do this:

a = 3
a = 4
print(a)

And it will print 4. If you do this:

a = 3
a == 4
print(a)

It will (if the language doesn't complain at you for this mistake) print 3 because the == doesn't actually change a.
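
To make that concrete, here's the same thing as actual runnable Python (just an illustration, the variable names are arbitrary):

x = 3
y = 3
answer = (x == y)  # == asks "are they equal?" and produces a bool
print(answer)      # True
print(x, y)        # 3 3 - neither x nor y was changed by ==

a = 3
a == 4             # legal on its own line, but the False it produces is just thrown away
print(a)           # 3 - a is untouched, only = would change it

(Some languages and linters will warn about a bare x == y statement, precisely because it doesn't do anything.)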

[-] FizzyOrange@programming.dev 7 points 1 month ago

This is about Spectre, not about buggy hardware implementations.

Spectre is a fundamental flaw in speculative execution that means it can leak information, so it's a security vulnerability. Apparently Intel has been imposing draconian requirements on software to work around the issue rather than fixing it in hardware, which is obviously what they should do, but is not at all trivial.

[-] FizzyOrange@programming.dev 7 points 1 month ago

Kind of worthless video. Just speculates about how it works. They don't ever even show the app working!

These glasses aren't even AR, so overlaying details the way the mockups show is simply impossible.

[-] FizzyOrange@programming.dev 7 points 2 months ago

What language are your apps written in? Generally the best options are:

  1. Qt (C++) or PyQt (Python wrapper if you hate yourself). Old school desktop GUI. Works extremely well though.
  2. Web based, then you can pick from a gazillion frameworks; the most popular is React. You generally have a TypeScript-based frontend and a backend in whatever language you want. The downside is that you have to deal with the frontend/backend communication, which can be a pain.

There's also Flutter, which is pretty nice, but again you have to use Dart for the GUI, so if the rest of your app is in another language you'll have some friction.

But yeah, I would say the language you want to write your "business logic" in is the biggest factor in choosing. That, and whether you care about exposing your app over the web.
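
Just for a sense of scale, option 1 is not much code to get started with. Here's a minimal PyQt5 sketch (assuming PyQt5 is installed; the window title and button label are just placeholders):

import sys
from PyQt5.QtWidgets import QApplication, QWidget, QVBoxLayout, QPushButton

# One top-level window with a single button wired to a handler.
app = QApplication(sys.argv)
window = QWidget()
window.setWindowTitle("My app")  # placeholder title
layout = QVBoxLayout(window)
button = QPushButton("Do the thing")  # placeholder label
button.clicked.connect(lambda: print("business logic goes here"))  # signal -> slot
layout.addWidget(button)
window.show()
sys.exit(app.exec_())

Most of the real work ends up being the signal/slot wiring, which maps pretty naturally onto "call this function when the user does that".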

[-] FizzyOrange@programming.dev 7 points 2 months ago

That appears to not support comments. How they made that mistake after JSON is a mystery.

[-] FizzyOrange@programming.dev 7 points 2 months ago

What, you mean you can't easily tell what this is?

- foo:
  -
  - :
    - bar:
      baz: [
        - -
      ]

[-] FizzyOrange@programming.dev 7 points 2 months ago

Well, in fairness, imagine if Guido did become a racist, sexist arsehole. I don't think he should be immune.

But clearly this situation is not right.

[-] FizzyOrange@programming.dev 7 points 3 months ago

Would be cool to have more people on Linux finding and fixing these little details.

Unlikely to happen. This is very complicated low level stuff that's often completely undocumented. Often the hardware is buggy but it works with Windows/Mac because that's what it's been tested with, so you're not even implementing a spec, you're implementing Windows' implementation.

Also the few people that have the knowledge to do this a) don't want to spend a ton of money buying every model of monitor or whatever for testing, and b) don't want to spend all their time doing boring difficult debugging.

I actually speak from experience here. I wrote a semi-popular FOSS program for a type of peripheral. Actually it only supports devices from a single company, but... I have one now. It cost about £200. The other models are more expensive and I'm not going to spend like £3k buying all the other models so I can test it properly. The protocol is reverse engineered too, so... yeah, I'll probably break it for other people, sorry.

This sort of thing really only works commercially IMO. It's too expensive, boring and time consuming for the scratch-an-itch developers.

[-] FizzyOrange@programming.dev 7 points 3 months ago

OMG they finally noticed how bad the REPL is. Is it going to let you paste indented code now?

[-] FizzyOrange@programming.dev 7 points 7 months ago

Yeah I think that's what he meant. You don't want CI editing commits.

I use pre-commit for this. It's pretty decent. The major flaws I've found with it:

  • Each linter has to be in its own repo (for most linter types). So it's not really usable for project-specific lints.

  • Doesn't really work with e.g. pyright or pylint unless you have no third-party dependencies, because you need a venv set up with your dependencies installed, and pre-commit (fairly reasonably) doesn't take care of that.

Overall it's good, with some flaws, but there's nothing better available so you should definitely use it.
