[-] kogasa@programming.dev 4 points 3 months ago

ChatGPT uses auxiliary models to perform certain tasks like basic math and programming. Your explanation about plausibility is simply wrong.

[-] kogasa@programming.dev 4 points 4 months ago

You tell em Admiral

[-] kogasa@programming.dev 4 points 4 months ago

Nullable reference types are a (completely mandatory) band-aid fix, in my opinion as a .NET dev. You will encounter lots of edge cases where the compiler is unable to determine the nullability of an object, e.g. when a field is populated by dependency injection, or when libraries like MediatR introduce unusual control flow. You can suppress the warnings manually, at the slight risk of lying to the analyzer. Objects supplied by external library code may or may not be annotated, and if they are, the annotations may or may not be correct. The lack of rigorous compile-time null checking is occasionally an issue. That said, NRT makes nullability a significantly smaller problem in C# than it used to be.
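
A minimal sketch of the DI edge case I mean (the class and interface names are hypothetical):

```csharp
#nullable enable

public interface IWidgetRepository
{
    string GetName(int id);
}

public class WidgetService
{
    // Populated by a DI container or framework after construction, which the
    // compiler can't see, so it raises CS8618 ("Non-nullable property must
    // contain a non-null value when exiting constructor") unless suppressed.
    public IWidgetRepository Repository { get; set; } = null!; // null-forgiving: we promise DI fills this in

    // If that promise is ever broken, this still throws a NullReferenceException
    // at runtime -- the annotation only silences the static analysis.
    public string Describe(int id) => Repository.GetName(id);
}
```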

[-] kogasa@programming.dev 4 points 4 months ago* (last edited 4 months ago)

"Measure" is meant in the specific sense of measure theory. The prototypical example is the Lebesgue measure, which generalizes the intuitive definition of length, area, volume, etc. to N-dimensional space.

As a pseudo-definition, we may assume:

  1. The measure of a rectangle is its length times its width.

  2. The measure of the disjoint union of two sets is the sum of their measures.

In 2), we can relax the assumption that the two sets are disjoint slightly, as long as the overlap is small (e.g. two rectangles overlapping on an edge). This suggests a definition for the measure of any set: cover it with rectangles and sum their areas. For most sets, the cover will not be exact, i.e. some rectangles will lie partially outside the set, but these inexact covers can always be refined by subdividing the overhanging rectangles. The (Lebesgue) measure of a set is then defined as the greatest lower bound of all possible such approximations by rectangles.
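
In symbols, this pseudo-definition is the Lebesgue outer measure (written informally, with |R_k| denoting the area of the rectangle R_k):

```latex
\lambda^*(E) \;=\; \inf\left\{ \sum_{k=1}^{\infty} |R_k| \;:\; E \subseteq \bigcup_{k=1}^{\infty} R_k,\ R_k \text{ rectangles} \right\}
```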

There are two edge cases that quickly arise with this definition. One is the case of zero measure: naturally, a finite set of points has measure zero, since you can cover each point with a rectangle of arbitrarily small area, hence the greatest lower bound is 0. One can cover any countably infinite set with rectangles of area epsilon/n^2, so the sum can be made arbitrarily small too. Even less intuitively, an uncountably infinite set of points can have measure 0 too, e.g. the Cantor set.
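
For example, enumerating a countable set as x_1, x_2, x_3, ... and covering x_n with a rectangle of area epsilon/n^2 gives total area

```latex
\sum_{n=1}^{\infty} \frac{\varepsilon}{n^2} \;=\; \frac{\pi^2}{6}\,\varepsilon,
```

which can be made as small as you like by shrinking epsilon, so the greatest lower bound over all covers is 0.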

The other edge case is the non-measurable set. Above, I described an approximation process and took for granted that it assigns a sensible measure to every set. Indeed, it is hard to imagine otherwise, and that is precisely because under reasonably intuitive axioms (ZF + dependent choice) it is consistent to assume that every set is measurable. If you take the full axiom of choice, you can "construct" a counterexample, e.g. the Vitali set. The necessity of the axiom of choice in defining this set ensures that it is difficult to gain any geometric intuition about it. Suffice it to say that the set is both too "substantial" to have measure 0, yet too "fragmented" to have any positive measure, and is therefore not well behaved enough to have a measure at all.

[-] kogasa@programming.dev 4 points 8 months ago* (last edited 8 months ago)

78wpm 92% gboard

~200wpm on a physical desktop keyboard

[-] kogasa@programming.dev 4 points 10 months ago

No such thing has been "mathematically proven." Emergent behavior is the notable characteristic of ML models; the whole point is that their ability to do anything at all is emergent.

[-] kogasa@programming.dev 4 points 10 months ago

I'd rather have an explicit time zone any time a datetime is being passed around in code as a string. Communicating it to a human is relatively safe, since even if there's a mistake, it's directly visible. Before that last step, though, incorrect time zone parsing or implicit time zone assumptions in code written by "who knows" in the year "who knows" can be really annoying.
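
As a sketch of what I mean in C# (the values are made up), carrying the offset with the value removes the guesswork:

```csharp
using System;
using System.Globalization;

class TimestampDemo
{
    static void Main()
    {
        // Ambiguous: no offset, so downstream code has to guess the time zone.
        string ambiguous = "2024-05-01T14:30:00";

        // Explicit: the offset travels with the value.
        var stamp = new DateTimeOffset(2024, 5, 1, 14, 30, 0, TimeSpan.FromHours(-5));
        string explicitIso = stamp.ToString("o"); // e.g. "2024-05-01T14:30:00.0000000-05:00"

        // Round-trips with no implicit time zone assumption.
        var parsed = DateTimeOffset.Parse(explicitIso, CultureInfo.InvariantCulture,
            DateTimeStyles.RoundtripKind);

        Console.WriteLine($"{ambiguous} vs {parsed:o}");
    }
}
```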

[-] kogasa@programming.dev 4 points 1 year ago

Third category: software provided as part of an ancient service contract that nobody is allowed to touch, even though the service partner stopped offering support for this particular software years ago. Ask me how I know.

[-] kogasa@programming.dev 4 points 1 year ago* (last edited 1 year ago)

It's obviously not "just" luck. We know LLMs learn a variety of semantic models with varying degrees of correctness. It's just that no individual (inner) model is really that great, and most of them are bad. LLMs aren't reliable or predictable (enough) to serve as a source of information humans can trust, but they're not pure gibberish generators.

[-] kogasa@programming.dev 4 points 1 year ago

Immutable in this context usually means the root filesystem is read-only at runtime, and all changes are made by updating a set of declarative config files that describe the desired state of the system. Changes are prepared ahead of time and applied after a reboot. Something like that.

[-] kogasa@programming.dev 4 points 1 year ago

The RIF dev recently released an alpha of his Tildes app, Three Cheers for Tildes. It's in an earlier state than Sync, but already quite usable.

[-] kogasa@programming.dev 4 points 1 year ago

The thing that's getting smaller is the "complexity," or "distance from the trivial case," of the function invocation. This is an informal notion, though.
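
For instance (a deliberately trivial sketch, assuming the usual recursive setting): the argument itself shrinks toward the base case with every call, and that shrinking "distance" is what makes the recursion terminate.

```csharp
using System;

class Demo
{
    // Each call passes n - 1, so the "distance from the trivial case" (n == 0)
    // strictly decreases until the recursion bottoms out.
    static long Factorial(int n) => n == 0 ? 1 : n * Factorial(n - 1);

    static void Main() => Console.WriteLine(Factorial(5)); // 120
}
```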

