[-] Kissaki@programming.dev 1 points 2 hours ago

there have been physical fights between committee members

lol; committee with consensus by violence?

[-] Kissaki@programming.dev 2 points 1 day ago* (last edited 1 day ago)

Doesn't help that it's a multi-page document

Persistent domain entity, Proto-persistent domain entity, View model, …

What the heck… Yeah, I wouldn't want to use that either. While it may be a formalization, it seems like it would significantly increase complexity and overhead. That can't be worth it unless it's a huge enterprise system that has to work with generalized object types across teams or something.

I hadn't heard of Restful Objects before.

[-] Kissaki@programming.dev 1 points 1 day ago* (last edited 1 day ago)

I didn't quite follow.

They're using htmx, making errors, and learning something new about using it?

That's like using any new tech though, right? Or - depending on the devs - happens even with established tech.

I've never seen htmx in production. I find it interesting though and want to explore using it. That won't be at work though. :)

[-] Kissaki@programming.dev 1 points 1 day ago

I found the dropping of actions quite surprising as well. I would suspect we could return the links with a disabled attribute if they should be displayed but not accessible/triggerable?

[-] Kissaki@programming.dev 1 points 1 day ago

We could have called them HTTP APIs.

[-] Kissaki@programming.dev 1 points 1 day ago

They're not demanding anything. They're describing how the current meaning of REST is nothing like the original one.

They're making a point for not splitting application state and logic into client and server with shared knowledge. If you make that a premise, of course their argumentation won't fit. They're describing an alternative architecture and approach, not an alternative protocol for the current common web application architectures.

[-] Kissaki@programming.dev 1 points 1 day ago

No, it doesn't mean only humans can interact with it.

The key point [of classical REST] is that responses are self-contained and self-describing. Requesting a resource gives you a response that tells you what actions you can take on it. There is no need for application domain knowledge, whether shared implicitly or separately and explicitly.

Some HTTP web APIs offer links in their JSON responses, for example previous and next page links for paging/sectioning/cursors, or links to other resources. I don't think I've ever seen possible resource actions/operations included though, which is what the original REST would demand.

That's how I understood it anyway.

Their suggestion of using HTML rather than JSON is mainly driven by their htmx approach, which the project and website are about. Throughout the article though, they always leave open which data format is actually used. In your quoted text they say "for example". In a later example, they show how JSON with hyperlinks could look. (But then you need knowledge about that generalized meta structure.)
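
For illustration, a minimal sketch of what such a self-describing JSON response might look like - the shape and field names here are made up, not taken from the article:

```js
// Hypothetical self-describing response: the data plus the actions you may take next.
const accountResponse = {
  accountNumber: 12345,
  balance: { currency: "USD", value: 100.0 },
  links: [
    { rel: "deposit", href: "/accounts/12345/deposits", method: "POST" },
    { rel: "withdraw", href: "/accounts/12345/withdrawals", method: "POST" },
    { rel: "close", href: "/accounts/12345", method: "DELETE" },
  ],
};

// A generic client can list or follow the available actions without
// domain-specific knowledge, as long as it understands this link structure.
console.log(accountResponse.links.map((l) => `${l.method} ${l.href}`));
```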

[-] Kissaki@programming.dev 3 points 2 days ago

I interpreted it as a wording issue, with "without downloading them" referring to working locally - which matches "(client side)" as well.

[-] Kissaki@programming.dev 5 points 2 days ago

without downloading them (client side)

if they're client side they must be downloaded

[-] Kissaki@programming.dev 6 points 3 days ago

Presentation/lecture: bad software quality due to software stack complexity, with increasing separation of layers and participants.

SoC (System on a Chip) hardware for embedded/smaller use cases is very common and successful.

Suggests "Direct Coding" with direct hardware access as a possible alternative approach to PC hardware interfacing. Implementing that is more about commitment than difficulty. Depends more on hardware producers than software developers. A lack of drivers could give a fairer playing field between manufacturers.

[-] Kissaki@programming.dev 5 points 3 days ago* (last edited 3 days ago)

Seems like a Ruby issue and suggested improvement? Using keyword arguments does feel like introducing a type of typing.

In C# I use records for simple, naturally behaving types. I can define explicit and implicit cast operators, so I have the choice between requiring explicit casts or not (depending on whether requiring them makes sense or is unnecessary). I can use var to define a variable without specifying a type, and the type is deduced from what it gets assigned - but it is still that specific type and gives me type safety.

In Rust, as far as I understand anyway, traits define shared behavior. In Go, interface implementations are implicit rather than explicit. With these, there's even less need for the kind of elaborate explicit typing the post argues about and gives an example of.


In general, I've never had considerable effort or annoyance implementing or using typing. And I know what it's good for: explicitness, and in consequence predictability, certainty, increased maintainability, and reduced issues and confusion. If following references or refactoring becomes unpredictable or high effort, it'd be quite annoying.

When I'm coding JavaScript, adding JSDoc so the type information gets passed along is quite cumbersome. Without it, the IDE does not give IntelliSense/auto-completion or argument type matching. JavaScript is better with it, and I consider it worth it with IDE support, but it remains cumbersome. (I try to avoid TypeScript compiler/tooling overhead.)
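
For illustration, a minimal JSDoc-annotated function (plain JavaScript, no TypeScript tooling; the order shape is a made-up example):

```js
/**
 * Sum the line totals of an order.
 * @param {{ items: { price: number, quantity: number }[] }} order - hypothetical order shape
 * @returns {number} the total price
 */
function orderTotal(order) {
  return order.items.reduce((sum, item) => sum + item.price * item.quantity, 0);
}

// With the annotations in place, editors like VS Code offer completion on
// order.items and flag wrong argument types, while the file stays plain JavaScript.
```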

A programming language can offer extensive type deduction while still using strong typing. With appropriate conversions in place, it will only report conflicts where it was intended to.


I'm thinking of situations where I enjoyed a dynamic nature, which I certainly have. But I don't think that's a matter of typing; it's a matter of how the programming language surfaces typing. If in PHP or JS I make a change, hit F5, and get an error, that's not any better than the IDE already showing it beforehand. And for the most part, I can program the same way with or without typing.

Man, this became a long text.

8
Announcing .NET 9 - .NET Blog (devblogs.microsoft.com)
submitted 1 week ago* (last edited 1 week ago) by Kissaki@programming.dev to c/programming@beehaw.org
21
Announcing .NET 9 - .NET Blog (devblogs.microsoft.com)
submitted 1 week ago* (last edited 1 week ago) by Kissaki@programming.dev to c/programming@programming.dev
90
submitted 1 month ago* (last edited 1 month ago) by Kissaki@programming.dev to c/programming@programming.dev

Today, we’re thrilled to announce Deno 2, which includes:

  • Backwards compatibility with Node.js and npm, allowing you to run existing Node applications seamlessly
  • Native support for package.json and node_modules
  • Package management with new deno install, deno add, and deno remove commands
  • A stabilized standard library
  • Support for private npm registries
  • Workspaces and monorepo support
  • Long Term Support (LTS) releases
  • JSR: a modern registry for sharing JavaScript libraries across runtimes

We are also continually improving many existing Deno features:

  • deno fmt can now format HTML, CSS, and YAML
  • deno lint now has Node specific rules and quick fixes
  • deno test now supports running tests written using node:test
  • deno task can now run package.json scripts
  • deno doc’s HTML output has improved design and better search
  • deno compile now supports code signing and icons on Windows
  • deno serve can run HTTP servers across multiple cores, in parallel
  • deno init can now scaffold libraries or servers
  • deno jupyter now supports outputting images, graphs, and HTML
  • deno bench supports critical sections for more precise measurements
  • deno coverage can now output reports in HTML

Deno is a single binary for the TypeScript and JavaScript ecosystems. Deno is secure by default (installed npm libs do not automatically get full system permissions/access).

The new standard library stabilizes a vetted collection of safe modules, instead of you having to search for and install random libraries for basic or common use cases, with [or without] your own security assessment.

Deno compile compiles the TS/JS project into a single binary.

The backwards compatibility with npm and npm/JS frameworks enables using Deno in existing projects and with existing libs, bringing the benefits of Deno and a path to incremental migration.
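
As a small taste (my own sketch, not from the announcement): a hello-world HTTP server that runs on Deno without any config.

```js
// server.js - run with: deno run --allow-net server.js
// The --allow-net flag is needed because Deno is secure by default
// and does not grant network access unless explicitly allowed.
Deno.serve((_req) => new Response("Hello from Deno 2"));
```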

The announcement video is worth watching. The intro is great.

195
submitted 1 month ago* (last edited 1 month ago) by Kissaki@programming.dev to c/programming@programming.dev

Every second Tuesday of October, Ada Lovelace Day is celebrated - to commemorate the famous English mathematician of the 19th century, and the first programmer in history.

To mark this occasion, we rounded up a party of games that are not only fun to play, but can teach you to think like a true engineer and feel like a tech wizard!

Welcome to Ada Lovelace Day Sale. Hello, world!

ends 14th (tomorrow)

337

researchers conducted experimental surveys with more than 1,000 adults in the U.S. to evaluate the relationship between AI disclosure and consumer behavior

The findings consistently showed products described as using artificial intelligence were less popular

“When AI is mentioned, it tends to lower emotional trust, which in turn decreases purchase intentions,”

29
19

Mapping C# array types to PostgreSQL array columns or other DBMS/DB JSON columns.

13

UI Components: Smart Paste, Smart TextArea, Smart ComboBox

Dependency: Azure Cloud

They show an interesting new kind of interactivity. (Not that I, personally, would ever use Azure Cloud for that though.)

113

There's a lot, and specifically a lot of machine learning talk and features in the 1.5 release of Opus - the free and open audio codec.

Audible and continuous (albeit jittery) talk on 90% packet loss is crazy.

The "WebRTC Integration" samples section has an example where you can test out the 90% packet loss audio.
