What storage expense? AppImages are actually the smallest, thanks to their compression.

I'm saying that Flatpaks use more storage for reliability, and that AppImages are less reliable because they rely on system dependencies in some circumstances.

But usually the issue is that you're missing a lib, not that the app itself is less reliable.

This is why AppImages are less reliable. Flatpaks either work for everybody, or they don't work at all. AppImages might not work if you're on a "weird distro" or forgot to install something on your system.

And yuzu's support channel on their Discord was full of people having issues with the Flatpak that were magically fixed the moment they tried the AppImage, due to that issue with Mesa being outdated in the Flatpak.

Packaging your software with Flatpak does not mean you won't have issues. But when you do have issues, you know they'll be an issue for everybody. So when you fix it, you also fix it for everybody.

For example, the RetroArch package was using an old version of the Freedesktop Platform, which comes with an old version of Mesa. When they bumped the version (just changing it from 22.08 to 23.08), the problem was fixed: https://discourse.flathub.org/t/problems-with-mesa-drivers/5574/3
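A runtime bump like that is a one-line change in the app's Flatpak manifest. Here's a hypothetical sketch to illustrate; the app ID, command, and module are placeholders, not RetroArch's actual manifest:

```yaml
# Hypothetical Flatpak manifest excerpt (app-id and module are illustrative)
app-id: org.example.Emulator
runtime: org.freedesktop.Platform
# Bumping this from '22.08' to '23.08' pulls in the newer runtime,
# including a newer Mesa, for every user on the next update.
runtime-version: '23.08'
sdk: org.freedesktop.Sdk
command: emulator
modules:
  - name: emulator
    buildsystem: simple
    build-commands:
      - install -Dm755 emulator /app/bin/emulator
```

Because every user gets the same runtime, the fix lands for everyone at once, which is the point I was making above.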

In this case, many of these dependencies are required for a lot of games to work properly in Wine. Dosbox is used as an emulation tool. I don't know of another package manager that doesn't give you an option to install all of the optional dependencies.

That seems like the wrong place to link to. Shouldn't you be linking to Sealed Sender?

This is good to know. I'm more into rolling releases like Arch, Fedora, and openSUSE anyway, so the latest Ubuntu's packages tend to be a bit old for me.

The main package I was thinking of was the kernel. I saw the recent Linux Experiment video by Nick, and they were using a kernel version (6.1?) that was no longer supported and not an LTS.

This was a use case I was introduced to directly before I discovered Firefox was introducing support for HEVC decoding.

I use HEVC because it has significantly better compression than older codecs, and many modern devices have hardware decoding support for HEVC.

If it weren't for iOS, VP9 could take its place, or so the Mozilla developers thought. HEVC and newer codecs like VP9, AV1, VVC, EVC, etc. offer better compression but often at the cost of compute. I imagine hardware decoding evens the scales a lot; I haven't done any benchmarking myself. I don't know how much impact the complexity of H.265 vs H.264 has on battery life, if any. Of course, hardware encoding on VP9 is not really a thing (AV1 is ahead of it, even), so HEVC has the edge there.

In a few years, AV1 hardware implementations will hopefully be ubiquitous; that would solve the efficiency and software patent problems at the same time. It'll probably coincide with the last of H.264's patents expiring. So I can understand why Mozilla is in no rush to support HEVC.

So I imagine you use a Chromium-based browser for Jellyfin?

I had some problems with alt-tabbing in really old versions of Warcraft 3 (1.27 and older).

Have you tried Gamescope? I experienced similar issues with a lot of older visual novels. Gamescope was the cure-all for windowing issues like this. That said, if WineD3D works better than DXVK for the game, there's not much reason to look into it.

I'm familiar with the history of GNOME, and somewhat with Xamarin and Mono. While I have made that argument in the past, it was pointed out to me that the GNOME name was used to ride on the coattails of the popularity the GNU project had in the '90s, and they ended the association when it stopped being convenient for them.

(A GNOME developer pointed this out to me using this language; I could link you to the interaction, but it was on Reddit.)

I mean, both RHEL and Debian use glibc, which means the vast majority of Linux applications running outside the cloud are calling into GNU code.

This also includes the proprietary NVIDIA driver, which only works with glibc.

Unlike GNU, his vision of the Linux desktop was populated by music players, spreadsheets, email / calendar programs, PDF viewers, and video editors.

I think this is a strange characterization of the GNU Project's goals. This is the Initial Announcement for the GNU Project:

To begin with, GNU will be a kernel plus all the utilities needed to write and run C programs: editor, shell, C compiler, linker, assembler, and a few other things. After this we will add a text formatter, a YACC, an Empire game, a spreadsheet, and hundreds of other things. We hope to supply, eventually, everything useful that normally comes with a Unix system, and anything else useful, including on-line and hardcopy documentation.

and eventually a Lisp-based window system through which several Lisp programs and ordinary Unix programs can share a screen.

Do you know something I don't? I don't think the GNU Project was against multimedia software; they were just focusing on the more fundamental stuff first.


The GNU Project's biggest contributions came when the kernel was in its infancy. The most major contribution is undoubtedly the GPL. Without it, Linux would not be where it is today. I think enough has been said on that subject, but it's what made RHEL billions. It's the philosophy of free software that has made so much of today's software possible. It's incredibly important.

Obviously, we also have the GNU Project financially backing Debian GNU/Linux in its infancy. And while you say GNU wasn't involved in the GUI layer, that's not true. They worked on the free Harmony toolkit as a matter of high priority, and would have kept working on it if GNOME had not been so successful. Thanks to the success of another GNU project, GIMP, the GTK toolkit could be repurposed for general usage.

I don't think it's fair to discard contributions that never panned out, like HURD and Harmony, because they show GNU was actively involved in making the desktop better for everyone, which has really been its mission from the start. Maybe they're not "the backbone" of the desktop, but I think it's fair to say their biggest/most notable contributions have been to the desktop, not the server.

I don't contribute to the GNU Project because frankly, they don't do anything I consider worthwhile at the moment. I don't contribute to the Linux Foundation, either. I contribute to user-facing software I'm interested in, like Lutris, GIMP, and Kdenlive.

Ah, I see. I only used it once, so it's not something I do often, but it worked perfectly for me as a client to a Windows computer.

Seriously though, a stable API is not the GTK/GNOME developers’ agenda here. Nobody wanting a stable API should write software with this toolkit.

This blog post doesn't mention GTK, but I've heard GTK will sometimes implement breaking changes in minor version bumps. I was thinking about writing some software with GTK, and I haven't been deterred, so I guess I'll learn the hard way. But has GTK 4 had any of these stability problems yet?

  • The package manager.
  • New releases make it to the repositories quickly.
  • The software is as vanilla as possible; no changes made by the distribution except to get it working.
  • The wiki.
  • +/- No nagging graphical updater.
  • +/- Users can share build scripts for building software from source very easily.
  • +/- No particular stance on free software licenses.
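The build-scripts point refers to something like Arch's PKGBUILD format: a small bash fragment that makepkg sources to build a package from source. A minimal hypothetical sketch; the package name, URL, and commands are placeholders, not a real package:

```shell
# Hypothetical PKGBUILD; pkgname, url, and source are illustrative placeholders.
pkgname=hello-example
pkgver=1.0
pkgrel=1
pkgdesc="Example package built from source"
arch=('x86_64')
url="https://example.com/hello-example"
license=('GPL')
source=("$url/$pkgname-$pkgver.tar.gz")
sha256sums=('SKIP')

# makepkg runs build() in the extracted source directory.
build() {
  cd "$pkgname-$pkgver"
  make
}

# package() installs into $pkgdir, which becomes the package root.
package() {
  cd "$pkgname-$pkgver"
  make DESTDIR="$pkgdir" install
}
```

Because it's just a text file with a few variables and functions, it's trivial to paste into a forum post or the AUR for others to audit and reuse.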

Spectacle8011
