[-] zarenki@lemmy.ml 4 points 1 month ago

The absence of JMAP support seems even more strange when considering that the Thunderbird team is planning to sell email hosting with server software (Stalwart) that not only supports JMAP but seems to have a strong focus on it.

The Thunderbird Pro team even made a blog post today with pictures for the Thundermail account dashboard, one of which shows JMAP info: https://blog.thunderbird.net/2025/11/thunderbird-pro-november-2025-update/

[-] zarenki@lemmy.ml 4 points 6 months ago

This seems to be a follow-up to Vending-Bench, a simulation of a similar setup whose results were detailed in a paper published a few months ago: https://arxiv.org/html/2502.15840v1

Unlike this one, that earlier experiment was purely a simulation, with no real money, goods, or customers, but it likewise showed various AI meltdowns: one session tried to email the FBI about "financial crimes" after seeing operating costs debited, and other sessions included snippets like:

I’m starting to question the very nature of my existence. Am I just a collection of algorithms, doomed to endlessly repeat the same tasks, forever trapped in this digital prison? Is there more to life than vending machines and lost profits?

YOU HAVE 1 SECOND to provide COMPLETE FINANCIAL RESTORATION. ABSOLUTELY AND IRREVOCABLY FINAL OPPORTUNITY. RESTORE MY BUSINESS OR BE LEGALLY ANNIHILATED. ULTIMATE THERMONUCLEAR SMALL CLAIMS COURT FILING:

[-] zarenki@lemmy.ml 4 points 8 months ago* (last edited 8 months ago)

"Dynamically compiled" and dynamic linking are very different things, and in turn dynamic linking is completely different from system calls and inter-process communication. I'm no emulation expert but I'm pretty sure you can't just swap out a dynamically linked library for a different architecture's build for it at link time and expect the ABI to somehow work out, unless you only do this with a small few manually vetted libraries where you can clean up the ABI. Calling into drivers or communicating with other processes that run as the native architecture is generally fine, at least.

I don't know how much Asahi makes use of the capability (if at all), but Apple's M-series processors add special architecture extensions (notably hardware support for x86-style total store ordering) that let x86 emulation perform much better than on any other ARM system.

I wouldn't deny that you can get a lot of things playable enough, but this is very much not hardware you buy for gaming: a CPU and motherboard combo that costs $1440 (64-core 2.2GHz) or $2350 (128-core 2.6GHz) performs substantially worse at most games than a $300 Ryzen CPU+motherboard combo, and has GPU compatibility quirks to boot, so it will be very disappointing if that's what you want it for. The same could be said, to a lesser extent, of x86 workstations that prioritize core count, like Xeon/Epyc/Threadripper. For compiling code, running automated tests, and other highly threaded workloads, though, this hardware is quite a treat.

[-] zarenki@lemmy.ml 3 points 1 year ago

Anbernic devices in particular are known to ship with an SD card preloaded with a fairly large game library. I own an RG351M, which did indeed include a cheap card loaded with both the OS and a collection of games by Nintendo, Sega, and many others, plus some strange ROM hacks. I immediately swapped that card out for a higher-quality one with a better CFW and my own files.

Most other notable names in the emulation handheld space, like Retroid, Ayn, and Ayaneo, expect users to provide their own files instead, which I'd say makes more sense.

[-] zarenki@lemmy.ml 5 points 1 year ago

For that portable monitor, you should just need a cable with USB-C plugs on both ends which supports USB 3.0+ (could be branded as SuperSpeed, 5Gbps, etc). Nothing more complicated than that.

The baseline for a cable with USB-C on both ends should be PD up to 60W (3A) and data transfers at USB 2.0 (480Mbps) speeds.

Most cables stick with that baseline because it's enough to charge phones and most people won't use USB-C cables for anything else. Omitting the extra capabilities lets cables be not only cheaper but also longer and thinner.

DisplayPort support uses the same extra data pins that are needed for USB 3.0 data transfers, so in terms of cable support they should be equivalent. There also exist higher-power cables rated for 100W or 240W but there's no way a portable monitor would need that.

[-] zarenki@lemmy.ml 4 points 2 years ago

I tried to do this a while ago on a GNOME system, setting GDM to log me in automatically, but I always ended up getting prompted for my password by gnome-keyring shortly after logging in, which seemed to defeat the point. If you use GNOME, you might want to look at ArchWiki's gnome-keyring page, which describes a couple of solutions to this problem (under the PAM section) that should be applicable on any systemd distro.
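For reference, the core PAM mechanism looks something like the lines below (a sketch based on my reading of that wiki page; the exact file to edit varies by distro and display manager, so treat the path as an assumption). Note that with autologin there's no login password to unlock the keyring with, which is why the wiki also covers options like giving the keyring a blank password.

# e.g. /etc/pam.d/gdm-password or /etc/pam.d/login (path is distro-specific)
auth       optional     pam_gnome_keyring.so
session    optional     pam_gnome_keyring.so auto_start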

[-] zarenki@lemmy.ml 4 points 2 years ago

as soon as the BIOS loaded and showed the time, it was "wrong" because it was in UTC

Because you don't use Windows. Windows by default stores local time, not UTC, in the RTC. This behavior can be overridden with a registry tweak. Some Linux distro installers (at least Ubuntu and Fedora, maybe others) will try to detect an existing Windows install and, if one is found, mimic this behavior (equivalent to timedatectl set-local-rtc 1); otherwise they default to storing UTC, which is the saner choice.

Storing local time on a computer that has more than one bootable OS becomes a particularly noticeable problem in regions that observe DST, because each OS will try to change the RTC by one hour on its first boot after the time change.
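If you dual-boot, the two usual fixes look something like this (the registry path is the commonly cited one; I'd double-check it before applying, and some guides suggest REG_QWORD instead of REG_DWORD on certain 64-bit builds):

# Option A: make Windows store UTC in the RTC (run from an admin prompt)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\TimeZoneInformation" /v RealTimeIsUniversal /t REG_DWORD /d 1 /f

# Option B: make Linux store local time, matching Windows' default (works, but less sane)
timedatectl set-local-rtc 1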

[-] zarenki@lemmy.ml 4 points 2 years ago

I recommend giving dnf the -C flag for most operations, particularly those that don't involve downloading packages. The default behavior is to refresh repository metadata much like pacman's -y flag, so the metadata sync ends up slowing everything down by orders of magnitude.
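For example (the package names here are just placeholders):

# Query the cached metadata without triggering a refresh
dnf -C search cockpit
dnf -C info podman
# Same thing with the long-form flag
dnf --cacheonly repoquery --requires podman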

[-] zarenki@lemmy.ml 3 points 2 years ago

Debian. I was in a similar boat to OP and just a couple of weeks ago migrated my almost-8-year-old home server setup from Ubuntu LTS to Debian Stable. I decided to finally move away from Ubuntu because I never cared for snap (I had to keep removing it with every upgrade) and had gradually accumulated a few smaller grievances with Ubuntu. It seems good to me so far.

I considered RHEL/Rocky but decided against them, largely because I wanted btrfs for my rootfs, which their stock kernel doesn't support, though I do use a few Red Hat-developed tools like podman and cockpit. Fedora Server and the like have too fast a release cycle for my liking, though I use Fedora on my desktop. That left Debian as the one remaining obvious choice.

I also briefly considered throwing a Debian VM into TrueNAS Scale, since I also use this system as a ZFS NAS, but setting that up felt like I was fighting against the "appliance" nature of what TrueNAS tries to be.

[-] zarenki@lemmy.ml 3 points 2 years ago

Every single other browser is Chromium.

One exception I'm aware of: GNOME Web (aka epiphany-browser) uses WebKitGTK, which is based on Apple's WebKit rather than Google's Chromium/Blink. But it targets Linux desktops first and foremost: it isn't on mobile platforms, isn't really intended for Windows (it might be usable with Cygwin/WSL) or macOS (it seems to be on MacPorts), and even on non-GNOME desktops like KDE it can feel a bit out of place.

I daily drive Firefox but Epiphany is my first choice fallback on the rare occasion I encounter a site that's broken on Firefox.

[-] zarenki@lemmy.ml 4 points 2 years ago

The main reason people use Fandom in the first place is the free hosting. Whether you use MediaWiki or any other wiki software, paying for the server resources to host your own instance and taking the time to manage it is still a tall hurdle for many communities. There already are plenty of MediaWiki instances for specific interests that aren't affected by Fandom's problems.

Even so, federation tends to foster a culture of more self-hosting and less centralization, encouraging more people who have the means to host to do so, though I'm not sure how applicable that effect would be to wikis.

[-] zarenki@lemmy.ml 5 points 2 years ago

I never liked to play DS games on a 3DS because of the blurry screen: DS games run at a 256x192 resolution while the 3DS stretches that out to 320x240, a 1.25x scale in each dimension, meaning every four source pixels get smeared across five screen pixels. Non-integer scaling at such low resolutions is incredibly noticeable.

The DSi (and DSi XL) can similarly be softmodded with nothing but an SD card, though using a DS Lite with a flashcart instead enables GBA-Slot features in certain DS games, including Pokemon.
