[-] Maxy@lemmy.blahaj.zone 2 points 2 months ago

It depends on what you’re optimising for. If you want a single (relatively small) download to be available on your HDD as fast as possible, then your current setup might be better (optimising for lower latency). However, if you want to max out your internet speed at all times and increase your HDD’s effective write speed by making the copy sequential (optimising for throughput), then the setup with the catch drive will be better. Keep in mind that a HDD’s sequential write performance is significantly higher than its random write performance, so copying a large file in one go will be faster than writing a whole bunch of chunks in a random order (like torrents do). You can check the difference for yourself by running a disk benchmark and comparing the sequential and random write speeds of your drive.
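To get an intuition for the difference, here’s a rough sketch (file names are made up for the example; a real benchmark would use a dedicated tool like fio and bypass the page cache):

```python
# Rough sketch: writing the same data sequentially vs. in shuffled chunk order.
# On a spinning HDD with the cache bypassed, the shuffled version is much slower,
# because every seek forces the disk head to move.
import os
import random
import time

CHUNK = 1 << 20          # 1 MiB per chunk
CHUNKS = 64              # 64 MiB total
data = os.urandom(CHUNK)

def write_sequential(path):
    with open(path, "wb") as f:
        for _ in range(CHUNKS):
            f.write(data)

def write_random_order(path):
    order = list(range(CHUNKS))
    random.shuffle(order)            # torrent-like: chunks arrive out of order
    with open(path, "wb") as f:
        f.truncate(CHUNK * CHUNKS)   # preallocate the full file
        for i in order:
            f.seek(i * CHUNK)        # jump to this chunk's position
            f.write(data)

t0 = time.perf_counter()
write_sequential("seq.bin")
t1 = time.perf_counter()
write_random_order("rand.bin")
t2 = time.perf_counter()
print(f"sequential: {t1 - t0:.3f}s, random order: {t2 - t1:.3f}s")
```

Note that on an SSD, or when everything lands in the OS page cache, the two times may be nearly identical; the gap only really shows up on spinning disks with direct I/O.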

[-] Maxy@lemmy.blahaj.zone 4 points 5 months ago

YouTube would be smart enough not to advertise Adobe creative cloud in the pre-roll ads of this video, right? Right???

[-] Maxy@lemmy.blahaj.zone 3 points 5 months ago

Have you tried the official guide from the jellyfin website?

As for the guide this AI generated: it bothers me that they instruct you to use chocolatey for the *arrs, but still advise you to install docker, qbittorrent and jellyfin manually (all of which have chocolatey packages). I disagree with the claim that external storage is recommended, as internal storage is generally more reliable (depending on a lot of factors, of course). Also, I believe the "adding a library" section of the jellyfin setup is a bit too short to be of any use, and would recommend referring to the jellyfin docs instead.

This guide also doesn't explain how to make jellyfin accessible outside of your LAN. Once again, I'd recommend referring to the jellyfin docs if you want to do this.

I personally have only set up qbittorrent, jellyfin and docker (not the *arr suite), so I can't comment on the completeness of the guide, but I wouldn't trust it too much (seeing the previous oversights).

And finally, as someone who started their selfhosted server journey on windows: don't. There is a reason why almost all guides are written for linux, as it is (in my humble opinion) vastly superior for server usage once you get used to it.

[-] Maxy@lemmy.blahaj.zone 4 points 8 months ago

Not OP (OC? Not the person you were helping, you get what I mean), but are you sure you meant df -h? fd -H seems more useful to me when trying to find a specific file in a dotfolder, though even that didn't work on my system. fd ignores ~/.config by default, so you need to use fd -u (which is an alias for fd -I -H) to find the correct files.

Anyways, from your description it seems like the correct file would be ~/.config/kwinrc, which exists on my system.

[-] Maxy@lemmy.blahaj.zone 4 points 10 months ago

You could look at the awesome-selfhosted list, specifically these two sections:

https://awesome-selfhosted.net/tags/recipe-management.html

https://awesome-selfhosted.net/tags/task-management--to-do-lists.html

I don’t have any experience with any of those, but there might be something that fits your needs.

[-] Maxy@lemmy.blahaj.zone 5 points 10 months ago

Ah, it looks like we have a small misunderstanding. I thought you were talking about uncompressed video, which is enormous; it’s only used in a few places, such as over HDMI cables. Uncompressed 1080p60 video is about 2.98 Gbit/s, or about 1.22 terabytes per hour.
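You can sanity-check those figures with some back-of-the-envelope arithmetic (assuming 24 bits per pixel, i.e. 8-bit RGB / 4:4:4; the per-hour figure comes out in binary terabytes, TiB):

```python
# Back-of-the-envelope check of the uncompressed 1080p60 bitrate.
# Assumes 24 bits per pixel (8-bit RGB / 4:4:4), no audio, no blanking overhead.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

bits_per_second = width * height * fps * bits_per_pixel
gbit_per_s = bits_per_second / 1e9                  # decimal gigabits per second
tib_per_hour = bits_per_second * 3600 / 8 / 2**40   # binary terabytes (TiB) per hour

print(f"{gbit_per_s:.2f} Gbit/s, {tib_per_hour:.2f} TiB/hour")
# → 2.99 Gbit/s, 1.22 TiB/hour
```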

A remux is “uncompressed” in the sense that it isn’t recompressed, or in this case transcoded. A remux is still compressed, just to a lesser degree than a transcode. This means the files are indeed larger, but the quality is also better than transcodes.

To clarify the article’s confusing statement: they claim that remuxes can reduce size by throwing away some audio streams, while keeping the original video. This is true, but the video itself hasn’t gotten any smaller: you are simply throwing away other information.

[-] Maxy@lemmy.blahaj.zone 3 points 10 months ago

Remuxes aren’t uncompressed, nor are they losslessly compressed. They’re just a 1:1 direct copy of the streams from some other medium (generally blu-rays or DVDs).

[-] Maxy@lemmy.blahaj.zone 6 points 10 months ago

Disclaimer: I have exactly 0 personal experience with eGPU’s.

According to the archwiki:

“While some manual configuration (shown below) is needed for most modes of operation, Linux support for eGPUs is generally good.”

[-] Maxy@lemmy.blahaj.zone 3 points 1 year ago

Some context for the last tldr’d paragraph:

The negative Wall Street sentiments about the global economy have been echoed elsewhere. Last week, the Economist published a leader article entitled: “The world economy is defying gravity. That cannot last.”

[-] Maxy@lemmy.blahaj.zone 3 points 1 year ago

Just out of curiosity, are you sure “fd” is the right command in the “format storage” section? I don’t have a Raspbian system to test this on, but on my arch system, “df” is what lists disk usage; “fd” is a multithreaded alternative to “find”, which I installed manually.
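For reference, the two commands do very different things (a quick sketch; fd usually has to be installed separately, so its invocation is left commented out):

```shell
# df reports disk usage of mounted filesystems; -h makes the sizes human-readable
df -h

# fd, by contrast, searches for files by name pattern, e.g.:
# fd -H 'kwinrc' ~/.config
```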

[-] Maxy@lemmy.blahaj.zone 3 points 1 year ago

It depends on your linker. By default, Firefox appears to use the LLD linker. There is a faster one available, which runs perfectly fine on my 16GB machine: https://github.com/rui314/mold. After installing it, it can be enabled by setting --enable-linker=mold instead of --enable-linker=lld
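As a sketch of where that option goes (assuming a standard Firefox source build, with mold already installed), the mozconfig in the root of the source tree would contain something like:

```shell
# .mozconfig — sketch; requires mold to be installed and on PATH
ac_add_options --enable-linker=mold
```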

[-] Maxy@lemmy.blahaj.zone 3 points 1 year ago

I personally use the “slav art” discord bot. It lets you paste links from Spotify, deezer, qobuz, tidal and a few more. Some of the music providers are down from time to time, and the server gets nuked by the mods sometimes, but when it works, it’s great.

Link to wiki page (2nd option in the wiki): https://fmhy.pages.dev/audiopiracyguide/#download-apps

Direct link: https://discord [dot] gg/NgPJTt3WK3

PS: Reddit migrator here, is it still necessary to change the links and avoid directly linking?


Maxy

joined 1 year ago