[-] Shimitar@feddit.it 3 points 1 day ago

Well, you need to set the specific ffmpeg flags as stated in the readme... If you already know them, it's easy.
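For the curious, the flags in question look something like this. To be clear, this is a hypothetical flag set for illustration only; the actual flags the script expects are documented in its readme:

```shell
# Hypothetical ffmpeg flag set, split up the way a wrapper script might take them.
# Encoder choices and values here are illustrative, not the readme's defaults.
IN="movie.mkv"
OUT="movie-fixed.mkv"
VFLAGS="-c:v libx265 -crf 23 -preset slow"   # software video encode
AFLAGS="-c:a aac -b:a 160k"                  # normalize audio to AAC
SFLAGS="-c:s copy"                           # keep subtitle tracks as-is
# Echoed rather than executed, so the sketch is safe to run anywhere:
echo ffmpeg -i "$IN" $VFLAGS $AFLAGS $SFLAGS "$OUT"
```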

[-] Shimitar@feddit.it 1 points 2 days ago

Glad I helped!


(Crossposting, because I think it can be interesting here too.)

I like my video collection nicely organized and homogeneous in formats, codecs and resolutions.

I know there are already plenty of solutions like Tidarr, but I find them a bit too much for my needs, and also pretty complex to use.

I wrote a simpler alternative: a bash script called "Media Fixer"; the URL is at the top of this post.

Feel free to check it out and play with it. I hope it can be useful to others as well as to me.

It's released under the GPLv3.


Hi all, fellow one-eyed, one-legged, rum-guzzling sailors!

I like my ISO collection nicely organized and homogeneous in formats, codecs and resolutions.

I know there are already plenty of solutions like Tidarr, but I find them a bit too much for my needs, and also pretty complex to use.

I wrote a simpler alternative: a bash script called "Media Fixer"; the URL is at the top of this post.

Feel free to check it out and play with it. I hope it can be useful to others as well as to me.

It's released under the GPLv3.

[-] Shimitar@feddit.it 1 points 3 days ago* (last edited 3 days ago)

IMO, GPU encoding is for streaming: it aims at fast, not-so-great output quality without CPU usage. Exactly what you are getting.

Don't use GPU encoding for storage... CPU encoding is much better.

Edit: since it's aimed at streaming, GPU encoding only needs to achieve real-time performance; there's no need to go any faster. CPU encoding, instead, can go as fast as your cores can push.
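To make the contrast concrete, here is a rough sketch, assuming an NVIDIA card and an ffmpeg build with NVENC support; preset and quality values are illustrative, not tuned:

```shell
# GPU (NVENC): real-time oriented, light on the CPU, bigger files at equal quality
GPU_CMD="ffmpeg -i in.mkv -c:v h264_nvenc -preset p5 -cq 23 out-gpu.mkv"
# CPU (libx264): slower, but better quality per byte -- the one for archival storage
CPU_CMD="ffmpeg -i in.mkv -c:v libx264 -preset slow -crf 20 out-cpu.mkv"
# Echoed rather than executed, so the sketch is safe to run anywhere:
echo "$GPU_CMD"
echo "$CPU_CMD"
```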

[-] Shimitar@feddit.it 1 points 6 days ago

Thank you for your very appreciated feedback. Feel free to DM/PM me for anything related.

[-] Shimitar@feddit.it 5 points 6 days ago

Nothing wrong with that!

Just saying that not necessarily everything should be about money.

[-] Shimitar@feddit.it 3 points 6 days ago

Also, knowledge and sharing have been critical for the advancement of human civilization. Imagine if scientists were to sell their research instead of publishing it(*): where would we be today?

(*) I mean, you might have to pay to read those publications, but they are literally free: in most cases you can ask the authors for a copy at no cost....
[-] Shimitar@feddit.it 2 points 6 days ago

I keep a wiki on all that I do.

This is the page on radicale: here

This is the more general page on reverse proxy here

And so on. Check the sidebar.

I mostly write it so that in the future I remember what I did and how I did it. But I use some unusual techniques compared to the mainstream point of view of this community, so keep that in mind.


Well, here's my story; may it be useful to others too.

I have a home server with a 6TB RAID1 (OS on a dedicated NVMe). I was playing with a BIOS update and adding more RAM, and out of the blue, after the last reboot, my RAID had somehow been shut down uncleanly and needed a fix. I probably unplugged the power cord too soon while the system was shutting down containers.

Well, no biggie, I'll just run fsck and mount it. So there it goes: "mkfs.ext4 /dev/md0"

Then I hit "y" quickly when it said "the partition contains an ext4 signature blah blah". I was in a hurry, so...

Guess what? Now read that command again, carefully.

Too late. I hit Ctrl+C, but it was already too late. I could recover some of the files, but many were corrupted anyway.
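For the record: the harmless command I meant to type checks the filesystem (fsck.ext4), while the one I actually typed creates a new one (mkfs.ext4). Here's a tiny guard I now wish I'd had; the wrapper and its behavior are my own invention, not a real tool:

```shell
# Refuse to mkfs unless the caller repeats the device name, forcing a
# second look at what was actually typed.
safe_mkfs() {
  dev="$1"; confirm="$2"
  if [ "$dev" != "$confirm" ]; then
    echo "refusing: confirmation '$confirm' does not match '$dev'" >&2
    return 1
  fi
  # The intended repair command was: fsck.ext4 -y "$dev"
  echo "would run: mkfs.ext4 $dev"   # swap echo for the real command
}
```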

Luckily, I was able to recover 85% of everything from my backups (restic+backrest to the rescue!), recreate another 5% (mostly docker compose files located in odd, non-backed-up folders), and restore the last 10% from the old 4TB drive I had replaced to increase space some time ago. Fortunately that last part was never-changing old personal stuff that I would have regretted losing, but hadn't considered critical enough to back up.

The cold shivers I had before I checked my restic backup, discovering that I hadn't actually postponed the backup of those additional folders...

Today I will add another layer of backup in the form of an external USB drive to store never-changing data like... My ISOs...

This was my backup strategy up to yesterday; I have backrest automating restic:

  • 1 local backup of the important stuff (personal data mostly)
  • 1 second copy of the important stuff on a USB drive connected to an OpenWrt router on the other side of the home
  • 1 third copy of the important stuff on a remote VPS

And since this morning I have added:

  • a few git repos (pushed and backed up with the important stuff) with all docker compose files, keys and such (the 5%)
  • an additional local USB drive where I will back up ALL files, even that 10% which never changes and isn't "important", but which I would miss if I lost it.

Tools like restic and Borg are so critical that you will regret not having had them sooner.

Set up your backups like yesterday. If you didn't already, do it now.
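For anyone starting from zero, the restic side of the above boils down to a few commands. Repository path, password file and backup paths below are placeholders, not my actual layout; backrest just schedules these:

```shell
# Placeholder repository settings -- adjust to your own paths.
export RESTIC_REPOSITORY=/mnt/usb-archive/restic-repo
export RESTIC_PASSWORD_FILE=/root/.restic-pass

# Echoed rather than executed, so the sketch is safe to run anywhere:
echo restic init                                   # once, creates the repo
echo restic backup /data/important /data/compose   # takes a snapshot
echo restic forget --keep-daily 7 --keep-weekly 4 --prune   # rotation
```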


In my server I currently have a 9th-gen Intel i7 CPU with integrated Intel video.

I don't use or need A.I. or LLM stuff, but we use Jellyfin extensively in the family.

So far Jellyfin has always worked perfectly fine, but I could add (for free) an NVIDIA 2060 or a 1060. Would it be worth it?

And as for power consumption, will the increase be noticeable? Should I do it or pass?

[-] Shimitar@feddit.it 52 points 2 months ago

Podman, guys... Podman all the way...

[-] Shimitar@feddit.it 71 points 2 months ago* (last edited 2 months ago)

There is no "write and forget" solution. There never has been.

Do you think we have the ORIGINALS of Greek or Roman written texts? No, we only have those that have been copied over and over through the centuries. Historians know this all too well. And 90% of anything ever written by humans in all of history has been lost, all of it written on more durable media than ours.

The future will hold only those memories of us that our descendants take the time to copy over and over. Nothing we do today to preserve our media will last 1000 years in any case.

(Will we as a species survive 1000 more years?)

Still, it is our duty to preserve as much as we can for the future. If today's historians are any guide, the most important bits will be those least valuable today: the ones nobody cares to actually preserve.

Citing Alessandro Barbero, a top-notch contemporary Italian historian: he would kill to know what a common peasant had for breakfast in the tenth century. We know nothing about that, while we know a tiny little more about kings.

Web printing (feddit.it)
submitted 2 months ago by Shimitar@feddit.it to c/selfhosted@lemmy.world

Hi!

I have set up ScanServJS, which is an awesome web page that accesses your scanner and lets you scan and download the scanned pages from your self-hosted web server. I have the scanner configured via SANE locally on the server, and now I can scan via the web from whatever device (phone, laptop, tablet, whatever) with the same consistent web interface for everyone. No need to configure drivers anywhere else.

I want to do the same with printing. On my server, the printer is already configured using CUPS, and I can print from Linux laptops via the shared CUPS printer. But that requires a setup anyway, and while I could make it work for phones and tablets, I want to avoid that.

I would like to set up a nice web page, like the one for the scanner, where users, no matter what device they use, can upload files and print them, without installing or configuring anything on their devices.

Is there anything that I can self-host to this end?

submitted 2 months ago* (last edited 2 months ago) by Shimitar@feddit.it to c/selfhosted@lemmy.world

Hi fellow hosters!

I self-host lots of stuff, from the classical *Arrs all the way to SilverBullet and photo services.

I even have two ISPs at home to manage failover in case one goes down; in fact, I rely on my home services a lot, especially when I am not at home.

The main server is a powerful but older laptop in which I have recently replaced the battery because of its age. My storage, though, consists of two RAID arrays, which are of course external JBODs with external power supplies.

A few years ago I purchased a cheap UPS, basically this one: EPYC® TETRYS - UPS https://amzn.eu/d/iTYYNsc

It works just fine and can sustain the two RAIDs long enough for any small power outage to pass.

The downside is that the battery itself degrades quickly and needs to be replaced every one or two years tops, which is not only a cost but also an inconvenience, because I always find out at the worst possible time (during a power outage), of course!

How do you tackle the issue in your setups?

I should mention that I live in the countryside: power outages happen once or twice per year, so no big deal, just annoying.

submitted 3 months ago by Shimitar@feddit.it to c/selfhosted@lemmy.world

I have a home network with an internal DNS resolver. I have some (public) subdomains that map to a real-world IP address, and map to the home server's private address when inside the home network.

In short, I use unbound and have added some local-data entries so that, when at home, those subdomains point to 192.168.x.y instead.
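The relevant unbound snippet looks roughly like this (the zone name is a placeholder, and 192.168.x.y stands in for the server's LAN address, as above):

```
server:
  # Split-horizon: LAN clients get the private address for this name
  local-zone: "myservice.example.com." redirect
  local-data: "myservice.example.com. IN A 192.168.x.y"
```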

All works perfectly fine from Windows and from Linux PCs.

Android, instead, doesn't work.

With dynamic DHCP allocation on Android, the names cannot be resolved (ping will fail...) from the Android devices. With a specific global DNS server (like dns.adguard.com), it will of course always resolve to the public IP.

The only solution I found is to disable DHCP for the WiFi on Android and set a static IP with 192.168.x.y as the DNS server; in that case it works.

But why? Anybody have any hints?

It's like Android has some kind of DNS rebind protection enabled by default, but I cannot find any information on it at all.

submitted 4 months ago by Shimitar@feddit.it to c/piracy@lemmy.dbzer0.com

As the title says, is there a way to download content from Amazon Prime Video?

Like yt-dl or similar...

DNS issues (feddit.it)
submitted 5 months ago* (last edited 3 months ago) by Shimitar@feddit.it to c/selfhosted@lemmy.world

Hi! I am self-hosting my services and using a dnsmasq setup to provide ad-blocking to my home network.

I was tinkering with Unbound to add a fully independent DNS resolver and not depend on a Google/Adblock/whatever upstream DNS server, but I am unable to make Unbound work.

Top-level domains (like com, org...) are resolved fine, but anything at the second level isn't. I am using "dig" (of course I am on Linux) and Unbound logging to find out what's going on, but I am at a loss.

Could my ISP be blocking my requests? If I switch back to Google DNS (for example), everything works fine, but using my Unbound will only resolve TLDs and some random names. For example, it will resolve google.com but not kde.org...

Edit: somehow fixed by nuking the config file and starting over.

FitTrackee (feddit.it)
submitted 5 months ago by Shimitar@feddit.it to c/selfhosted@lemmy.world

If I remember correctly, the FitTrackee dev posts on this community.

Well, I want to thank them, as this is a very nice piece of software. I have just started using it, but it looks so promising and well done! A breeze to install, even on bare metal, and so well designed (even a CLI? Come on!).

Looking forward to trying the Garmin integration tomorrow.

Thanks buddy! Appreciated.

submitted 5 months ago* (last edited 5 months ago) by Shimitar@feddit.it to c/selfhosted@lemmy.world

Looking for a self-hosted diary type of service, where I can log in and write small topics and ideas, tag them and date them. No need for public access.

Any recommendations?

Edit: anybody using monicahq or has experience with it?

Clarification: indeed, I could use a general note-taking app for this task. I already host and use SilverBullet for general notes and such. I am looking for something more focused on daily events and connections: noting people met, sport activities and feedback, names, places... So tagging and dates would be central, as well as connections to calendar and contacts, and who knows what else... I want to explore existing, more advanced, more specialized apps.

Edit2: I ended up with BookStack. MonicaHQ seems very nice but proved impossible to install using containers: it would not obey APP_URL properly and would constantly mess up the HTTP/HTTPS redirection. The community was unresponsive, and apparently GitHub issues have been ignored lately. So I ditched MonicaHQ and switched to BookStack: it installed in a breeze (again, a container) and a very simple NGINX setup just worked. I will be testing it out now.

CalDAV web gui (feddit.it)
submitted 7 months ago by Shimitar@feddit.it to c/selfhosted@lemmy.world

Hi! I've been using Radicale since I switched from Nextcloud, with DAVx5 on Android, and it works pretty nicely.

I was thinking about adding a web UI to access my calendars from the web too... Any recommendations?

Radicale's web UI only manages accounts and such, not the calendar contents.

[-] Shimitar@feddit.it 62 points 9 months ago

Go AV1... In my direct experience the space savings are simply amazing at the same quality.

H.265 doesn't seem to be the future, since all Android devices will have to support AV1 as mandatory from Android 14.
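For ffmpeg users, an AV1 encode is roughly one flag swap away. This sketch assumes a build with the SVT-AV1 encoder; the CRF and preset values are illustrative, not tuned:

```shell
# SVT-AV1 software encode; higher -preset numbers are faster, lower compress better.
AV1_CMD="ffmpeg -i in.mkv -c:v libsvtav1 -crf 30 -preset 6 -c:a copy out-av1.mkv"
# Echoed rather than executed, so the sketch is safe to run anywhere:
echo "$AV1_CMD"
```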

[-] Shimitar@feddit.it 61 points 9 months ago

I'll answer my own question: on Linux, the neat tool called "mediainfo" will print an MKV's metadata, and that includes the real ISO title.
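A one-liner version, using mediainfo's template output to print just the title field (the file name is a placeholder):

```shell
# Print only the container Title field from an MKV's metadata.
# Echoed rather than executed, so the sketch is safe to run anywhere:
CMD='mediainfo --Inform="General;%Title%" movie.mkv'
echo "$CMD"
```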

[-] Shimitar@feddit.it 40 points 1 year ago

I wouldn't dream of using stock Android at this point. Been on LOS forever, and for each new phone I buy I either check that LOS is available or, in one case (my current phone), I ported LOS to it myself.


Shimitar

joined 2 years ago