[-] Lem453@lemmy.ca 1 points 2 days ago

Would appreciate a link please! Can't seem to find it on DDG

[-] Lem453@lemmy.ca 9 points 2 days ago

You don't have the knowledge or experience to do this for a business. This is different than a personal cloud. You will be blamed when things don't work.

Don't touch this with a 10ft pole.

If you want to help, find commercial services that offer this and suggest those.

[-] Lem453@lemmy.ca 0 points 6 days ago

I've been using Firefox installed via Obtainium, straight from the Mozilla repo.

[-] Lem453@lemmy.ca 45 points 1 month ago* (last edited 1 month ago)

The comments here saying not to bother with 10GbE are surprising considering this is the selfhosted community, not a random home-networking self-help forum. Dismissing a reasonable request from someone who is building a homelab is not a good way to grow niche communities like this one on the fediverse.

10GbE has come down in price a lot recently, but of course it's still more expensive than 1GbE.

Ideas for switches: https://www.servethehome.com/the-ultimate-cheap-10gbe-switch-buyers-guide-netgear-ubiquiti-qnap-mikrotik-qct/

https://www.servethehome.com/nicgiga-s25-0501-m-managed-switch-review-5-port-2-5gbe-and-sfp-realtek/

For a router: https://www.servethehome.com/everything-homelab-node-goes-1u-rackmount-qotom-intel-review/

[-] Lem453@lemmy.ca 49 points 4 months ago

Use Aegis: export the keys and then reimport them every time you switch. Trusting your second factor to a cloud is a disaster waiting to happen.

If you want to get fancy, set up your own cloud server (Nextcloud, Seafile, ownCloud, etc.) and point Aegis's backup folder at the self-hosted cloud for an easy restore every time you switch ROMs.

141
submitted 4 months ago* (last edited 4 months ago) by Lem453@lemmy.ca to c/selfhosted@lemmy.world

The topic of self-hosted cloud software comes up often, but I haven't seen anyone mention ownCloud Infinite Scale (the rewrite in Go).

I started my cloud experience with owncloud years ago. Then there was a schism and almost all the active devs left for the nextcloud fork.

I used Nextcloud from its inception until last year, but like many others I found it brittle (easy to break something) and half-baked (features always seemed to be at 75% of what you want).

As a result I decided to go with Seafile and stick to the Unix philosophy: use an app that does one thing very well rather than a mega-app that tries to do everything.

Seafile does this very well. Super fast, works with single sign-on, no bloat.

Then just the other day I discovered that ownCloud has a full rewrite. No PHP, no Apache. Check the GitHub: multiple active devs with lots of activity over the last year. The project seems stronger than ever and aims to fix the primary issues of Nextcloud/ownCloud PHP. It's also designed for cloud deployment, so it works well with Docker and should be easy to configure via environment variables instead of config files mapped into the container.
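
A minimal sketch of what a deployment might look like with Docker Compose; the environment variable names and mount paths below are from my reading of the oCIS docs and should be verified against them before use:

```yaml
services:
  ocis:
    image: owncloud/ocis:latest
    ports:
      - "9200:9200"                     # default oCIS port
    environment:
      # Configuration happens almost entirely through environment variables
      - OCIS_URL=https://cloud.example.com
      - OCIS_INSECURE=false
      - IDM_ADMIN_PASSWORD=changeme     # initial admin password (placeholder)
    volumes:
      - /mnt/tank/ocis-config:/etc/ocis     # illustrative host paths
      - /mnt/tank/ocis-data:/var/lib/ocis
    # One-time "ocis init" generates the config before the server starts
    entrypoint: /bin/sh
    command: ["-c", "ocis init || true; ocis server"]
```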

Anyways, the point of this thread is:

  1. If, like me, you'd never heard of it, check it out.
  2. If you have used it, please post your experiences compared to Nextcloud, Seafile, etc.
[-] Lem453@lemmy.ca 94 points 4 months ago* (last edited 4 months ago)

This right here. I tried to join Mastodon today.

Download the most recommended app, Moshidon

Open the app and get asked which instance I want to join. There are no suggestions.

Do a search for instances and pick one, go to the website and register with email and password. Requires email confirmation. Still waiting on the email confirmation link, 4 hrs later and 2 resends.

Literally haven't been able to sign up yet.

Even if it had worked, the workflow would have been to change back to the app, type out the instance then re-login.

I'm not sure how anyone expects anyone other than the most hardcore to sign up for these services. Maybe that's the point, but if the point is to grow, the user sign-up process is a significant obstacle overall.

[-] Lem453@lemmy.ca 34 points 5 months ago* (last edited 5 months ago)

Self-hosted AI seems like an intriguing option for those capable of running it. Naturally this will always be more complex than paying someone else to host it for you, but it seems like that's the only way if you care about privacy.

https://github.com/mudler/LocalAI

48
submitted 5 months ago* (last edited 5 months ago) by Lem453@lemmy.ca to c/selfhosted@lemmy.world

Technically this isn't actually a Seafile issue; however, the upload client really should have the ability to run checksums to compare the original file to the file that is being synced to the server (or other device).

I run Docker in a VM that is hosted by Proxmox. Proxmox manages a ZFS array, which contains the primary storage that the VM uses. Instead of making the VM disk 1TB+, the VM disk is relatively small since it's only the OS (64GB), and the Docker containers mount a folder on the ZFS array itself, which is several TBs.
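
To illustrate the layout, the containers reference that ZFS-backed folder as a bind mount rather than a named volume; a minimal sketch (the host path and image tag are placeholders, not my actual compose file):

```yaml
services:
  seafile:
    image: seafileltd/seafile-mc:latest
    volumes:
      # Bind mount a folder on the Proxmox-managed ZFS array, so the bulky
      # data never touches the VM's small 64GB OS disk.
      - /mnt/zfs-tank/seafile-data:/shared
```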

This had all been going really well with no issues, until yesterday when I tried to access some old photos and they would only load halfway. The top part would be there but the bottom half would be grey/missing.

This seemed to be randomly present on numerous photos, however some were normal and others had missing sections. Digging deeper, some files were also corrupt and would not open at all (PDFs, etc).

Badness alert....

All my backups come from the server. If the server data has been corrupt for a long time, then all the backups would be corrupt as well. All the files on the Seafile server were originally synced from my desktop, so when I open a file locally on the desktop it works fine; only when I try to open the file on Seafile does it fail. Also, not all the files were failing, only some. Some old, some new. Even the file sizes didn't seem to consistently predict whether a file would work or not.

It's now at the point where I can take a photo from my desktop, drag it into a Seafile library via the browser and it shows a successful upload, but then trying to preview the file won't work, and downloading that very same file back again shows a file size of about 44kb regardless of the original file size.

Google/DDG...can't find anyone that has the same issue...very bad

Finally I notice an error in mariadb: "memory pressure can't write to disk" (paraphrased).

OK, that's odd. The RAM was fine, which is what I first assumed the problem was. Disk space can't be the issue since the ZFS array is only 25% full, and both MariaDB and Seafile only have volumes that are on the ZFS array. There are no other volumes... or are there???

Finally, checking the volumes that exist in Portainer: Seafile only has the two as expected, data and database. Then I see hundreds of unused volumes.

A quick Google reveals docker volume prune, which deleted many GBs worth of volumes that were old and unused.

By this point I had already created and recreated the Seafile Docker containers a hundred times with test data and simplified the docker compose as much as possible, but after the prune everything started working right away. MariaDB starts working, and I can now copy a file from the web interface or the client and it works correctly.

Now I go through the process of setting up my original docker compose with all the extras I had, remake my user account (luckily it's just me right now), set up the sync client and then start copying the data from my desktop to my server.

I've got to say, this was scary as shit. My setup uploads files from desktop, laptop, phone, etc. to the server via Seafile; from there, Borg takes incremental backups of the data and sends them remotely. The second I realized that the local data on my computer was fine but the server data was unreliable, I immediately knew that even my backups were now unreliable.

IMHO this is a massive problem. Seafile will happily 'upload' a file and say success, but then trying to redownload the file results in an error since it doesn't exist.

Things that really should be present to avoid this:

  1. The client should have the option to run a quick checksum on each file after it uploads and compare the original to the uploaded copy to ensure data consistency. There should probably also be an option to run this check later, after the fact, and output a list of files that are inconsistent.
  2. The default docker compose should be run with a health check on MariaDB, so that when it starts throwing errors while the interface still runs, someone can be alerted (see the sketch after this list).
  3. Need some kind of reminder to check in on unused Docker volumes.
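
For point 2, a minimal sketch of what that could look like in a compose file; the service names, host paths and password are placeholders and the stock mariadb image is assumed, so this is not Seafile's official compose:

```yaml
services:
  db:
    image: mariadb:10.11
    environment:
      - MYSQL_ROOT_PASSWORD=changeme
    volumes:
      - /mnt/zfs-tank/seafile-db:/var/lib/mysql
    healthcheck:
      # mysqladmin ping exits non-zero when the server is unreachable or unhealthy
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost", "-pchangeme"]
      interval: 30s
      timeout: 10s
      retries: 5

  seafile:
    image: seafileltd/seafile-mc:latest
    depends_on:
      db:
        # Only gates startup ordering; alerting on later failures still
        # needs external monitoring of the container health status.
        condition: service_healthy
```
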
96
submitted 6 months ago by Lem453@lemmy.ca to c/selfhosted@lemmy.world

Looking for a self hosted YouTube front end with automatic downloader. So you would subscribe to a channel for example and it would automatically download all the videos and new uploads.

Jellyfin might be able to handle the front-end part, but I'm not sure about automatic downloads and proper file naming and metadata.

30
submitted 10 months ago* (last edited 10 months ago) by Lem453@lemmy.ca to c/selfhosted@lemmy.world

Very solid price, the cheapest I've seen for something like this. Has anyone tried it with OPNsense or other software?

The linked thread mentions someone getting 60C load temps, but the ambient air was 37C and they were using an RJ45 DAC, which is known to use a lot of power.

Wondering if anyone else has experience with this. It seems like a big advancement in what's possible at home scale with new (non-second-hand) equipment.

Another article about this: https://liliputing.com/this-small-fanless-pc-is-built-for-networking-with-four-10-gbe-and-five-2-5-gb-ethernet-ports/

[-] Lem453@lemmy.ca 29 points 10 months ago

"i swear it's not a lot"

Goes on to describe an infrastructure setup comparable to most medium-sized businesses.

I love this community!

[-] Lem453@lemmy.ca 41 points 11 months ago

Enshittification is the norm for all for-profit endeavours. Exceptions to this are exceedingly rare, and they're usually just a matter of time rather than truly being exceptions.

[-] Lem453@lemmy.ca 31 points 1 year ago* (last edited 1 year ago)

Even better: a cron job every 5 minutes, and if total remaining space falls to 5%, auto-delete the file and send a message to the sysadmin.

[-] Lem453@lemmy.ca 75 points 1 year ago* (last edited 1 year ago)

Whatever country you are from, think about the most backwards/rural/remote location that has people with backwards and regressive views.

Consider that India likely has 5 to 10x as many rural folks in small villages with medieval views as your country does.

Add in long-standing cultural misogyny that is so pervasive it fully permeates all aspects of government and life, and you quickly get to a point where abuse and domestic violence are tolerated.

As per usual, it's not like the entire country shares these views. It's just that the number of people with backwards views in India is numerically huge because of its population.

[-] Lem453@lemmy.ca 42 points 1 year ago

Pornhub would make more money by simply starting a VPN service rather than trying to gather IDs.
