[-] IsoKiero@sopuli.xyz -1 points 1 day ago

Ukrainian people protesting against joining Russia and moving further away from the EU was a US operation to overthrow a sitting president? Yeah, right.

The US had nothing to do with it. If anything, Russia helped the whole thing happen, or maybe even created it, by pushing against the Ukrainian parliament and the opinion of the Ukrainian people.

[-] IsoKiero@sopuli.xyz -4 points 3 days ago

Putin can quite easily help. Just pull all of your troops well behind the 1991 borders, stop the nonsense about 'areas that voted to join Russia' and fly to The Hague. Ukraine will happily arrange elections for a new president once that's done.

[-] IsoKiero@sopuli.xyz 2 points 6 days ago

You're correct. All packages installed via dpkg/apt are on that list. What isn't included are AppImages, Flatpaks, Snaps and other non-dpkg software, if you happen to have any.
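For anyone checking their own system, the usual listing commands (standard tooling for each packaging system; the latter two simply fail if the tool isn't installed) are:

```shell
dpkg -l         # everything installed via dpkg/apt
flatpak list    # installed Flatpaks, if flatpak is present
snap list       # installed Snaps, if snapd is present
```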

[-] IsoKiero@sopuli.xyz 2 points 6 days ago

Nothing's phones have these 'glyph' lights, but software support for them seems to be pretty limited (not that I've looked too deep, all these modern phones are just boring). I hope they come up with more controls for them; the hardware seems to support quite complex patterns.

[-] IsoKiero@sopuli.xyz 81 points 2 months ago

Their bases have been about where the tent villages now are for decades. They're training grounds for new conscripts until they're moved off to die in some ditch in Ukraine. Who knows why they're more active now; maybe Ukraine is getting good enough at hitting targets deep in Russia that they need to move further away from the front line, or whatever.

This has absolutely nothing to do with Finland, beyond the fact that our border just happens to be nearby. And should they actually try to start an active war with NATO from there, these grounds are mostly within reach of Finnish artillery, and our artillery is pretty damn efficient at what it does.

110
submitted 4 months ago by IsoKiero@sopuli.xyz to c/technology@lemmy.world

So, Alec over at the Technology Connections channel made an hour-long video explaining the difference between kW and kWh (obviously with other stuff around it).

I'm living in northern Europe in an old house, with pretty much only electric appliances for everything. We do have a wood stove and oven, but the absolute majority of our energy consumption is electricity. Roughly 24 000 kWh per year.

And while everything he brings up makes absolute sense, it seems like a moot point. Over here absolutely everyone knows this stuff; it's all just common knowledge. Today we went to the sauna and just turned a knob to fire up the 6,5 kW heaters inside the stove, and doing that also triggered a contactor to disengage some of the floor heating so that the thing doesn't overload the circuit. The old house we live in pulls 3-4 kW from the grid during the winter just to keep the inside nice and warm. And that's with heat pumps; we have mini-split units both in the house and in the garage. I also have a 9 kW pure electric construction heater around to provide extra heat in case the cheap mini-split in the garage freezes up and needs more heat to thaw the outside unit.

And kW and kWh are still commonly used measurements even if you don't use electricity. Diesel or propane heaters have labels on them stating how many watts they can output, right next to the fuel consumption per hour, and so on. So I'm just wondering if this is really any new information for anyone.

I assume there's a lot of people here from the US and other countries with a gas grid (which we don't really have around here); is it really so that your Joe Average can't tell the difference between 1 kWh of heat produced by gas compared to electricity? I get that pricing for different power sources may differ, but it's still watt-hours coming out of the grid. Optimizing their usage may obviously be worth the effort, but that's got nothing to do with power consumption itself.
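The distinction itself is just multiplication: power (kW) is a rate, energy (kWh) is that rate times time. Using the numbers from this post (the two-hour sauna session length is an assumption for illustration):

```python
# Power (kW) is a rate; energy (kWh) is power integrated over time.
# 24 000 kWh/year and the 6,5 kW sauna stove are the figures from the post.
yearly_energy_kwh = 24_000
hours_per_year = 365 * 24
average_power_kw = yearly_energy_kwh / hours_per_year
print(f"average draw: {average_power_kw:.2f} kW")      # roughly 2.74 kW

sauna_stove_kw = 6.5
session_hours = 2                                      # assumed session length
print(f"one sauna session: {sauna_stove_kw * session_hours:.1f} kWh")
```

The same arithmetic works regardless of whether the watt-hours come from electricity, gas or diesel, which is the point above.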

So, please help me understand the situation a bit more in depth.

1
submitted 6 months ago by IsoKiero@sopuli.xyz to c/selfhosted@lemmy.world

So, I've been pushing my photos to a local Immich instance and I'll need some kind of file storage too soon; the total amount of data is roughly 1,5 TB.

Everything is running on a Proxmox server and that's running somewhat smoothly, but now I'd need to get it backed up offsite. I'm running a VPS at Hetzner and they offer pretty decently priced S3 storage, or a 'Storage Box', which is just a raw disk you can connect via SMB/NFS and others.

Now, the question is: how do I set up automated backups from Proxmox to either of those solutions? I suppose I could just mount either one on the host locally and set up backup paths accordingly, but should the mount drop for whatever reason, is Proxmox smart enough to notice that the actual storage is missing and not fill the small local drive with backups?

Encryption would be nice too, but that might be a bit too much to ask. I have enough bandwidth to manage everything, and after the initial upload the data doesn't change that much. The only question is: what is the best practice for doing this?
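For what it's worth, Proxmox directory storages have an `is_mountpoint` flag that refuses to use the path unless something is actually mounted there, which addresses exactly the "mount dropped" worry. A minimal sketch (the storage name, mount path and retention are made-up placeholders):

```
# /etc/pve/storage.cfg fragment (sketch, not a full config)
dir: hetzner-backup
        path /mnt/storagebox
        content backup
        is_mountpoint yes
        prune-backups keep-last=3
```

With that in place, scheduled vzdump backups pointed at `hetzner-backup` error out instead of silently writing to the local disk when the mount is gone.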

12

In my house I have 3 circuits of floor heating elements. Each is connected to a single-phase (230V) 16A fuse, and on one of them I have a Heatit Z-TRM3 connected via Z-Wave to my Home Assistant. The others are manually controlled dumb thermostats.

That thing works, but at least the particular one I got causes a lot of interference on the Z-Wave network, so I'm a bit hesitant to add any more of those.

Features I must have:

  • Option for a floor temperature sensor. Each thermostat has a separate pipe going to the floor and the floor sensor is easy enough to replace, but it is a must-have option.
  • Air temperature sensor. 2/3 of the heating elements are in a concrete slab, which means the measured temperature of the slab only very slowly affects the air temperature, so I need both. An ESP32 or equivalent as a separate sensor might be a decent workaround, but I'd rather have both on a single unit.
  • Obviously the 230V 16A capability, as that's what they're wired for, and even if I don't have 3 kW elements in the floor it's what's needed to meet code.
  • Manual controls on the device itself. Should my Raspberry Pi running Home Assistant kick the bucket, or some other major issue with the automations happen, I still need a way to control the device. And that's a strict requirement: no Bluetooth apps on the phone or anything, I must have manual buttons or some other way to control the thing without Home Assistant or any other smart device.
  • In addition to the previous one: no cloud requirement. Letting the device onto the internet for setup is fine, but in the long run it must be happy in an isolated network without internet connectivity.
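If anyone goes the separate-ESP32 route for the air sensor, the ESPHome side is small. A sketch, assuming a DHT22 on GPIO4 (the device name, pin and sensor model are placeholders to adapt to the actual hardware):

```yaml
# Hypothetical ESPHome config for a standalone air-temperature sensor
esphome:
  name: floor-heating-air-sensor

esp32:
  board: esp32dev

wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password

# Native Home Assistant API, no cloud involved
api:

sensor:
  - platform: dht
    pin: GPIO4
    model: DHT22
    temperature:
      name: "Floor Heating Air Temperature"
    update_interval: 60s
```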

For the communication I don't really care. I currently have only WiFi/Z-Wave as options, but if there's something on Zigbee which ticks all the boxes I can invest in a USB dongle or a hub.

Price is obviously a concern, but it's hard to set any strict boundaries. I won't throw 1000€ at a thermostat, but anything even remotely reasonable goes.

What are your suggestions for a situation like this?

32
submitted 8 months ago by IsoKiero@sopuli.xyz to c/selfhosted@lemmy.world

So, as the topic says, I'm going to set up a self-hosted email service for myself, family and friends. I know that this is a controversial topic around here, but trust me when I say I know what I'm getting into. I've had a small hosting business for years and I've had my share of issues with Microsoft and others; I know how to set things up and keep them running and so on.

However, on the business side we used both a commercial solution and a dirt-cheap service with just IMAPS/SMTPS and webmail with Roundcube. The commercial one (Kerio Connect, a neat piece of software, check it out if you need one) is something I don't want to pay for anymore (even if their pricing is pretty decent, it's still money out of my pocket).

I know for sure I can rely on the bog-standard postfix+dovecot+spamassassin combo, and it will work just fine for plain email. However, I'd really like to have calendar and contacts in the mix as well, and as I've only worked with a commercial solution for the last few years, I'm not up to speed on what the newest toys can offer.

I'm not that strict on anything, but the thing needs to run on Linux and it must support the most basic standards: messages stored in Maildir format (simplifies migration to another platform if things change), support for Sieve (or another commonly supported protocol), and contacts/calendar need to work with pretty much anything (Android, iOS, Linux, Windows, Mac...) without extra software on the client end (*DAV excluded, those are fine in my books). And obviously the thing needs to work with IMAPS, SMTPS, DKIM and other necessities, but that should be implied anyway.
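As a concrete illustration of the Sieve requirement: server-side filtering scripts along these lines, stored and run by the IMAP server itself (the list address and folder name here are made-up placeholders):

```
require ["fileinto"];

# File mailing-list traffic into its own folder, before it hits the inbox.
if header :contains "List-Id" "example-list.example.com" {
    fileinto "Lists/example";
}
```

Dovecot runs these via its Pigeonhole plugin, which is one reason the postfix+dovecot combo covers the "commonly supported protocol" requirement.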

I know that things like Zimbra, SOGo and iRedMail exist, but as mentioned, it's been a while since I've played with things like that, so what are your recommendations for a setup like this today?

[-] IsoKiero@sopuli.xyz 53 points 9 months ago

This is the same as complaining that my job puts a filter on my work computer that lets them know if I’m googling porn at work. You can cry big brother all you want, but I think most people are fine with the idea that the corporation I work for has a reasonable case for putting monitoring software on the computer they gave me.

European point of view: my work computer and the network in general have filters so I can't access porn, gambling, malware and other such stuff. They have monitoring for viruses and malware; that's pretty normal and a well-understood need. BUT. It is straight up illegal for my employer to actively monitor my email content (they'll of course have filtering for incoming spam and such), my chats on Teams/whatever, and in general to be intrusive of my privacy even at work.

There are of course mechanisms in place where they can access my email if anything work-related requires it. So in case I'm lying in a hospital or something, they are allowed to read work-related emails from my inbox, but anything personal is protected by the same laws which apply to traditional letters and other communication.

Monitoring 'every word' is just not allowed, no matter how good your intentions are. And that's a good thing.

11

I'm not quite sure if electronics fit in with the community, but maybe some of you could point me in the right direction with ESPHome and an IR transmitter to control the mini-split heat pump in my garage.

The thing is the cheapest one I could find (I should've paid more, but that's another story). It's rebranded cheap Chinese crap, and while the vendor advertised that you could control it over WiFi, I didn't find any information beyond 'use SmartApp to remote control' (or whatever that software was called), but it's nowhere to be found and I don't want to let that thing onto the internet anyway.

So, IR to the rescue. I had an 'infrared remote control module' kit around, and with an Arduino Uno I could capture IR codes from the remote without issues.

But transmitting those back out seems to be a bit more challenging. I believe I got the configuration in place, and I even attempted to control our other heat pump with the IR Remote Climate component, which should have support for it out of the box.
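For reference, the ESPHome side of this is typically just two pieces. A sketch, where the GPIO pin and the `coolix` protocol are assumptions (many cheap rebranded units speak Coolix, but not all):

```yaml
# Transmitter tied to the pin driving the IR LED (via a transistor/MOSFET)
remote_transmitter:
  pin: GPIO14
  carrier_duty_percent: 50%   # IR receivers expect a ~38 kHz carrier

climate:
  - platform: coolix          # example protocol; match it to the actual unit
    name: "Garage Heat Pump"
```

If none of the built-in climate protocols match, the captured raw codes can be replayed with `remote_transmitter.transmit_raw` instead.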

I tried to power the IR LED straight from a NodeMCU pin (most likely a bad idea) and via an IRFZ44N MOSFET (massive overkill, but it's what I had around) from the 3.3V rail. The circuit itself seems to work, and if I replace the IR LED with a regular one it's very clear that the LED lights up when it should.

However, judging by the amount of IR light I can see through the cellphone camera, it feels like either the IR LED is faulty (very much a possibility, what can you expect from a 1€ kit) or I'm driving it wrong somehow.

Any ideas on what's wrong?

40
submitted 2 years ago by IsoKiero@sopuli.xyz to c/linux@lemmy.ml

I think the installation was originally 18.04, and I installed it when it was released. A while ago anyway, and I've been upgrading it as new versions roll out, but with the latest upgrade and the snapd software it has become more and more annoying to keep the operating system happy and out of my way so I can do whatever I need to do on the computer.

Snap updates have been annoying, and they randomly (and temporarily) broke stuff while some update process was running in the background, but as a whole reinstallation is a pain in the rear, I've just swallowed the annoyance and kept the thing running.

But today, when I planned to spend the day on paperwork and other "administrative" things I've been pushing off due to life being busy, I booted the computer and the primary monitor was dead, the secondary had a resolution of something like 1024x768, the NVIDIA drivers were absent, and usability in general just wasn't there.

After a couple of swear words I thought, OK, I'll fix this: I'll install all the updates and make the system happy again. But no. That's not going to happen, at least not very easily.

I'm running LUKS encryption and thus I have a separate boot partition. 700 MB of it. I don't remember if the installer recommended that or if I just threw some reasonable-sounding amount at the installer. No matter where it originally came from, it should be enough (the other Ubuntu machine I'm writing this on has 157 MB stored on /boot). I removed older kernels, but the installer still claims that I need at least 480 MB (or something like that) of free space on /boot, while a single kernel image, initrd and whatever else it includes consumes 280 MB (or so). So apt just fails on upgrade, as it can't generate a new initrd or whatever it tries to do.
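For anyone in the same spot, the standard Debian/Ubuntu sequence for figuring out and reclaiming /boot space looks like this (nothing here is specific to my machine):

```shell
df -h /boot                   # how full the boot partition really is
du -sh /boot/*                # what's eating the space (kernels, initrds)
dpkg -l 'linux-image-*'       # which kernel packages are installed
sudo apt autoremove --purge   # remove kernels apt no longer considers needed
```

In my case autoremove wasn't enough, since even a single kernel plus initrd no longer fits under the headroom the upgrader demands.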

So I grabbed my Ventoy drive and downloaded the latest Mint ISO onto it, and instead of doing the productive things I had planned, I'll spend a couple of hours reinstalling the whole system. It'll be quite a while before I install Ubuntu on anything.

And it's not just this one broken update; like I mentioned, I've had a lot of issues with the setup, and at least the majority of them were caused by Ubuntu and its package management. This was just the tipping point to finally leave that abusive relationship with my tool and set things up so that I can actually use the computer instead of figuring out what's broken now and next.

5
submitted 2 years ago* (last edited 2 years ago) by IsoKiero@sopuli.xyz to c/homeassistant@lemmy.world

Maybe this hivemind can help out with debugging a Z-Wave network. I recently installed two devices on the network (currently up to 15 devices): two repeaters, light switches, wall plugs, a thermostat and a couple of battery-operated motion sensors.

Before the latest addition everything worked almost smoothly; every now and then a motion sensor message didn't go through, but it was rare enough that I didn't pay too much attention to it, as I have plenty of other stuff to do besides tinkering with the occasional hiccup in home automation.

However, for the last 48 hours (or so) the system has become unreliable enough that I need to do something about it. I tried to debug the messages a bit, but I'm not too familiar with what to look for. These messages are frequent, and they seem to be a symptom of the issue:

Dropping message with invalid payload

[Node 020] received S2 nonce without an active transaction, not sure what to do with it

Failed to execute controller command after 1/3 attempts. Scheduling next try in 100 ms.

Especially the 'invalid payload' message appears constantly in the logs. I'd guess that one of the devices is malfunctioning, but another option is that there's somehow a loop in the network (I did attempt to reconfigure the whole thing, which didn't change much) or that my RaZberry 7 Pro is faulty.

Could someone give a hint on how to proceed and verify which of these is the case?

Edit: I'm running Home Assistant OS on a Raspberry Pi 3.

7
submitted 2 years ago* (last edited 2 years ago) by IsoKiero@sopuli.xyz to c/homeassistant@lemmy.world

I've been trying to get a bar graph of Nord Pool electricity prices, but for some reason the graph style won't change no matter how I configure it.

I'm running Home Assistant OS on a Raspberry Pi 3:

  • Home Assistant 2023.10.1
  • Supervisor 2023.10.0
  • Operating System 10.5
  • Frontend 20231005.0 - latest

Currently my configuration for the card is like this:

type: custom:mini-graph-card
name: Pörssisähkö
entities:
  - entity: sensor.nordpool
    name: Pörssisähkö
    group-by: hour
    color: '#00ff00'
    show:
      graph: bar

But no matter how I change that, the graph doesn't change, and other options, like a line graph with/without fill, don't work as expected either. Granted, I'm not that familiar with YAML or Home Assistant itself, but this is something I'd expect to "just work", as the configuration for mini-graph-card is quite simple. It displays the correct data from the sensor, but only as a line graph.

Is this something a recent update broke, or am I doing something wrong? I can't see anything immediately wrong in any logs or the JavaScript console.
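One likely culprit, for anyone hitting the same wall: mini-graph-card expects `group_by` (with an underscore) and `show` as card-level options, not nested under an entity, where unknown keys are silently ignored. A corrected version of the card above would look something like:

```yaml
type: custom:mini-graph-card
name: Pörssisähkö
entities:
  - entity: sensor.nordpool
    name: Pörssisähkö
    color: '#00ff00'
group_by: hour
show:
  graph: bar
```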

211

cross-posted from: https://derp.foo/post/250090

There is a discussion on Hacker News, but feel free to comment here as well.

22
submitted 2 years ago* (last edited 2 years ago) by IsoKiero@sopuli.xyz to c/selfhosted@lemmy.world

This question has come up a couple of times already, but I haven't found an option which would allow multiple users on multiple OSes (Linux and Windows mostly; mobile support, both Android and iOS, would be nice at least for viewing) to conveniently share the same storage.

This has been an issue on my network for quite some time, and now that I've rebuilt my home server, installed TrueNAS in a VM and am organizing my collections over there with Shotwell, the question has become acute again.

digiKam seems promising for everything except organizing the actual files (which I can live with; either Shotwell or a shell script to sort them by EXIF dates), but I haven't tried it yet with Windows, and my Kubuntu desktop seems to only have a Snap package of it, without support for an external SQL database.
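The EXIF-date shell script mentioned above is essentially a one-liner with exiftool (assuming it's installed; the `./incoming` and `sorted/` directories are made-up placeholders):

```shell
# Move photos into sorted/YYYY/MM based on the EXIF DateTimeOriginal tag,
# recursing into subdirectories of ./incoming.
exiftool -r '-Directory<DateTimeOriginal' -d sorted/%Y/%m ./incoming
```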

On the "editing" part it would be pretty much sufficient to tag photos/folders with different events, locations and stuff like that, but it would be nice to have access to the actual file in case some real editing needs to be done. I suppose an SMB share on TrueNAS will accomplish that close enough.

Another need-to-have feature is to manage RAW and JPG versions of the same image at least somehow. Even removing the JPGs and leaving only the RAW images would be sufficient.

And finally, I'd really like to have the actual files lying around on a network share (or somewhere) so that they're easy to back up, copy to an external Nextcloud for sharing, and in general give more flexibility in the future in case something better comes up or my environment changes.


IsoKiero

joined 2 years ago