• 5 Posts
  • 29 Comments
Joined 1 year ago
Cake day: June 29th, 2023

  • Oh, I back up religiously since Blue failed right after I moved, and I back up my backups on my laptop as well. (Literally failed: I lost everything and had to run photorec and three other tools to pick out everything I’d done for the previous six months, which I hadn’t yet copied to a backup on my server because I was prepping to move at the time.)

    So far, OTBR is the biggest stopping issue, since HA runs it but nothing sticks. I admit, moving zwave is my actual biggest dread; zigbee I can probably do in a weekend, but zwave is such hell to unpair and re-pair (though it makes up for it by sticking forever). That’s part of the reason I love Thread and Matter: they’re almost as sticky as zwave once they pair, and while pairing them is variable (sometimes fast, sometimes not so much), they re-pair themselves pretty consistently if the outage is under 24 hours, and you can deliberately unpair them fairly easily.


  • I’ve been running Home Assistant for roughly five or six years (Pi, then Blue, now Amber, plus a second instance on my server for network integrations like nmap and netgear), but since my SmartThings hub was taking care of zigbee/zwave, until now I used HA as a coordinator for every smart-device ecosystem I was using (Hue, Wyze, Ring, Blink, Alexa, August, Arlo, et al). Sorry that wasn’t clear.

    While I’ve started slowly adding zigbee devices directly, I haven’t started with zwave, and thread isn’t working for me yet (OTBR is running but nothing sticks). And I really don’t want my hub to fail and leave all my thread/matter devices useless when I don’t have anything else that can access them.







  • So it can be done; it just required a lot of steps and me making a mapping spreadsheet of all the containers. But! Automations and scripts run in the homeassistant container, while when you ssh in, you’re going into the ssh addon container, which should have been obvious and really was once I finished mapping all the containers.

    Goal: I need my scripts in /usr/local/bin in the ssh addon container so I can run them over ssh and access my function library script easily without typing ./path/to/script.

    Summary: ssh into HAOS from the homeassistant container with an HAOS root user (port 22222), run docker exec to get into the ssh addon container, then make your symlinks for /usr/local/bin.

    (Note: this is ridiculously complicated and I know there has to be a better way. But this works so I win.)

    1. Get access to HAOS itself as root: https://developers.home-assistant.io/docs/operating-system/debugging. Verify you can log in successfully (a sample login check is right after this list).
    2. In homeassistant container:
    • a. create an .ssh folder (/config/.ssh)
    • b. add the authorized_keys file you made for step one.
    • c. add the public and private keys you made for step one (should be in the ssh addon container).
    • d. set permissions:
    chmod 600 /config/.ssh/authorized_keys
    chmod 600 /config/.ssh/PRIVATE_KEY
    chmod 644 /config/.ssh/PUBLIC_KEY
    chmod 700 /config/.ssh
    
    • e. In /config/shell_scripts.yaml or wherever you put your shell scripts, add the script you want to use to update /usr/local/bin: UPDATE_BIN_SCRIPT: /config/shell_scripts/UPDATE_BIN_SCRIPT
    • f. Restart HA.
    • g. Check it in Developer Tools->Services
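
    For step 1, the login check is just the HAOS debug ssh on port 22222; once the keys from step 2 are in place, you can run it straight from the homeassistant container. A minimal sketch, using the same PRIVATE_KEY and HA_IP_ADDRESS placeholders as the script further down:

    # Sanity check: can the homeassistant container reach HAOS root ssh on port 22222?
    # PRIVATE_KEY and HA_IP_ADDRESS are placeholders, same as in the update script below.
    ssh -i /config/.ssh/PRIVATE_KEY -p 22222 root@HA_IP_ADDRESS 'echo "HAOS login works on $(hostname)"'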

    I have no idea how consistent the ssh addon container name usually is, but it’s different on all three of my installs, so insert your own container name for SSH_ADDON_CONTAINER_NAME.
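
    One way to find yours: list the containers from the HAOS side over that port 22222 login and grep. This assumes the addon’s container name at least contains “ssh”, which I’d expect but can’t promise.

    # List running containers on HAOS and look for the ssh addon's name.
    # Run this from the port 22222 root session (or wrap it in the same ssh call as the script below).
    docker ps --format '{{.Names}}' | grep -i ssh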

    Steps: log in to HAOS, go into the SSH container, and do the update. This is horribly messy but hey, it works.

    UPDATE_BIN_SCRIPT

    #!/bin/bash

    # OPTIONAL: update some of the very outdated alpine packages in both the
    # homeassistant and ssh addon containers (figlet makes cool ascii art of my
    # server name). You'll need to run it twice: once for the homeassistant
    # container, then again in the ssh container. Assuming you want to update
    # packages, anyway.

    # update homeassistant container packages
    apk add coreutils figlet iproute2 iw jq ncurses procps-ng sed util-linux wireless-tools

    # ssh into HAOS, docker exec into the ssh addon container, update its packages,
    # and create the /usr/local/bin symlinks if they don't already exist
    ssh -i /config/.ssh/PRIVATE_KEY -p 22222 root@HA_IP_ADDRESS << EOF
        docker exec SSH_ADDON_CONTAINER_NAME \
        bash -c \
        'apk add coreutils figlet iproute2 iw jq ncurses procps-ng sed util-linux wireless-tools; \
        if [ ! -h /usr/local/bin/SCRIPT1 ]; then echo "SCRIPT1 does not exist"; \
        ln -s /homeassistant/shell_scripts/SCRIPT1 /usr/local/bin/SCRIPT1; echo "Link created"; \
        else echo "Link exists"; fi; \
        if [ ! -h /usr/local/bin/SCRIPT2 ]; then echo "SCRIPT2 does not exist"; \
        ln -s /homeassistant/shell_scripts/SCRIPT2 /usr/local/bin/SCRIPT2; echo "Link created"; \
        else echo "Link exists"; fi'
    EOF

    echo "Done"
    

    I am going to feel really stupid when I find out there’s a much easier way.
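
    In the meantime, to check the links actually landed, something like this from a regular ssh session into the addon should do it (same SCRIPT1/SCRIPT2 placeholders as above):

    # Quick check from inside the ssh addon: are the symlinks there and do they resolve?
    ls -l /usr/local/bin/SCRIPT1 /usr/local/bin/SCRIPT2
    command -v SCRIPT1   # should print /usr/local/bin/SCRIPT1 if the link is on PATH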


  • > Docker containers are designed to be immutable. The moment they’re stopped and recreated, any changes to them are thrown out. You’re supposed to add a layer to your Docker image if you want to add command-line tools and such. That’s why it’ll keep deleting your stuff every time you update.

    It took me until I put Home Assistant on my server in a docker container to realize what was going on there. I use docker more now, but it’s really, really nothing like this.

    > Running the script inside Docker should put it in the right place, but I wouldn’t advise doing it that way.

    That’s what I’ve been doing manually over regular ssh (not the 22222 port one).

    > To work around the path issue, maybe consider using hard links rather than soft links?

    That’s what I think I need to do, but the only ‘hard’ links–at least according to multiple find -name/find -iname searches on the ssh 22222 port–are all in /mnt/data/docker/overlay2 and /var/lib/docker/overlay2. I get there’s a working pattern with the overlays but dear God why.

    > Alternatively, you could figure out where HAOS stores the Docker config and add a volume definition of your own. You’ll probably be able to put all of your files in /usr/local/bin by adding a line like “- /path/home/host:/usr/local/bin” in the right place. I don’t know where this config is stored, though.

    Okay that makes sense. I guess the first step is to get the container structure and volume.
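
    (For my own notes: that volume line is just a bind mount. In plain docker run terms, separate from however HAOS actually wires up the addon, it would look something like the line below; the host path and image are made up.)

    # A bind mount of that shape: the host folder appears as /usr/local/bin inside the
    # container and survives the container being recreated. Paths and image are examples only.
    docker run --rm -v /path/on/host:/usr/local/bin alpine:3.19 ls /usr/local/bin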

    Thanks so much! I’ll update if I find the solution or die trying.






  • Logically, I want to say no, not really, but I also would have thought the blackout and ongoing protests wouldn’t really affect Reddit and they’d ignore it. Reddit itself, however, seems incredibly determined to pursue a course of action that requires performing This Does Not Affect Us At All as dramatically and publicly as possible at the slightest opportunity, whether anyone cares or not. That doesn’t even include the admins playing subreddit roulette across actively rebelling subs, subs deep in malicious compliance, and subs that have no idea wtf is going on and just want to talk about their weird NSFW fetish in peace.

    So no, I don’t think so, but I’m beginning to wonder if Reddit thinks there is and what they’re seeing on their side that I’m not.


  • I semi-regularly distro-hop, but Xubuntu is the distro I keep coming back to between hops to take a break or when one goes (temporarily) dormant. It’s currently running on my primary server/linux machine.

    Reasons: 1.) It’s light on resources. 2.) It’s very simple and clean. 3.) It works with all the programs I use regularly; only one needs to be hand-compiled (but that one has to be compiled for literally any Linux machine). 4.) I know it. Scrub/partition/install/configure in under an hour, and I can pick up any of my projects again immediately where I left off.


  • The only reason I have social media accounts under my wallet name is to avoid anyone wondering why I’m not on social media (also: grandparents). Everyone IRL who I care enough about to actually explain it to knows I log in once a year in a separate browser (in incognito), check every privacy setting against my checklist, and update anything important (like a job change). LinkedIn I check regularly, but that’s because a.) I only connect with people from work and a lot of them do think it’s important to have strong networks (and they could be right, no idea), and b.) LinkedIn has an education section that my job really likes because it has free classes, and when I get bored at work, I can do a quick class in something (nothing they actually want us to do; I have to work in the nightmare that is Agile, do not make me take yet another class about the benefits of this software development hellscape, thanks).

    Honestly, I try to give the impression I’m not into social media IRL; there are like, three people in my daily life who are allowed into my online life and one because we more or less both got the internet at the same time and started a mailing list together. Don’t get me wrong, I know a lot of nice people IRL, but not the type I want to introduce to the friends I made online.



  • Seperis@lemmy.world to Linux@lemmy.world · Why I prefer Linux
    > TP-Link AC600

    Oops, this was meant as a reply to someone about the TP-Link AC2100 router in another window, ugh. Too many google results open.

    Let me google the chipset for that one if you haven’t found drivers that work yet. For some of the Realtek-based ones, there are drivers by morrownr you can compile yourself.



  • Seperis@lemmy.world to Linux@lemmy.world · Why I prefer Linux

    God, I just did the set-up-a-new-laptop process on Sunday; I completely forgot how insanely long everything takes to set up, update, configure, etc. A Linux SBC: maybe an hour end to end; install, update, all my configs neatly in a file, ready to be copied over. Regular Linux: two hours end to end at most. You just do not appreciate the beauty of apt update/apt install quite as much as the moment you are confronted with a new Windows install.

    Windows? Pretty much most of Sunday afternoon and evening. First the Dell updates, then the driver updates, then the pre-installed program updates, then the Windows updates (though not in that order and not all at once, because predictability what is that). Then I could actually start adding my programs and configuring it, and oh boy.

    Just my base configuration for Office–that being each individual program in the suite, God knows–required a hunting expedition and a lot of googling to track everything down in multiple locations, and I still had to do a lot of it manually; putty and kitty required copying bits of the registry; calibre I gave up on, since it was less work to redo it myself from memory; firefox was the only thing where I could just copy and paste a folder and be entirely done. That part was nice. Every other program I needed had to be tracked down and installed separately, then its configs hunted up in multiple locations, and Windows kept interrupting the process because oh, we forgot, here’s more updates and one to three restarts. Why?

    And Windows 11’s start menu is just insulting; talk about salt in the wound.


  • I started vaping seven years ago as a way to quit smoking; I smoked my last cigarette literally outside the vape store before walking in and asking what I should buy to pull this off, since nothing else had worked. The transition was seamless; not only did I never even crave a cigarette again, I very quickly learned to loathe the smell of cigarettes once my full range of smell came back. There’s not even a temptation to start up again.

    It also helps that I choose vapes that smell amazing.

    I am still vaping, yes, but I’m stepping down my nicotine pretty much every two years. I started at 24 and am now at 15 (I was stuck at 18 for a while). Those transitions I can definitely feel, but I can start by adjusting my mod’s wattage and airflow and using different coils for a bit, and ease into it so that once I step down, there’s no chance I step back up; then I sometimes reward myself with a new fancy mod with a touchscreen and more LEDs, or a cooler tank, or something. All that, and I am spending an order of magnitude less than I ever did on cigarettes, and I have the math to prove it.

    It’s certainly not ideal and yeah, it’s slow and basically only progressively reducing harm, but it’s a process that for me is guaranteed to work with no backtracking and progress is assured.


  • Oh thank God. Normally I know how to read (since kindergarten), but in the time between posting and your reply, I hit a very unwilling thirty-six hours awake, so I low-grade panicked that actually, it only read as normal to me and I was lecturing people on becoming vegan fascists or something.

    I am still thinking on the article, but it’s going to need a couple more reads to put it in context. I’m still trying not to form really firm opinions about much on the Fediverse yet, since I seriously do not know enough, and yes, even I find it hilarious when I have to backtrack from a really stupid position, but I can save the public embarrassment for later. Lemmy’s still young; I have plenty of time for that.




  • I’m a QC analyst and we are fully Agile, so I’m required to attend every. team. meeting. Discovery, story point estimation, design spikes; any day can be poorly-handled-emotional-regulation day, and whoever’s feeling it makes it everyone’s problem when all we want is to finish a few maintenance items and maybe add a comma to some text. Though the testers have nothing to do with any of this after story pointing until there’s actual code migrated to one of the testing environments, we are forced to bear witness to entire dev teams made up of people from three to eight countries whose only common language is English; as often the only native speaker, I am the only one who can’t mutter not very goddamn quietly in a native tongue no one else understands; this may have been my motivation at one point to learn Welsh on Duolingo. A Project Manager making three times more than anyone else in the room sometimes swoops in during SCRUM two weeks into our sprint cycle to be perky at us and–on far too many occasions for this to be random–informs us the acceptance criteria had a couple of updates before swooping back out to PM something else’s life. We all hate her quietly until someone who went to check JIRA notes there are double the number of criteria and the user story is not the same in any way; then everyone but me gets to hate her verbally with no one the wiser. I maintain a bitterly grudging silence because everyone in the room speaks English, sometimes better than I do, and they have all been in Texas long enough to pick up conversationally hostile Spanish. Our scrum master will either grimly pretend it’s always been this way or very blatantly not care.

    At final demo, as the tester, I will perform a dramatic rendition of ‘page with comma’ and ‘title: justification left’, or run batch scripts in terminal while they watch absolutely nothing happening and nod wisely. Half the people in attendance wear suits for a living and have never used a computer; they have secretaries for that. Two worked with my mom and are quietly judging my performance and finding me lacking. One stakeholder will ask a thousand questions, five of which have any relation to what we’re doing, and I am expected to answer with no discernible change in my performance. Someone is watching TV and can’t be fucked to turn down the volume. Everyone else sits in eerie silence and I might hear a snore. Every one of these people is considered qualified to decide whether we did a good job and sign off on it so we can finally end the sprint and the code can be added to the next release to production. No one feels a sense of relief or satisfaction; at least one dev hasn’t slept since the PM destroyed our lives and may be clinically insane.

    Our sprints last four weeks with a prep week in between; we experience some version of this cycle of dev hell roughly eight times a year, sometimes with the legislature making their lack of time management all of our problem. Only one sprint will go as planned. One.

    The worst part is: despite this, knowing full well what hell is before me, I went back to college for software development of my own free will.