• 2 Posts
  • 67 Comments
Joined 2 years ago
Cake day: December 12th, 2023

  • My server mysteriously stopped working in December. After a scheduled restart, the OS wouldn’t load, so the fan was running on high for a few days while I was staying at a friend’s.

    I checked the logs and couldn’t find anything suspicious. Loaded a previous backup that worked and still nothing loaded on startup. Tested the Pi 5 with a USB drive that had a fresh Alpine Linux install on it and everything loaded up fine so I was able to rule out any hardware issues. The HDD with the old OS mounted just fine to my laptop. I still have no idea what happened.

    This happened a few days before my domain name expired, and I was planning to change the domain to something shorter anyway. Decided to hold off on remaking my server from scratch until I finish a few other projects.

    The other projects will help me manage my network-connected devices, so it’s all working towards a common goal. Fortunately I am getting very close to finishing them; I am putting the final touches on the last one and should be done within a few days.

    Next I’ll reinstall HomeAssistant on my Pi 4 to fix its networking issue. Only the terrarium grow lights are affected, and my gecko chose to hibernate outside of the terrarium this winter so she’s unaffected (the heat lamps are controlled by a separate, isolated device). After that I’ll fix my Pi 5 server, and this time go with Podman over Docker.


  • It’s been a learning experience. I am pretty much building it specifically for my own use, so it’s missing lots of stuff that’s standard on other fully featured OSes. I’m mostly using a browser, the Konsole terminal and KDE’s Kate as my editor.

    I found an unexpected hobby in writing POSIX scripts because it’s teaching me the inner workings of Linux. In the future I’d like to expand that to include sed and awk, but I haven’t really found a project to use them on yet.

    Alpine Linux does offer a setup-desktop command/script which will easily set up a few desktop environments such as Gnome, Plasma, Xfce, Mate, Sway and Lxqt. That only gives you the basic desktop environment, so a fair amount of extra work is still needed for things like sound and graphics.
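
    Running it is basically a one-liner (sway here is just the one I’d pick; if I remember right it prompts for a choice when you leave the argument off):

    # run as root, or through doas, on a fresh Alpine install
    doas setup-desktop sway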

    As I mentioned before, I still have Linux Mint DE installed. I mostly use it for Steam games, but it has everything else I need for the days I don’t feel like working out a problem just to open a .pdf file. Still, it’s really satisfying to set up a very specific work environment with the minimal tools I have available.

    I think there is value in learning to work within a minimal environment, to get more life out of the lower-spec hardware that’s currently available. Especially now with all the RAM supply issues because of the AI rush.



  • I don’t have any answers, just my own experiences. Last year I decided to use Alpine Linux as the operating system for a couple of self-hosted things running on a Raspberry Pi. I chose it because it’s super minimal and uses less common tools (for example doas instead of sudo). That unintentionally forced me to learn Linux with more basic commands that are more likely to be available on other Linux systems.

    Alpine Linux uses BusyBox ash, a POSIX compliant shell that’s very small and very basic. The scripts I ended up writing tend to be POSIX portable, meaning they should work on a wider variety of systems. That comes at the cost of script simplicity and readability, as well as missing out on many of the Bash features that make scripts more robust and easier to work with.
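
    Just as a rough illustration of the habits it forces (a made-up snippet, not from any real script of mine): no arrays, no [[ ]] tests, printf instead of echo -e, and so on.

    #!/bin/sh
    # POSIX-portable style: runs the same under ash, dash and bash
    set -eu
    name="world"
    if [ "$name" = "world" ]; then        # [ ] and =, not [[ ]] and ==
        printf 'Hello, %s\n' "$name"      # printf instead of echo -e
    fi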

    I have a working example of a POSIX portable script that I’ve been adding to as I learn new things. You can check it out here if you’re interested.

    I use Alpine Linux with Sway as my daily driver but still keep a copy of Linux Mint DE ready to use because it’s nice to have a fully featured work environment for the days I don’t want to think.




  • I created a file tree that mirrors my system’s file tree, except it only contains the files I’ve modified or added, along with their respective directories. From there I just use rsync to sync that file tree onto the system’s /.

    It’s convenient to see what changes I currently have, but it requires a bit of manual maintenance. I only really started doing it that way because I was learning how to use rsync, and I just kept going with it because it was working for me.
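
    As a rough sketch of the “see what changed” part (the tree name and host are placeholders, not my actual paths):

    # itemized dry run: shows what differs between the skeleton tree and the live system
    rsync -ain --checksum server-skeleton/ root@server:/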

    I’m only working with my laptop, Android phone and two Raspberry Pi’s, so I can get by with my little rsync-based setup.


  • I had a website that was set up for my personal use only. According to the logs, the only activity I ever saw was my own. However, it involves a compromise: obscurity at the cost of accessibility and convenience.

    First, when I set up my SSL cert, I chose to get a wildcard subdomain cert. That way I could use a random subdomain name and it wouldn’t show up on https://crt.sh/

    Second, I use an uncommon port. My needs are very low so I don’t need to access my site all the time. The site is just a fun little hobby for myself. That means I’m not worried about accessing my site through places/businesses that block uncommon ports.

    Accessing my site through a browser looks like: https://randomsubdomain.domainname.com:4444/
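
    On the server side that roughly translates to a Caddy site block like this (a sketch with placeholder names; the internal Kiwix address is an assumption, not my real config):

    # Caddy only answers on the uncommon port, for the one random subdomain.
    # The wildcard cert is obtained separately so the subdomain never shows up in CT logs.
    https://randomsubdomain.domainname.com:4444 {
        reverse_proxy kiwix:8080    # internal container port is a guess
    }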

    I’m going on the assumption that scrapers and crawlers search common ports to maximize the number of sites they can reach, rather than wasting time on uncommon ones.

    If you are hosting on common ports (80, 443) then this isn’t going to be helpful at all and would likely require some sort of third party to manage scrapers and crawlers. For me, I get to enjoy my tiny corner of the internet with minimal effort and worry. Except my hard drive died recently so I’ll pick up again in January when I am not focused on other projects.

    I’m sure given time, something will find my site. The game I’m playing is seeing how long it would take to find me.


  • There are a few things I back up from my phone.

    • Music downloaded from Seeker
    • Youtube audio downloaded from YTDLnis
    • Backups of Termux
    • Notes in plain text
    • Backups from certain apps that make their own backup data
    • Pictures that I have sorted and want to save

    I have an Android phone so I use Termux as a terminal emulator. I use ssh and passwordless keys to make transfers simpler and quicker.
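
    A typical transfer ends up being a one-liner along these lines (paths and the laptop’s name are placeholders):

    # from Termux: push the phone's music folder to the laptop over ssh
    rsync -av --progress -e ssh ~/storage/shared/Music/ user@laptop:backups/phone/music/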

    This is closer to a backup process than to something like SyncThing, which keeps a folder in sync between two devices. I don’t believe rsync can act like SyncThing on its own, but I’m tempted to dig into rsync more and see if I can put something basic together one day.



  • I have two systems that sort of work together.

    The first system involves a bunch of text files, one per task: OS installation, basic post-install tasks, and a file for each program I add (like UFW, AppArmor, ddclient, Docker and so on). They basically look like scripts with comments. If I want to, I can just copy/paste everything into a terminal and reach a specific state that I want to be at.

    The second system is a sort of “skeleton” file tree that only contains all the files that I have added or modified.

    Here's an example of what my server skeleton file tree looks like
    .
    ├── etc
    │   ├── crontabs
    │   │   └── root
    │   ├── ddclient
    │   │   └── ddclient.conf
    │   ├── doas.d
    │   │   └── doas.conf
    │   ├── fail2ban
    │   │   ├── filter.d
    │   │   │   └── alpine-sshd-key.conf
    │   │   └── jail.d
    │   │       └── alpine-ssh.conf
    │   ├── modprobe.d
    │   │   ├── backlist-extra.conf
    │   │   └── disable-filesystems.conf
    │   ├── network
    │   │   └── interfaces
    │   ├── periodic
    │   │   └── 1min
    │   │       └── dynamic-motd
    │   ├── profile.d
    │   │   └── profile.sh
    │   ├── ssh
    │   │   └── sshd_config
    │   ├── wpa_supplicant
    │   │   └── wpa_supplicant.conf
    │   ├── fstab
    │   ├── nanorc
    │   ├── profile
    │   └── sysctl.conf
    ├── home
    │   └── pi-user
    │       ├── .config
    │       │   └── ash
    │       │       ├── ashrc
    │       │       └── profile
    │       ├── .ssh
    │       │   └── authorized_keys
    │       ├── .sync
    │       │   ├── file-system-backup
    │       │   │   ├── .sync-server-fs_01_root
    │       │   │   └── .sync-server-fs_02_boot
    │       │   └── .sync-caddy_certs_backup
    │       ├── .nanorc
    │       └── .tmux.conf
    ├── root
    │   ├── .config
    │   │   └── mc
    │   │       └── ini
    │   ├── .local
    │   │   └── share
    │   │       └── mc
    │   │           └── history -> /dev/null
    │   ├── .ssh
    │   │   └── authorized_keys
    │   ├── scripts
    │   │   ├── automated-backup
    │   │   └── maintenance
    │   ├── .ash_history -> /dev/null
    │   └── .nanorc
    ├── srv
    │   ├── caddy
    │   │   ├── Caddyfile
    │   │   ├── Dockerfile
    │   │   └── docker-compose.yml
    │   └── kiwix
    │       └── docker-compose.yml
    └── usr
        └── sbin
            ├── containers-down
            ├── containers-up
            ├── emountman
            ├── fs-backup-quick
            └── rtransfer
    

    This is useful to me because I can keep track of every change I make. I even have it set up so I can use rsync to quickly chuck all the files into place after a fresh install or after adding/modifying files.
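
    The “chuck everything into place” step is basically one command (a sketch; the local tree name and host are placeholders):

    # push every tracked config file from the skeleton tree onto the fresh install
    rsync -av server-skeleton/ root@server:/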

    I also created and maintain a “quick install” guide so I can install a fresh OS, rsync all the modified files from my skeleton file tree into place, then run through all the commands in my quick install guide to get myself back to the same state in a minimal amount of time.


  • I actually started with RPi’s. The first one, a used Pi 4b, is dedicated only to HomeAssistant. I don’t tinker with it anymore because it does what I want and I don’t want unexpected downtime when I have to use the bathroom or use the lights in my room.

    I bought a used Pi 5 with the intention of upgrading later. In life I am quite minimal and find joy in using what few tools and materials I have to create something new. That seems to hold true for technology and scripting too. The Pi 5 with an old USB3 HDD is actually way more power than I can currently use, or can imagine using, for a long time. The extra room to work is convenient though.

    I’ll have a look at some of the places you suggested; they seem like good places to draw inspiration from, thank you.


  • I started out rewriting my network backup scripts, only to realize I was adding functionality to a previous script I wrote to automatically mount and dismount LUKS encrypted volumes. I still want to type in my LUKS passphrase because I don’t want everything automated, and I prefer to keep that inconvenience as an additional security measure for some of my data.
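
    The manual flow is short enough that the passphrase prompt is really the only cost (a sketch; the device, mapper and mount point names are placeholders):

    # open the encrypted volume; this is where I type the passphrase
    cryptsetup open /dev/sda1 backupvol
    mount /dev/mapper/backupvol /mnt/backup
    # ...run the backup scripts...
    umount /mnt/backup
    cryptsetup close backupvol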

    I also came to the realization recently that the reason I don’t relate strongly to other self-hosters is because I’ve unknowingly been trying to create a minimal self-hosted system that is more beneficial to small, low-powered devices.

    I’ve been using Alpine Linux, installing only the bare, older but well-established tools, and creating scripts based solely on those tools instead of seeking out bigger, more complicated modern ones. For example, building workflows around nothing but rsync, or using https://github.com/RayCC51/BashWrite to generate a static blog site with only bash and GNU sed.

    At least now that I’m aware of this, I can keep an eye out for such projects or communities and would hopefully be able to contribute something in that direction.


  • I experienced gatekeeping issues long before I got into self-hosting specifically. Years ago I wanted to learn C++ for Arduino and I was constantly talked down to for asking questions.

    “Why don’t you just do …” in response to a question feels very rude as a newcomer, because it feels like being talked down to for not knowing what others already know. Even when I made a point of showing I was trying to learn on my own, I was still belittled.

    I’m all for hearing different ways of approaching my issue but from the replies, it often feels like other people insist there is only one true specific way to handle an issue.

    When I first got into self-hosting, people kept pushing Cloudflare on me. When I expressed concern over a large centralized corporation having that much control, and over how they might have service issues, I was mocked really hard. Half a year later there was a significant outage, and suddenly there was all this talk about how centralized the internet is and how bad that is.

    After that I took it upon myself to find alternative ways to protect myself without Cloudflare’s services but every step of the way has been an isolating experience. Every step of the way has been full of people saying that my efforts are pointless and that the bots will win anyways so I shouldn’t bother.

    I decided to try to secure myself through multiple layers of obscurity, and every question in that direction has been met with people saying that obscurity is not security and the bots will find you anyway!

    I’ve stopped myself from asking too many questions now. I still keep learning in my own direction. I feel like I’ve managed to find multiple solutions that both obscure and protect me. I’ve been checking my logs constantly for months now, and the bot activity is less than I expected in the places I expected it, and completely zero in other places where I thought there would be some.

    I want to share what I have learned and my experiences but I know I will receive backlash for deviating from the norm.

    I’ve spent a lot of my self-hosting efforts trying to find ways to protect myself with minimal use of third-party services, documenting as much as I could, only to feel afraid to share what I have learned.

    This comment may not be about learning self-hosting as a beginner specifically but the vibe has been pretty damn consistent throughout me learning C++, self-hosting, linux and shell scripting. All things I enjoy but all so full of people ready to talk down to someone who wants to learn.


  • I have three copies of my data. One is my laptop, where all the backups start. That gets copied to a plug-in USB SSD, and another copy goes to my server, which has its own USB SSD. That means I don’t have an off-site backup.

    I don’t have a place to host an off-site backup and I’m not comfortable with, or interested in, using cloud services. Instead I just decided that if it all goes up in flames, so be it.

    It’s just data, and backups are just a nice convenience. I’ll be upset, but there are more important things in life to worry about.

    I’ve always lived a life of minimalism and to me stuff is stuff. None of it mattered before I was born and none of it will matter after I die. The happiest and most free feeling I ever experienced was when I spent years travelling with only a 34 litre backpack, and that’s kind of been my baseline for happiness ever since.



  • My web-facing server has just enough packages installed to (kinda securely) host Caddy and Kiwix Docker containers for my domain name and to make a comfortable work environment over SSH. The Pi for my HomeAssistant Docker container has even less because it’s locked down to just my local network.

    I also wrote my own install scripts so reinstalling everything and getting it back to a running state would take about 15 minutes for each device.

    And I also wrote my own backup/restore scripts that evolved over 3/4 of a year. I use them often so I have confidence in those scripts.

    I personally don’t really care too much. I have multiple ways of dealing with issues for something that’s a hobby to me, which is why I stick to simplicity.

    I’m sure this is a thing for people to worry about when dealing with more complex setups. I just wanna vibe out in my tiny corner of the internet.



  • I’ve read about that and I already have that in my notes as well.

    It doesn’t really affect my needs because my ISP blocks incoming traffic on those ports anyway. Also, I’m choosing not to use a tunnel at the moment, so I’ll be using a higher port regardless.

    The last time I asked about it, a few people seemed to agree it was something to do with the firewall settings. That seems most likely since I was able to connect when I disabled my firewall. I’m not a fan of working with iptables. The language for that type of networking is gibberish to me.

    I had also tried going from docker compose to rootful podman compose and ran into the same issue. I’m trying to move away from podman compose in the future though; just taking it in steps.


  • Yeah, I mainly just want to move to more open projects. When I first started, everyone kept suggesting Cloudflare. After half a year using their service, I just felt icky the entire time.

    In the past couple of months I was able to move away, choosing instead to protect myself by learning how to harden my server and by hiding it behind multiple layers of obscurity.

    With my current setup, the only site traffic I see is my own, and my custom SSH port only gets hit by bots about 3-10 times a week according to my logs. Only time will tell how well my layers of obscurity hold up, but so far they’re meeting my needs better than I expected.

    Once I get podman in a state I like, I’ll pretty much be all open source, and all I’ll have to do is stay in maintenance mode unless I care to add a new service. I like to keep things simple so I don’t normally go crazy adding new services anyway.