I finally sorted out my self-hosting stuff

2024-07-20

I think. Infodump commencing…

My previous setup

My previous setup included running nirn.quest services on a $30/month SeedHost.eu dedicated server, and my hyperreal.coffee services in a virtual machine in my homelab. For the latter, I had Linode managing the DNS for my hyperreal.coffee domain, and the domain itself pointed to a $5/month Linode VPS, which used Tailscale to forward HTTP and port 1965 traffic to the virtual machine in my homelab.

I originally got the $30/month SeedHost.eu dedicated server with 8T of storage to use as a seedbox for Sci Hub torrents. I later decided to get a dedicated app hosting package for $173/month, which includes 72T of storage and 10Gbps of network throughput. I figured this was better for torrenting Sci Hub, so I repurposed the $30/month dedicated server and started running nirn.quest services on it.

With the previous setup, I would be spending $30 + $173 + $5 = $208/month.

My current setup

The previous setup was fine until I found an $18/month dedicated server on netcup.de with better specs, minus the 8T of storage. I don't really need the 8T anymore, since my dedicated app hosting package on SeedHost can store the entire Sci Hub collection. The netcup.de server has 256G of storage, which is plenty for my use-case, and it can host everything I had on both hyperreal.coffee and nirn.quest.

So I decided to consolidate onto just my hyperreal.coffee domain, which simplifies things like DNS and web server configuration. The nirn.quest domain will expire in a few months, and it was super cheap (like $3, if I recall correctly), so I'm not too sore about leaving it, aside from losing the cool name.

The netcup.de dedicated server is located in Manassas, Virginia, US. Now that everything is hosted there, I no longer need Linode to manage my hyperreal.coffee domain, nor a Linode VPS to relay traffic between the Internet and my homelab virtual machine. My monthly expenditure is now $18 + $173 = $191/month.

My Mastodon instance uses Elasticsearch for full-text search. When the Elasticsearch daemon runs, it automatically allocates half of the available RAM on the host. To conserve the 16G of total RAM on my remote server, I run Elasticsearch in a virtual machine on my homelab and connect it to my new netcup.de server, where my Mastodon instance runs, over Tailscale. On the homelab virtual machine, Firewalld is configured to only accept connections to port 9200 from the remote server's Tailnet IP address.
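The Firewalld restriction above can be expressed as a rich rule. A minimal sketch, where 100.64.0.1 stands in for the remote server's actual Tailnet IP:

```
# Allow port 9200 (Elasticsearch) only from the remote server's
# Tailscale address. 100.64.0.1 is a placeholder, not the real IP.
firewall-cmd --permanent --add-rich-rule='rule family="ipv4" source address="100.64.0.1" port port="9200" protocol="tcp" accept'
firewall-cmd --reload
```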

Backups

I'm using Wasabi object storage for backups of Mastodon and important files on the remote server. I've written a backup script that uses rclone to sync and copy the files to the configured Wasabi remote. I'm currently trying to figure out how to properly set up a Wasabi bucket for Mastodon's S3 object storage. I'm pretty sure I have everything configured as it should be, but when I set S3_ENABLED=true and reload Mastodon, the avatars and attachments are blank, and I get S3 XML AccessDenied error messages when attempting to view resources in their own tab. According to Getting Mastodon working with Amazon S3 file-hosting, this means the permissions on either the files or the bucket are not set correctly. When using awscli to sync the public/system/ folder over to the bucket, I made sure to use the --acl public-read option, but this didn't seem to work. I wonder if it's because Wasabi does not enable public access to buckets by default; I emailed Wasabi support, and they suggested a more secure workaround that doesn't seem to fit my Mastodon use-case. So, until I can get this sorted out, I'm using local filesystem storage on my instance.
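For reference, here's a sketch of the Mastodon .env.production settings involved, assuming a Wasabi bucket in us-east-1. The bucket name and credentials are placeholders, not my actual values:

```
# Mastodon S3 object storage pointed at Wasabi (sketch).
# Bucket name, region, and keys below are placeholders.
S3_ENABLED=true
S3_BUCKET=example-mastodon-media
S3_REGION=us-east-1
S3_ENDPOINT=https://s3.wasabisys.com
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
```

Even with these set, the AccessDenied errors above suggest the bucket or object ACLs still need to allow public reads on the media files.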

Wasabi costs $6.99/TB/month, which isn't too bad while I have less than 1T of data stored there. Eventually, I'm going to ask SeedHost.eu if they can offer MinIO as an app option for dedicated app hosting. Then I could use some of the 72T for S3-compatible object storage and not have to pay extra for Wasabi.

Sci Hub torrents

I'm going to peruse the awesome-libgen repository for things I can use to help Sci Hub and the broader open access movement. I had an idea to write a Python script that (1) checks which Sci Hub torrents need seeders and (2) configures my qBittorrent instance to prioritize those torrents.
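A rough sketch of how that script might look, using qBittorrent's Web API over plain HTTP. The WebUI address and credentials are placeholders, and the "fewest seeds in the swarm" heuristic is just one guess at how to decide which torrents need seeders:

```python
# Sketch of the Sci Hub seeding helper described above.
# Assumes qBittorrent's WebUI is enabled on localhost:8080
# with placeholder credentials.
import requests

QBT_URL = "http://localhost:8080"  # assumed WebUI address


def pick_neediest(torrents, limit=5):
    """Return the `limit` torrents with the fewest seeds in the swarm.

    qBittorrent's torrents/info endpoint reports swarm seeds
    in the `num_complete` field.
    """
    return sorted(torrents, key=lambda t: t["num_complete"])[:limit]


def prioritize(session, hashes):
    """Move the given torrents to the top of qBittorrent's queue."""
    session.post(
        f"{QBT_URL}/api/v2/torrents/topPrio",
        data={"hashes": "|".join(hashes)},
    )


def main():
    s = requests.Session()
    # Log in to the WebUI; credentials here are placeholders.
    s.post(
        f"{QBT_URL}/api/v2/auth/login",
        data={"username": "admin", "password": "adminadmin"},
    )
    torrents = s.get(f"{QBT_URL}/api/v2/torrents/info").json()
    needy = pick_neediest(torrents)
    prioritize(s, [t["hash"] for t in needy])

# main() requires a live qBittorrent WebUI, so it isn't called here.
```

A fuller version would first cross-reference the Sci Hub torrent health data to pick torrents, rather than only looking at what's already loaded in qBittorrent.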

Other notes

Since I don't pay the electric bill, I've been making it a point not to use too much energy with my computers and homelab equipment. I'm not currently using my OPNsense device, and my homelab is now basically just my TrueNAS machine + my remote dedicated server. My Orange Pi 5+'s and Pine64 devices are currently out of commission. Someday soon when I get a job, move to a place of my own, and pay my own electric bill, I will start using them again. My daily driver setup right now includes my workstation PC, my gaming PC, my laptop, my smartphone, my ISP router, a 2.5Gbps Ethernet switch, and my TrueNAS machine. The gaming PC and laptop go into sleep mode to conserve energy when I'm not using them.

