Where the blog posts hang out.
Documenting the setup of my Prometheus VM
It runs in a Proxmox VM with 2 vCPUs and 2 GB of RAM, running Ubuntu 20.04. The Proxmox host is an old laptop.
A data-manager VM that collects Prometheus data for ingestion by the Grafana instance running inside my Home Assistant installation.
docker run --name prometheus -d \
    -p 9090:9090 \
    -v /home/myuser/configs:/etc/prometheus \
    prom/prometheus \
    --config.file=/etc/prometheus/prometheus.yml \
    --web.config.file=/etc/prometheus/web-config.yml
~/configs$ ls
prometheus.crt  prometheus.key  prometheus.yml  web-config.yml
global:
  scrape_interval: 15s
  evaluation_interval: 30s
  # scrape_timeout …
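The `--web.config.file` flag in the docker run command points at `web-config.yml`, which — given the `prometheus.crt` and `prometheus.key` files in the directory listing — presumably enables TLS for the Prometheus web UI. A minimal sketch of such a file, assuming those cert and key paths (the post doesn't show its actual contents):

```yaml
# web-config.yml — minimal TLS configuration for Prometheus's web UI/API.
# Cert and key paths match the volume mount shown above.
tls_server_config:
  cert_file: /etc/prometheus/prometheus.crt
  key_file: /etc/prometheus/prometheus.key
```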
Does it work yet?
If you can see this, it means I figured out how to fix my nginx container being unable to read the media volume shared from my Django/Wagtail container.
From my nginx logs:
2022/05/10 00:01:28 [error] 34#34: *4 open() "/app/media/images/disable_chat_files.width-800.png" failed (13: Permission denied), client: 188.8.131.52, server: averyuslaner.com, request: "GET /media/images/disable_chat_files.width-800.png HTTP/2.0", host: "averyuslaner.com", referrer: "https://averyuslaner.com/i-broke-my-media-volume/"
The static and media volumes are mounted to /app inside the nginx container. Their permissions:
root@e7b004ccbbb5:/# ls …
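The excerpt is cut off before the fix, but the `13: Permission denied` error above is the classic symptom of the nginx worker (running as a different, non-privileged user) lacking read permission on files written by another container. One common remedy — sketched here on a scratch directory rather than the real `/app/media` volume — is to make the files world-readable and the directories traversable:

```shell
# Demo on a scratch path; for the real setup you'd run this against the
# media volume (e.g. /app/media in the container that owns the files).
mkdir -p /tmp/media-demo/images
touch /tmp/media-demo/images/example.png
chmod 700 /tmp/media-demo/images   # other users now get "Permission denied"
chmod -R a+rX /tmp/media-demo      # r for files, X = traverse bit for dirs only
ls -ld /tmp/media-demo/images
```

The capital `X` is the important part: it adds the execute (traverse) bit to directories without marking regular files executable.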
An attempt to run a Parrot OS VM on Fedora 35 by any means necessary.
The first blocker encountered during setup of my Parrot OS VM is that the VM creation wizard reports that it "Failed to find a suitable default network."
If I ignore the problem and attempt to hit finish, it tells me it is "Unable to complete install: 'internal error: No <source> 'bridge' attribute specified with <interface type='bridge'/>'".
I'm not entirely sure what device name it's looking for but I'll try …
Setting up PostgreSQL with PostGIS for GeoDjango on Arch.
sudo pacman -S postgresql
Setting up data directory within my /home partition:
mkdir -p /home/myuser/postgresdb_data/data
sudo chown -R postgres:postgres /home/myuser/postgresdb_data
sudo -iu postgres initdb -D /home/myuser/postgresdb_data/data
Edit the systemd service file:
sudo systemctl edit postgresql
Add the following:
[Service]
Environment=PGROOT=/home/myuser/postgresdb_data
PIDFile=/home/myuser/postgresdb_data/data/postmaster.pid
ProtectHome=false
Edit /home/myuser/postgresdb_data/data/postgresql.conf so it listens exclusively on UNIX sockets, and also set password encryption to scram-sha-256:
...
listen_addresses = ''
...
password_encryption = scram-sha-256
...
Edit /home/myuser/postgresdb_data/data/pg_hba.conf to …
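The excerpt ends before showing the pg_hba.conf contents, but given the socket-only, scram-sha-256 setup above, a plausible sketch (hypothetical rules, not necessarily the post's actual file) would be:

```
# pg_hba.conf — allow local UNIX-socket connections only, authenticated
# with scram-sha-256 to match the postgresql.conf settings above:
local   all   all   scram-sha-256
```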
Updating a repo that is no longer signed.
Err:10 https://dl.bintray.com/rabbitmq-erlang/debian bionic InRelease
  403 Forbidden [IP: 184.108.40.206 443]
E: Failed to fetch https://dl.bintray.com/rabbitmq-erlang/debian/dists/bionic/InRelease  403 Forbidden [IP: 220.127.116.11 443]
E: The repository 'https://dl.bintray.com/rabbitmq-erlang/debian bionic InRelease' is no longer signed.
N: Updating from such a repository can't be done securely, and is therefore disabled by default.
N: See apt-secure(8) manpage for repository creation and user configuration details.
Seems to be related to the shutdown of Bintray sometime around April. So the first …
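For a retired repository like this, the usual first step is to find and drop the apt source entry that references it. A sketch, demonstrated against a scratch copy of an apt sources directory (the real files live under /etc/apt/ and need sudo; the filename here is hypothetical):

```shell
# Build a scratch sources dir containing a dead bintray entry.
mkdir -p /tmp/apt-demo/sources.list.d
echo "deb https://dl.bintray.com/rabbitmq-erlang/debian bionic erlang" \
    > /tmp/apt-demo/sources.list.d/bintray.rabbitmq.list
# Find every sources file that still points at bintray...
grep -rl "dl.bintray.com" /tmp/apt-demo
# ...and remove it; on a real system you'd then re-run apt-get update.
rm /tmp/apt-demo/sources.list.d/bintray.rabbitmq.list
```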
Does money grow on hard drives?
This is probably dumb but I'm doing it anyway
Chia is going to be the next big crypto craze. Or at least, that's the case their website tries to make.
Also, it's just kind of fun to play around with, so that's what this post is about, especially since plotting and farming Chia on the HDDs in my NAS is not the recommended approach.
I'll be using the Virtual Machine Manager of …
Easy switching between networks when you don't have a desktop environment
My setup manages its Wi-Fi connection with wpa_supplicant but has no desktop environment, so it's kind of annoying when I want to switch VLANs to manage my IoT devices.
The two networks I'm switching between are defined in /etc/wpa_supplicant/wpa_supplicant-wlp2s0.conf, with their priorities set by the priority variable. To make switching easier, I've written a script that flips the priorities and restarts the wpa_supplicant systemd service.
#!/usr/bin/python3
import subprocess

with open('/etc/wpa_supplicant/wpa_supplicant-wlp2s0.conf', 'r') as …
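The post's Python script is cut off above, but the same priority flip can be sketched in shell. The config contents and priority values (1 and 2) here are assumptions, and the demo runs against a scratch copy rather than the real wpa_supplicant config:

```shell
CONF=/tmp/wpa-demo.conf
# A made-up config with two networks at different priorities:
printf 'network={\n  ssid="main"\n  priority=2\n}\nnetwork={\n  ssid="iot"\n  priority=1\n}\n' > "$CONF"
# Swap priority=1 and priority=2 via a temporary placeholder value:
sed -i 's/priority=1/priority=9/; s/priority=2/priority=1/; s/priority=9/priority=2/' "$CONF"
grep 'priority=' "$CONF"
```

On the real system you'd point CONF at /etc/wpa_supplicant/wpa_supplicant-wlp2s0.conf and then restart the service (e.g. `sudo systemctl restart wpa_supplicant@wlp2s0`) to pick up the new priorities.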
A guide to installing and configuring MariaDB on Arch Linux
This website runs on a Digital Ocean Ubuntu VPS and uses MySQL as its backend database. I want to use my Arch Linux laptop to develop and test new code for the website, so I need a database for the test environment. I could use Docker, which would probably be the better option if I wanted to emulate the production database on this VPS as closely as possible, but I've never …
Reallocating disk space from home to root partition
On my recent Arch install, I decided to create separate partitions for my root and home directories. Unfortunately, I'm a dummy and only allocated 20 GB for the root partition. As it turns out, that's not enough for a proper workstation setup so now I have to fix my mistake.
My particular install involves two USB drives: one with the Arch boot partition and the other with a cryptographic key to …
Documenting my setup for creating a login screen with a video background.
During my Arch install, I decided to use SDDM as my display manager. One of its features is the ability to use a video as the background of the login screen, so I figured I'd give that a go.
First, I had to decide on the base theme that I wanted to customize. My …
Building our first model.
Training the Model
Now we can actually write the script that will train the model!
# USAGE
# python fine_tune_birds.py --vgg vgg16/vgg16 --checkpoints checkpoints --prefix vggnet
from config import bird_config as config
import mxnet as mx
import argparse
import logging
import os

# construct the argument parser and parse the arguments
ap = argparse.ArgumentParser()
ap.add_argument("-v", "--vgg", required=True,
    help="path to pre-trained VGGNet for fine-tuning")
ap.add_argument("-c", "--checkpoints", required=True,
    help="path to output checkpoint directory")
ap.add_argument("-p", "--prefix", required=True, help="name …
Building the rec files that our model will consume for training
In this post I'll be concentrating on creating the lst and rec files needed for model training using the training data that we generated in Part I of this series.
The model will use the mxnet framework, which reads rec files as training data. First, we'll create the lst files, guided by a config file that we can define now:
The Config File
from os import path

# top level path to our bird related …
Making a bird feeder that automatically identifies its visitors.
I have a bird feeder with a Raspberry Pi camera pointed at it. The Raspberry Pi creates a video feed of the bird feeder that's accessible on my local network, where a Synology 918+ with an IP camera license picks it up. The Synology-configured IP camera detects motion from birds (and, unfortunately, wind) and saves a short video of each motion event to disk.
At first I will manually watch each motion …
Encrypted good; insecure bad.
We first need to install certbot from Let's Encrypt since that will do most of the hard work for us.
sudo apt-get update
sudo apt-get install software-properties-common
sudo add-apt-repository universe
sudo add-apt-repository ppa:certbot/certbot
sudo apt-get update
sudo apt-get install certbot python-certbot-nginx
Now that certbot is installed, we can grab a certificate. We'll have Daphne consume it so certbot doesn't need to mess with our nginx file.
sudo certbot certonly --nginx
The tool …
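Once certbot has issued the certificate, pointing Daphne at it looks roughly like this — a sketch using Daphne's Twisted endpoint syntax, where the domain and ASGI module are placeholders and certbot's default layout under /etc/letsencrypt/live/ is assumed:

```shell
# Serve HTTPS directly from Daphne on port 443 using the Let's Encrypt cert.
daphne -e ssl:443:privateKey=/etc/letsencrypt/live/example.com/privkey.pem:certKey=/etc/letsencrypt/live/example.com/fullchain.pem \
    myproject.asgi:application
```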
How to setup MySQL for Django in Ubuntu 18.04
Installing System Packages
The first step is obviously to install MySQL itself, which is easiest to do using your system's package repository. In my case, I'm using Ubuntu 18.04 on a Digital Ocean droplet.
sudo apt-get update
sudo apt-get install mysql-server libmysqlclient-dev
Securing the Database
Now we can work on some initial setup of the new database server. Conveniently, MySQL provides a tool that walks you through the basics of securing the installation. The …
Serving a Django web application with Daphne and Nginx
I'll be deploying a Django 3.0 web application on a Digital Ocean droplet running Ubuntu 18.04. The web app will be served by Daphne and Nginx.
I'll be running Django using Python 3.8.1, which isn't currently available as a package on Ubuntu, so I'll be installing it from source. The files are available on python.org.
You could use wget to download the files straight from the remote machine, but …
Configuring Ubuntu 18.04 + CUDA 10.0 + NVIDIA GPU For Deep Learning With Tensorflow & OpenCV Python Bindings
Configuring a development environment to perform deep learning using the python bindings of OpenCV.
Guides Exist for Ubuntu 16.04; Less so for 18.04
This guide will essentially adapt existing guides for 16.04 and address the areas where those guides need to be altered or worked around to achieve the desired results. This solution isn't as well supported as 16.04 so if you're just looking for an easy way to get started with machine learning, you should probably look elsewhere. Here we're all about breaking things and figuring out how …
I have no idea what I'm doing so I made this guide to help both of us.
A friend encouraged me to attend the Democratic Party's Caucus Night. To be perfectly honest, I still don't know if that event was organized by the State, County or National Democratic Party, or some combination thereof. Regardless, I consider myself a Democrat and I wanted to get more involved in creating change in my local community, so I decided to go.
That same friend directed me to vote.utah.gov. There you can enter …
Because just when I figured out Docker Cloud, they dropped support for it.
Production is like development but with more stress and frustration
Just when I had my Docker Cloud deployment all figured out, Docker went ahead and ruined everything by announcing they were dropping support for it in May. So here I am, trying to figure out how to use Docker Compose in production. Luckily, I already had a working Docker Compose setup for development. Unfortunately, Docker didn't want to make things too easy for …
Deploying my docker containers with Docker Cloud.
Like so many of my projects, I'm coming back to this one after several months of not touching it so I need to reorient myself a bit and then continue stumbling forward until I end up with something that works. The end goal is to be able to deploy a series of docker containers that make up a web application using Docker Cloud.
So far I have three Docker files: one for Postgres, …
Smartwatches are worth it. The FitBit Blaze is not.
Are Smartwatches Even Worth Buying?
My experience with smartwatches began with the original Kickstarter-funded Pebble, which I received in March of 2013. Since then, I have considered smartwatches a useful accessory that I prefer to have with me as much as possible.
Here are a few reasons why:
- Massively useful while driving
While driving, I can monitor incoming texts, emails, and calls with a simple glance at my wrist. I essentially don't …
Because doing things manually is lame.
Building a Server Monitor
A few days ago I realized how nice it would be to have a web interface to manage my various servers. So today I'm going to start building one. I'll also be using this as an excuse to try out the Python web framework Pyramid for the first time. As usual, this post will be less of a guide, and more of a record of my fumbling through the void that …
Protecting sensitive information when using docker-compose files in a public repository.
Hide Yo Passwords!
If you have multiple containers in a swarm service, this won't be the path for you. If you do have a docker swarm service, check out docker secrets.
I, however, don't have a swarm, and I don't see any benefit in converting my setup into a swarm of size one, as the docker secrets docs suggest. So instead, I'll be using good ol' environment variables. The problem, however, …
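The environment-variable approach typically looks like this — a hypothetical compose fragment, not the post's actual file, where the real values live in an untracked .env file beside the committed docker-compose.yml:

```yaml
# docker-compose.yml — safe to commit; the secret is substituted from an
# untracked .env file (e.g. a line like POSTGRES_PASSWORD=supersecret).
services:
  db:
    image: postgres
    environment:
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
```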
How to import a .sql dump file into a docker container created by docker-compose.
Get in there!
It took me a while to figure out how to properly import a .sql dump file into a docker container I had started with docker-compose. The recommended way was to import the data on creation of the container by adding the dump file to the /docker-entrypoint-initdb.d directory. I wanted to avoid this if possible because the dump file is a backup of a production database but the docker-compose file is shared in …
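The common alternative to the entrypoint-initdb approach is to stream the dump into the already-running container with `docker exec` — a sketch where the container name, database name, and password variable are all placeholders, not from the post:

```shell
# -i keeps stdin open so the dump file can be piped through to the
# mysql client running inside the container.
docker exec -i my_db_container mysql -u root -p"$MYSQL_ROOT_PASSWORD" mydb < dump.sql
```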
Let's check the news.
Building a Dataset
We were introduced to a few basic concepts in part one of this series, but going forward it will probably be easier to start experimenting with data that is actually relevant to the problem we want to solve. As a refresher, that problem is to scan news articles and determine whether they are relevant to natural resource conservation.
To start simple, we'll grab two articles: one about some sort of ecological conservation …
Bring on the learning. Machine style.
I'll mostly be following along with the online book Natural Language Processing with Python by Steven Bird, Ewan Klein, and Edward Loper while referring to Machine Learning in Python by Michael Bowles, but I'll be aiming to apply the lessons to problems that interest me personally. You can find the code associated with this blog series on my GitHub.
The ultimate goal of this machine learning practice is to produce a binary …
How to make a video from a bunch of images.
Getting things in order
For one of my nonprofit's projects, I used a DSLR camera on a tripod to capture the transformation of a common garden green bean seed from germination to small seedling. If it interests you, you can find the code on GitHub here.
Unfortunately, an earlier iteration of the code I wrote to control the camera did not account for the fact that the script may need to restart at one …
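Once the stills are collected and ordered, stitching them into a video is typically a one-liner with ffmpeg — a generic sketch, with the filename pattern and frame rate as assumptions (the post's actual code is in the linked GitHub repo):

```shell
# Stitch sequentially numbered stills (frame_0001.jpg, frame_0002.jpg, ...)
# into a 30 fps H.264 timelapse; yuv420p keeps it playable in most players.
ffmpeg -framerate 30 -i frame_%04d.jpg -c:v libx264 -pix_fmt yuv420p timelapse.mp4
```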
Because I'm willing to give myself a major headache for small gains.
As I said in my last post, I'm aiming to add the ability for users to log in to the website using django-registration. It should be pretty simple, especially considering I got it working just fine on my nonprofit's website. Unfortunately, this website is essentially all about creating silly problems for myself so I can learn by fixing them. And thus, I have run into a problem that I did not …
Because you can't have progress without a whole bunch of annoying things trying to stop you.
And we're back!
I'm finally coming back to my personal website so I can make it less bad. And more good.
I've spent the better part of two days updating all of the packages on the VPS as well as those in the python development environment. It's kind of just fun to update things. It makes me feel better.
I refactored a bunch of the project code to better handle things like file …
Securely backing up a remote mysql database to a local machine.
Ok, so now that we have our secure backup location, it's time to actually grab everything from the database. The first command lets us communicate securely with the remote server over SSH. If we didn't do this, all of the database traffic would be vulnerable to interception, since it would be unencrypted in transit (with the exception of the user passwords, which are already stored hashed).
ssh -f -L3310:localhost:3306 firstname.lastname@example.org -N
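With the tunnel forwarding local port 3310 to the remote server's MySQL port 3306, the dump itself can then run against the local end of the tunnel. A sketch — the database name and user are placeholders, not from the post:

```shell
# --protocol=TCP forces the client through the forwarded port rather than
# the local UNIX socket; 3310 is the tunnel's local end from the ssh command.
mysqldump --protocol=TCP -h 127.0.0.1 -P 3310 -u dbuser -p dbname > backup.sql
```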
You gotta encrypt all the things.
As of the time of writing, the website still looks crummy. But there has been progress! You just can't enjoy it yet.
Considering I implemented the strictest level of encryption on my website before I even built a way for users to submit data, I obviously care about security. In fact, I probably approach security in a way that could be construed as paranoia, but that's just because I think such an approach offers the greatest …
There's a lot of stuff to do behind the scenes when building a website from scratch. And it takes up a lot of time.
As with most computer-related things, it seems, working on one problem always exposes a variety of new problems, some of which are blocking while others are just interesting optimizations. In the age of Squarespace and other such tools that allow one to erect aesthetically pleasing websites in a matter of minutes, part of the fun of this project is to tackle as many of the challenges I encounter head on. As such, …
Outlining my plans for the new website.
This website will serve to showcase my various projects and interests.
Various topics and categories will be explored:
For the foreseeable future, I will mainly be writing about programming in Python and various web-focused languages while I build out this website. The site is hosted by DigitalOcean* on an Ubuntu 16.04 x64 instance and powered by Django and Wagtail. It is served by the nginx web server. The source …