Automate all the things!
Whenever a discussion comes up about whether a task should be automated, it is nearly certain that someone will post the xkcd automation comic. The comic raises a fair point: automation usually does not come for free. You have to write the code. You have to maintain it, and at some point changes will be required to keep up with the system. Despite all of this, I am a strong believer in automating everything possible.
Deploying my blog involves a few steps: building a Docker image (the part I am not a fan of, but it makes things a lot easier), copying it to my server, loading it, restarting containers and pruning images so it takes longer for the server to run out of disk space.
For a one-server deployment this is good enough; there is no reason to run a Docker registry. The image produced by `docker build` can easily be tested before deploying it, which I much prefer to just running a Dockerfile on the server. That is all I really need to host a blog. Way more than I would like, actually.
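Testing the image locally can itself be a single scripted step. Here is a dry-run sketch of what that could look like; the port, the `sams-test` container name and the health-check URL are assumptions, and the `run` helper only records commands so the sketch can be read without Docker installed:

```shell
#!/bin/sh
set -eu

# Dry-run sketch of a pre-deploy smoke test. `run` only records each
# command; replace its body with "$@" to actually execute them.
# Port 8000, the container name and the URL are illustrative guesses.
LOG=""
run() { LOG="${LOG}$*
"; }

run docker run --rm -d -p 8000:8000 --name sams-test registry.home.arpa/sams:latest
run curl -fsS http://localhost:8000/
run docker stop sams-test

printf '%s' "$LOG"
```

Swapping the `run` helper for real execution turns the same three lines into a `make test` target.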
This sounds easy enough and only requires a few shell commands. I still have two make targets, `build` and `deploy`.
build:
	docker build -t registry.home.arpa/sams .

deploy:
	docker save registry.home.arpa/sams > /tmp/sams.tar
	scp /tmp/sams.tar chonker1-sams:~
	rm /tmp/sams.tar
	ssh chonker1-sams 'docker load < /home/sams/sams.tar'
	ssh chonker1-sams 'rm /home/sams/sams.tar'
	ssh chonker1-sams 'docker stop sams'
	ssh chonker1-sams 'docker rm sams'
	ssh chonker1-sams 'docker run -d -t -v /home/sams/db.sqlite3:/db.sqlite3 -p 8001:8000 --restart always --name sams --security-opt apparmor=unconfined registry.home.arpa/sams:latest'
	ssh chonker1-sams 'docker image prune --all --force'
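One possible refinement of the `deploy` target, sketched as a dry run below: funnel all the remote steps through a single ssh invocation with `set -e`, so a failure in any step aborts the deploy instead of the remaining commands marching on. The commands are the same ones from the target above; the `run` helper only records them instead of executing.

```shell
#!/bin/sh
set -eu

# Dry-run sketch: the deploy target's remote steps as one ssh session.
# `run` only records commands; replace its body with "$@" to execute.
LOG=""
run() { LOG="${LOG}$*
"; }

run docker save registry.home.arpa/sams -o /tmp/sams.tar
run scp /tmp/sams.tar chonker1-sams:~
run rm /tmp/sams.tar
# One remote shell with set -e: a failed `docker load` stops the deploy
run ssh chonker1-sams "set -e;
  docker load < /home/sams/sams.tar;
  rm /home/sams/sams.tar;
  docker stop sams; docker rm sams;
  docker run -d -t -v /home/sams/db.sqlite3:/db.sqlite3 -p 8001:8000 --restart always --name sams --security-opt apparmor=unconfined registry.home.arpa/sams:latest;
  docker image prune --all --force"

printf '%s' "$LOG"
```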
Do you know how often I would remember to prune images? Or how to save and load a Docker image via an archive? I deploy this blog once a quarter at best.
Someone once asked me a good question: "It sounds like we will only do this once or twice a month, is it really worth writing a script?" And the answer is "yes".
If you have a task you run multiple times a day, you will save time with a script. If you have a task you run once every full moon, you will reduce potential human error. Especially if you rely on copy-pasting commands into a production shell.
Automation is not necessarily meant to save time. It is a safety net. If a production system has known tasks that occasionally have to be run, they will be a script. Need to clean Redis cache keys? Script. Delete old database partitions after the quarterly archival process? Script. I am not necessarily talking about full-blown, unit-tested scripts. Sometimes `make` is good enough. Just a collection of shell commands.
If your runbook to troubleshoot and resolve production issues contains more than one line to execute, turn it into a script. It is far easier to type `make redis_reset` than to fire up `redis-cli`, scan for the correct keys and delete them before calling an API endpoint to refresh the cache.
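As a sketch, here is the kind of thing such a `redis_reset` target could wrap. The key pattern and the refresh endpoint are invented for illustration, and the `run` helper records the commands instead of executing them:

```shell
#!/bin/sh
set -eu

# Hypothetical body for a `make redis_reset` target. The key pattern and
# refresh endpoint are made up; `run` keeps the sketch side-effect free.
LOG=""
run() { LOG="${LOG}$*
"; }

# SCAN-based deletion streams keys instead of blocking Redis like KEYS would
run "redis-cli --scan --pattern 'cache:*' | xargs -r redis-cli del"
# then ask the application to rebuild the cache
run curl -fsS -X POST https://blog.internal/api/cache/refresh

printf '%s' "$LOG"
```

The point is not the specific commands but that the exact sequence lives in version control instead of in someone's head.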
Data exports you would only do once or twice a year? Script. `python manage.py export_random_ops_metrics` and you are done. File uploaded to S3, email with the file name sent to ops@your-company.tld. You could obviously do this manually, but it would require more mental energy and care than updating `export_random_ops_metrics` once a year to include a new database column.
Does automating things take time? Yep. Will you have to make changes and do maintenance? Absolutely. Is it worth it? Yes. Automate all the things!
>> posted on July 18, 2023, 1:11 p.m. in software engineering