DevOps @ LogicGate As a backend engineer at LogicGate, I’ve been tasked with creating a DevOps workflow for provisioning our applications on AWS in an automated way. Previously, we used a combination of CloudFormation templates, shell scripts, and a bit of Node. I wanted a process that makes it easy for developers to test on a laptop but also easy to deploy to EC2 instances.
A foray into DevOps Before getting into DevOps formally, I used a combination of Chef scripts, shell scripts, and Vagrant.
Past Method When deploying a Jekyll blog, it was quite involved: Deploying Jekyll. Precious gems were used and a full development environment was required. If I’ve learned nothing else since then, it has been to simplify deployments as much as possible. The less code the better.
Cardinal Rule of Deployments: They Always Go Wrong.
While CI is great when you have a multitude of projects and developers, having a CI server for pet projects is just overkill.
Create a Theme Simple as hugo new theme [name]. Documentation can be found here: Hugo Themes
Hugo, being only a few years old, has fewer themes than Jekyll.
Luckily, it’s expected that you will want to port a theme to Hugo. I decided to port this blog’s theme: Jekyll Now, to Hugo.
Jekyll Now jekyll-now is a very popular theme (3,701 stars, 15,076 forks). I wanted to retain the spirit of Jekyll Now while updating its dependencies, as the last commit was over a year ago.
Migrating from Jekyll Why go to the trouble? While Jekyll has been good to me and shown me the light of static blogging, Jekyll’s unabashed Ruby / Ruby on Rails underpinnings lead to some interesting problems:
1. Ruby version / Ruby env problems
Recently updated your operating system? Via brew or another tool? Chances are your Ruby environment will complain. Since I’m not a Ruby person by trade, I haven’t devoted the time to running Ruby well.
GitHub You can find the GitHub repo here: musicbrainz-mysql
What is MusicBrainz? It is the largest and most complete catalog of all musical releases and artists. I wanted to play with the data and am working on MySQL these days. Since the canonical MusicBrainz database currently only works on PostgreSQL, I’ve taken the initiative of making MusicBrainz run on MySQL.
Deploying MusicBrainz to MySQL Tested on MySQL 5.7
Download a copy for your respective OS here: Download MySQL if you haven’t already.
AWS API Gateway Amazon Web Services’ scalable API offering, API Gateway, is an excellent way to create simple APIs that charge by the request. One pain point, though, is that by default your API is available via an ugly auto-generated URL such as:
https://4ittmt4ei7.execute-api.us-east-1.amazonaws.com/dev/ There is an option to manually upload an SSL certificate to enable a custom domain, but it’s tedious and, well, manual.
Well that just won’t do.
Unfortunately, as of this writing, API Gateway can’t be connected to CloudFront, which is how one would apply a custom domain name to an S3 bucket, for example.
Setup Indexing 3+ million IP addresses
In the next few blog posts I’m going to go into detail on creating an IP mapping service: IPbot. One of the first challenges was efficiently querying three million IP addresses and their associated metadata.
Using MaxMind’s free GeoLite data:
GeoLite2 IP databases
I got a set of some three million addresses and their associated metadata.
My first approach used SQLite for its ease of setup and ingestion, but I discovered it took quite a long time to return a single record.
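Since GeoLite rows are non-overlapping IP ranges, one lightweight alternative is to load the sorted range starts into memory and binary-search them on each lookup. A minimal sketch with a toy two-row index (the real GeoLite table has roughly three million rows; the column layout and function names here are illustrative, not IPbot’s actual code):

```python
import bisect
import ipaddress

# Hypothetical in-memory index: GeoLite-style rows of
# (range_start, range_end, country), sorted by range_start.
RANGES = [
    (int(ipaddress.ip_address("1.0.0.0")), int(ipaddress.ip_address("1.0.0.255")), "AU"),
    (int(ipaddress.ip_address("8.8.8.0")), int(ipaddress.ip_address("8.8.8.255")), "US"),
]
STARTS = [r[0] for r in RANGES]

def lookup(ip: str):
    """Binary-search the sorted range starts, then verify the end bound."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(STARTS, n) - 1
    if i >= 0 and RANGES[i][1] >= n:
        return RANGES[i][2]
    return None
```

Each lookup is O(log n) with no database round trip, which is hard to beat for read-only range data.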
Road Map I wanted to automate the tedious and error-prone process of creating a Redis instance that I could reuse when moving between my laptop and a remote server. I settled on writing an Ansible playbook for deploying a secured and optimized Redis instance.
Based originally on DigitalOcean’s guide to deploying a Redis instance:
How to install and configure redis on Ubuntu 16.04
This playbook looks to automate the tedious setup process by performing the following tasks:
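A hedged sketch of what such tasks might look like in playbook form (the task names, module choices, and the redis_password variable are illustrative, not taken from the actual playbook):

```yaml
# Illustrative excerpt -- names and variables are hypothetical.
- name: Deploy a secured Redis instance
  hosts: redis
  become: true
  tasks:
    - name: Install Redis
      apt:
        name: redis-server
        state: present
        update_cache: true

    - name: Bind Redis to localhost and require a password
      lineinfile:
        path: /etc/redis/redis.conf
        regexp: "{{ item.regexp }}"
        line: "{{ item.line }}"
      loop:
        - { regexp: "^bind ", line: "bind 127.0.0.1" }
        - { regexp: "^# requirepass", line: "requirepass {{ redis_password }}" }
      notify: restart redis

  handlers:
    - name: restart redis
      service:
        name: redis-server
        state: restarted
```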
Why? One would imagine that we could compile our Python, Node, or Java projects into a ZIP file and upload them to AWS Lambda without issue. Unfortunately, AWS Lambda’s execution environment stipulates that everything should be compiled in the runtime environment. Makes sense: compile under the same criteria you run under.
When I did this manually my workflow was:
1. Create an Amazon Linux instance
2. Install git
3. Commit code
4. SSH into the EC2 instance and do a git pull of master
5. Run a shell script to package my Python code and deploy it to AWS Lambda

A pretty manual process. And I totally forgot to turn off my ‘build’ EC2 instance, accruing some charges for idling.
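The packaging step in that workflow can at least be scripted. Here is a minimal sketch of building a Lambda deployment ZIP in Python (the function name and layout are my own, not from the original shell script); any native dependencies would still need to be built on Amazon Linux first:

```python
import pathlib
import zipfile

def package_lambda(src_dir: str, out_zip: str) -> str:
    """Zip a source tree into a Lambda deployment package.

    Archive paths are relative to src_dir, so the handler
    module ends up at the zip root, as Lambda expects.
    """
    src = pathlib.Path(src_dir)
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(src.rglob("*")):
            if path.is_file():
                zf.write(path, path.relative_to(src))
    return out_zip
```

The resulting archive can then be pushed with the AWS CLI’s `aws lambda update-function-code` command.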
Enabling TFA It’s considered best practice nowadays to enable two-factor authentication for your bank accounts, AWS accounts, and now your GitHub repositories as well.
Two-factor authentication gives your accounts extra security by requiring two secret pieces of information: your password and a temporary code that rotates every few minutes.
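That rotating code is typically TOTP (RFC 6238): an HMAC over the current 30-second time step, truncated to a short decimal code. A minimal sketch of the algorithm (this is what authenticator apps implement, not GitHub-specific code):

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter,
    dynamically truncated to a short decimal code."""
    counter = struct.pack(">Q", timestamp // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 6238 test secret, totp(b"12345678901234567890", 59, digits=8) returns "94287082", matching the spec’s published test vector.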
Enabling two-factor authentication is as simple as updating your GitHub profile: Settings > Security.
GitHub Two-Factor Authentication
I use Google’s Authenticator app to generate keys for me: Google Authenticator