Curse of the Polyglot

Most Jenkins environments I’ve seen are very homogeneous - a Java shop, a Ruby shop, etc. My projects are anything but. Looking at my machine I’ve got:

- Java 6, 7, 8
- Python 2.7, 3.5
- Ruby 2.2.3

Keeping track of each of these versions on the Jenkins server would be a pain. It’s easier to containerize the build environment.
Installing Docker

To build and start our build container we run the following commands:
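A minimal sketch, assuming a Dockerfile in the project root that installs the Java, Python, and Ruby toolchains above; the image name build-env is illustrative:

```shell
# Build the image from the Dockerfile in the current directory
# (the tag build-env is an arbitrary name)
docker build -t build-env .

# Start a container from the image, mounting the project source
# at /src so builds run against the working tree
docker run -it -v "$(pwd)":/src build-env /bin/bash
```

Mounting the source as a volume keeps the image itself stateless: rebuilds of the toolchain image don't touch the checked-out code.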
Configure EC2

I opted for a t2.small Ubuntu 14.04 EC2 instance. This gives us two gigs of RAM and a single core for about two cents an hour. This should serve our purposes for a personal CI server.
Elastic IP Address

I elected not to assign a public IP address / Amazon DNS name, as this changes every time the server reboots. Instead I allocated an Elastic IP address (under Network & Security).
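For reference, the same allocation can be done from the AWS CLI instead of the console; the instance and allocation IDs below are placeholders:

```shell
# Allocate a new Elastic IP in the VPC; the output includes an
# AllocationId to use in the next step
aws ec2 allocate-address --domain vpc

# Bind the address to the CI instance (IDs are placeholders)
aws ec2 associate-address \
    --instance-id i-0123456789abcdef0 \
    --allocation-id eipalloc-0123456789abcdef0
```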
Coming back to the fold

While setting up my own git server was a fun exercise, the truth is no open-source contributor or prospective employer will go hunting the web for your personal repository.
In short, at least having your code on GitHub greatly increases the odds of your work seeing the light of day. To be sure, GitLab has some great use cases that, as a single developer, I just don’t encounter:
Assembling Firmware

Installation

Install dfu-programmer:

- Linux: sudo apt-get install dfu-programmer
- Mac: brew install dfu-programmer

You’ll also need Python 2.7+ to run Easy AVR.
Steps

1. Start Easy AVR: ./EasyAVR/easykeymap.sh & (the trailing & runs it in the background)
2. Make modifications to the .dat file
3. Run File > Save Layout As…
4. Run File > Build Firmware
5. Hold the PCB reset button for 10 seconds
6. Run these commands:
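These are the standard dfu-programmer erase/flash/launch cycle; the chip name (atmega32u4 here) and the firmware filename depend on your particular board and the layout you saved:

```shell
# Wipe the chip's existing firmware
dfu-programmer atmega32u4 erase

# Flash the hex file that Easy AVR just built (filename is illustrative)
dfu-programmer atmega32u4 flash my_keymap.hex

# Restart the chip into the new firmware
# (on older dfu-programmer versions, use "reset" instead of "launch")
dfu-programmer atmega32u4 launch
```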
Getting Started

To begin, install Chef’s omnibus installer: Chef DK
Chef uses the chef tool for creating cookbooks, downloading cookbooks, and many other things.
chef generate repo chef-repo
This creates a chef-repo for connecting to Chef Server.
While this is great for getting started with Chef, creating cookbooks and keeping them in version control gets awkward, since everything lives in a single repo.
Cookbooks

Chef cookbooks are composed of recipes: Ruby code plus Chef’s DSL sugar for automating the installation of an application like Redis, Postgres, Solr, etc.
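As a sketch, a minimal recipe for something like Redis is just a couple of declarative resources; the package and service names here are illustrative and vary by platform:

```ruby
# recipes/default.rb - illustrative recipe installing and starting Redis

# Install the Redis package from the platform's package manager
package 'redis-server'

# Make sure the service starts on boot and is running now
service 'redis-server' do
  action [:enable, :start]
end
```

Chef converges the node to this declared state on each run, so re-running the recipe on an already-configured machine is a no-op.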
Amazon Certificate Manager

Since we’re hosting our site on Amazon’s S3, we can take advantage of some other AWS products to secure our website, namely:
- Amazon CloudFront
- Amazon Certificate Manager

These services give us a few useful features for our simple blog:
- Free SSL/TLS certificates managed by Amazon (using Let’s Encrypt with S3 is awkward since there is no traditional web server to secure)
- Global CDN: CloudFront gives us global availability for our S3 assets, so people around the world get a fast user experience.
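Requesting the certificate is a one-liner with the AWS CLI; the domain is a placeholder, and note that certificates used with CloudFront must live in us-east-1:

```shell
# Request a free managed certificate for the blog's domain;
# Amazon then sends a validation request to the domain owner
aws acm request-certificate \
    --domain-name example.com \
    --region us-east-1
```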
Protip

Want to experiment with a JS library?
Many JS libraries attach their API right on window.
Head over to Moment.js (a fantastic library for time and dates) or Underscore.js and open up your dev tools. Voilà - instant sandbox.
From here you can experiment with the API right in your browser with the documentation right in front of you.
Going Live

After Building a Blog Part 3 - Continuous Integration with Gitlab CI - it’s time to automate deploying our blog to a host.
Amazon’s Simple Storage Service (S3) is a good choice for serving static assets on the cheap. S3 is a bit different from a traditional VPS like DigitalOcean or Amazon’s Elastic Compute Cloud (EC2).
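Once a bucket exists, going live is a matter of enabling static hosting and syncing the generated site; the bucket name and build directory below are illustrative:

```shell
# Turn on static website hosting for the bucket
aws s3 website s3://my-blog-bucket --index-document index.html

# Upload the generated site, deleting remote files that no longer
# exist locally so the bucket mirrors the build output
aws s3 sync _site/ s3://my-blog-bucket --delete
```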
Pros

Very cheap - with AWS you pay as you go. Rather than paying $5-10 a month running a server for our static site, we can store our site for fractions of a penny:
Why CI a simple blog?

Just as we write automated unit tests for our applications, anything we push to the Web should be stable; we want confidence that an errant JS or HTML tag didn’t break the site. To do this, we’ll attach an integration service listening to commits on the master branch.
Choosing a Platform

Since I’m running my blog outside of GitHub Pages, I can’t depend on integration services such as Travis CI and GitHub’s ecosystem of integrations.
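With Gitlab CI, the pipeline lives in a .gitlab-ci.yml at the repo root. A minimal sketch for a Jekyll-style static site - the image, job name, and build commands are assumptions about the toolchain - might look like:

```yaml
# Illustrative GitLab CI config: build the site on every push to master
image: ruby:2.2

build_site:
  script:
    - bundle install
    - bundle exec jekyll build
  only:
    - master
```

A later deploy job can then push the build output to the host, so a broken build never reaches the live site.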
Motivation

Looking to create a Twitter bot, I settled on a simple Python script for scraping some data and publishing it to a Twitter account twice a day. The bot would scrape NOAA and the OpenWeatherMap API for the combined air and water temperatures. Scrapy is a fantastic library for this task.