Deploying AWS Lambda functions
Why?
One would imagine that we could compile our Python, Node, or Java projects into a ZIP file and upload them to AWS Lambda without issue. Unfortunately, AWS Lambda’s execution environment stipulates that everything should be compiled in the runtime environment. Makes sense - compile under the same criteria you’re running under.
When I did this manually my workflow was:
- Create an Amazon Linux instance
- Install git
- Commit code
- SSH into the EC2 instance and run
git pull origin master
- Run a shell script to package my Python code and deploy it to AWS Lambda
Pretty manual process - and I totally forgot to turn off my ‘build’ EC2 instance - accruing some charges for idling. This process looks ripe for automation!
Getting Started #
If you haven’t already, install VirtualBox and Vagrant.
Vagrant plugins #
vagrant plugin install vagrant-aws [<i class="fa fa-github"></i> vagrant-aws](https://github.com/mitchellh/vagrant-aws)
vagrant plugin install vagrant-env [<i class="fa fa-github"></i> vagrant-env](https://github.com/gosuri/vagrant-env)
vagrant plugin install vagrant-omnibus [<i class="fa fa-github"></i> vagrant-omnibus](https://github.com/chef/vagrant-omnibus)
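You can confirm the plugins installed correctly before moving on; the list should include vagrant-aws, vagrant-env, and vagrant-omnibus:

```shell
# list installed Vagrant plugins and their versions
vagrant plugin list
```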
Chef & Private Repositories
By default, Chef will try to clone the specified git URL as a public user. That won’t work when we’re trying to check out from a private repository over SSH:
aws_lambda recipe #
git_client 'default' do
  action :install
end

directory '/etc/git' do
  owner 'vagrant'
  group 'vagrant'
  mode '0755'
  action :create
end

git '/etc/git/__ repo name __' do
  repository "[email protected]:__ github username __/__ private repo__.git"
  reference "master"
  action :sync
end
We’ll need to give Chef read access to our repository over SSH by forwarding our SSH keypair over to the guest. First, let’s enable SSH agent forwarding so Chef and Vagrant can borrow our host SSH keys.
SSH forwarding fails during provisioning #
==> default: ================================================================================
==> default:
==> default: Error executing action `sync` on resource 'git[/etc/git/magicka.io]'
==> default:
==> default: ================================================================================
==> default:
==> default: Mixlib::ShellOut::ShellCommandFailed
==> default: ------------------------------------
==> default: Expected process to exit with [0], but received '128'
==> default: ---- Begin output of git ls-remote "[email protected]:__ github user __/__ git repo __.git" "master*" ----
==> default: STDOUT:
==> default: STDERR: Host key verification failed.
==> default: fatal: Could not read from remote repository.
==> default:
==> default: Please make sure you have the correct access rights
==> default: and the repository exists.
==> default: ---- End output of git ls-remote "[email protected]:__ github user __/__ git repo __.git" "master*" ----
==> default: Ran git ls-remote "[email protected]:__ github user __/__ git repo __.git" "master*" returned 128
==> default:
==> default: Cookbook Trace:
==> default: ---------------
==> default: /var/chef/cache/cookbooks/compat_resource/files/lib/chef_compat/monkeypatches/chef/runner.rb:41:in `run_action'
This is because Chef doesn’t acknowledge the forwarded SSH keys. We can fix this by allowing github.com as a trusted host (it’s an active issue):
Vagrantfile #
# SSH configuration
config.ssh.forward_agent=true
config.ssh.insert_key = false
config.ssh.private_key_path = ["~/.ssh/id_rsa", ".vagrant/machines/default/virtualbox/private_key"]
# prefer AWS over virtualbox
# config.vm.provider "aws"
config.vm.provider "virtualbox"
config.omnibus.chef_version = :latest
# SSH forward doesn't work while we're provisioning
# https://github.com/mitchellh/vagrant/issues/1303
config.vm.provision :shell do |shell|
  shell.inline = 'echo -e "Host github.com\n\tStrictHostKeyChecking no\n" >> ~/.ssh/config'
end
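As a slightly safer alternative to disabling StrictHostKeyChecking, the provisioner could pin GitHub’s host key instead; a sketch, assuming the guest has network access to github.com:

```shell
# append GitHub's public host keys to known_hosts so the git resource
# can verify the host without interactive confirmation
mkdir -p ~/.ssh
ssh-keyscan github.com >> ~/.ssh/known_hosts
```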
Make sure your id_rsa
is a known key to GitHub so it’ll allow Chef to check out our repo.
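To confirm the key is loaded in your agent and accepted by GitHub before provisioning, something like this should work (note that ssh -T exits non-zero even on a successful handshake, so check the greeting):

```shell
# list identities currently loaded into the SSH agent
ssh-add -l
# test authentication against GitHub; success prints
# "Hi <username>! You've successfully authenticated..."
ssh -T git@github.com
```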
Rerun vagrant reload && vagrant provision
and you should see your private git repository checked out to your Chef’d environment.
From here it’s up to our Chef recipes to do the rest of building our code, running tests, etc.
Deploy Python code to AWS Lambda
As of this writing, AWS Lambda has a primitive deployment pipeline involving either in-line code or zipped code drops uploaded to S3. To automate this, we’ll create a ZIP of all the installed dependencies and source code and upload that deployment artifact directly to AWS Lambda.
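Before scripting the upload, it’s worth confirming the AWS CLI is configured and the target function already exists (update-function-code won’t create it). Assuming a function named hello-world:

```shell
# verify credentials and fetch the target function's current configuration
# (hello-world is a placeholder function name)
aws lambda get-function-configuration --function-name hello-world
```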
package_and_deploy_lambda_function.sh #
#!/bin/bash
# based off of Amazon's official documentation:
# http://docs.aws.amazon.com/lambda/latest/dg/with-s3-example-deployment-pkg.html#with-s3-example-deployment-pkg-python
# exit if no file specified
[[ -z "$1" ]] && { echo "No Lambda function file specified" ; exit 1; }
# derive the lambda function name from the file name
lambda_func_file=$1
lambda_func="${lambda_func_file%.*}"
# generate a deployment timestamp
timestamp=$(date +%Y-%m-%d-%H:%M:%S)
# create a deployments directory if one doesn't already exist
mkdir -p ${PWD}/deployments
zip_file="lambda-func-$lambda_func-$timestamp.zip"
zip_path=${PWD}/deployments/$zip_file
# ignore folder path to site-packages but preserve the structure inside of site-packages
pushd "$VIRTUAL_ENV/lib/python2.7/site-packages/"
zip -r9 "$zip_path" .
popd
pushd "$VIRTUAL_ENV/lib64/python2.7/site-packages/"
zip -r9 "$zip_path" .
popd
# add Python Utils sub-modules
pushd "$VIRTUAL_ENV/utils/"
zip -r9 "$zip_path" .
popd
# add Python code
zip -9 "$zip_path" "$lambda_func_file"
# echo back the deployment ZIP for further processing
echo "deployment ZIP: $zip_path"
# deploy to AWS Lambda
aws lambda update-function-code --function-name "$lambda_func" --zip-file "fileb://$zip_path" --publish
A deployment would look like this:
./package_and_deploy_lambda_function.sh __ .py file __
such as
./package_and_deploy_lambda_function.sh hello-world.py
where the Python file corresponds with the AWS Lambda function of the same name.
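The script derives that function name by stripping the file extension with bash parameter expansion; for example:

```shell
# "%.*" deletes the shortest trailing match of ".*", i.e. the extension
lambda_func_file=hello-world.py
lambda_func="${lambda_func_file%.*}"
echo "$lambda_func"   # hello-world
```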