Advanced Deploy Scripts for Django with Venv

2016-03-26 | #bash, #django, #solution, #webdev

A while ago I wrote a piece about how to deploy a Django project simply with rsync, without migrations (South was still prevalent then). If you search for "deploy" you'll find it easily. In the meantime I've built a somewhat more sophisticated git-based setup I'd like to share.

What I did here is a little different than before. First of all, I divided the logic into "init" scripts, which perform the basic tasks of installing/updating and building, and "deploy" scripts, which perform deploys to different servers. With this structure it doesn't matter whether you want to perform a single task again, update the server while logged in, or do a remote update via ssh. In the end we have a little script collection which, when everything comes together, performs a complete deployment including builds and migrations.
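To give you an idea of the layout all of the following scripts assume (the top-level name is up to you; everything inside is what the scripts themselves reference):

deployment-root/
├── venv/                          the project's virtualenv
└── project/                       the git checkout
    ├── manage.py
    ├── requirements.txt
    ├── init/
    │   ├── install-all-packages.sh
    │   └── build-frontend.sh
    ├── deploy/
    │   ├── update.sh
    │   └── deploy-to-yourservername.sh
    └── frontend/
        └── static_base/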

The first script is an update script for all backend and frontend dependencies, called "init/install-all-packages.sh":

#!/usr/bin/env bash

cd "${0%/*}" && \
cd .. && \
pip install -r requirements.txt && \
cd frontend/static_base && \
npm install && \
bower install

The first line switches into the script's own directory so we have a fixed starting point. From there we switch, relative to that, into the project dir and update everything for Django via pip, then switch to the frontend's build dir (which could very well also be the project dir, but isn't in this case :) and update the npm and bower packages. After this we should be up to date, given that the package lists are properly maintained. A little warning: this script expects you to activate the fitting venv beforehand, since it is meant to be used both in development and for deployments, where venv configs may differ. The deploy script takes care of that later, but keep it in mind, since a call without an active venv will pollute your global pip.
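If you want to run it on its own during development, just source the venv first; something along these lines (the paths are placeholders, of course):

# Run the install script standalone -- activate the venv for this project first.
. /path/to/your/venv/bin/activate && \
/path/to/your/project/init/install-all-packages.sh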

The second script is the frontend build script, which provides fresh asset files for the client directly from the source. This script is called "init/build-frontend.sh":

#!/usr/bin/env bash

cd "${0%/*}" && \
cd ../frontend/static_base && \
gulp build

Here you should trigger your frontend build job, which in this case is a standard gulp build in a separate frontend build dir.
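If you don't use gulp, the same slot takes whatever command produces your assets; with npm scripts, for example, the last line would just become something like this (assuming a "build" script in your package.json, which isn't part of the setup above):

npm run build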

The third and central script defines the whole deploy chain as seen from the server with a logged-in user, so you could just call it if you are already logged in via ssh. This code lives in "deploy/update.sh":

#!/usr/bin/env bash

cd "${0%/*}" && \
cd ../../ && \
. venv/bin/activate && \
cd project && \
git pull && \
./init/install-all-packages.sh && \
./init/build-frontend.sh && \
python manage.py dbbackup --database default && \
python manage.py migrate && \
python manage.py collectstatic --noinput && \
python manage.py clearcache && \
touch yourproject/wsgi.py && \
sudo supervisorctl restart foobar-job

We again start at the script's own location, then activate the correct venv for the current project on this machine and switch into the project dir. There we pull the newest source from our git repo (this being the main difference to the simple solution, where we pushed a complete state already built on a dev machine). After the update we process the package updates with our script from before. With all packages up to date we can build the new frontend with the other utility script. Now for the DB part: first dump an automatic backup before proceeding (in this case with "django-dbbackup"), then migrate via the migrations present in the repo (this has proved to be extremely robust since migrations became part of the Django core). Now that everything is new, shiny and ready, collect the static files for delivery by the webserver, clear the Django cache (you'll have to write that management command yourself) and touch the wsgi.py to trigger a Python source reload. If you're using something like gunicorn with supervisor you'll also have to restart the job; with Apache and mod_wsgi, where touching the wsgi.py triggers the reload, you can leave that out.
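One practical note: the last line uses sudo, so a non-interactive remote run (see the next script) will hang on the password prompt unless your deploy user may restart that job without a password. A sudoers entry along these lines does the trick (user name, binary path and job name are placeholders; check the real path with "which supervisorctl"):

# Add via "sudo visudo" -- adjust user, path and job name to your setup.
deploy ALL=(root) NOPASSWD: /usr/bin/supervisorctl restart foobar-job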

To be able to trigger this whole shebang remotely, one last script comes to the rescue, named "deploy/deploy-to-yourservername.sh":

#!/usr/bin/env bash

ssh username@serverdomain.com '/path/to/your/project/deploy/update.sh'

What this does is not so hard to guess. I guess.

And voila: remote deployment with updates, builds, migrations, backup and restart all in one command. Hooray.

From here you could parametrize the scripts for different servers or just copy and paste to build one deploy process for each target. Easy as pie.
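If you go the parametrized route, the remote trigger could for example be reduced to one script that takes the target as an argument (the name and argument handling here are just a sketch):

#!/usr/bin/env bash

# Hypothetical "deploy/deploy-to.sh": pass the target as the first argument,
# e.g. ./deploy/deploy-to.sh username@serverdomain.com
ssh "$1" '/path/to/your/project/deploy/update.sh'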