brought to you by REVSYS
Funny dog with caption 'Silly human! Why you keep doing that?'

Automate Common Tasks

As software developers we talk about DRY quite a bit. We strive to not repeat ourselves in code, because doing so makes fixing bugs and adapting the code in the future more difficult. So it's strange we don't apply DRY principles to more of our daily work.

How many of these seem familiar to you:

  • history | grep that-one-command-you-always-forget
  • ssh me@... hmmm what was that hostname again? Oh yeah how could I forget
  • workon my-project; cd ~/path/to/project
  • cd ~/path/to/project/; ./that_one_thing --dry-run --port=6439 --commit
  • Sit down at desk, launch editor, launch 3 shells, launch team chat tool, launch browser tab to github repo, open browser tab to local dev server, cd ~/project, git fetch... and now start working

We all have certain patterns we use on a regular basis: workflow steps that pop up daily or weekly and are ripe for automation, but are small and easy enough that we never feel the need to script them.

If they were 5+ complicated steps, with a dozen command line options, and needed to be done hourly, we would likely have automated them already. But they're 2-3 commands and we only do them every so often, so automating them seems like overkill.

Yet, like most performance related things, small things add up quickly. Every moment wasted referencing that wiki page for the 10th time, or fixing that typo you always seem to make, robs you of possibly productive time.

Having to repeatedly turn your attention to these tasks keeps you from entering or staying in a flow state, killing your overall performance with a death by a thousand cuts.

Two cats in christmas hats with caption 'Help us Santa-wan Kenobi, you're our only hope'

What to Automate

In general, if there is something you do daily or weekly that is multi-step, it's worth considering. The XKCD comic "Is It Worth the Time?" has a great chart to help gauge the payoff of making a task more efficient.

It uses a 5 year span of time, but let's take a more short-term look. If the task at hand can be cut down by 30 seconds and it's something you do daily, you can spend nearly 2.5 hours automating it and still break even within a year. That's before counting any benefit from staying in flow, or the time lost to occasional silly mistakes like typos throughout the year.

We all have frequent tasks that only take 30 seconds to a minute and would take considerably less than 2.5 hours to automate. So we're all wasting time.
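If you're skeptical, the arithmetic is easy to sanity-check in a shell:

```shell
# 30 seconds saved daily, over a year, converted to minutes
echo "$(( 30 * 365 / 60 )) minutes"   # roughly three hours per year
```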

Automation Tools

The easiest thing to do for most of us is to use simple shell aliases. Assuming you're a bash user, small things like:

alias clear-pycs='find . -name "*.pyc" -delete'

to find and remove all .pyc files from your current directory on down are handy. But you can get even more advanced.
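For instance, if you need arguments, a shell function is the next step up, since a plain alias can't take them. A quick sketch; the ~/work/src layout here is just an example:

```shell
# Unlike an alias, a shell function can accept arguments.
# The directory layout is illustrative; adjust to your own setup.
proj() {
    cd "$HOME/work/src/$1" || return 1
}
```

Now proj santa-wan jumps straight to that checkout from anywhere on the file system.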


Virtualenvwrapper

Most of us use Python virtualenvs and virtualenvwrapper already. However, if you aren't using virtualenvwrapper yet, this should convince you to start.

One of its lesser-used features is per-virtualenv hook scripts that run when activating or deactivating a particular virtual environment.

If your virtualenvwrapper is set up to use the default ~/.virtualenvs/ location for your virtualenvs, these hooks can be found in ~/.virtualenvs/<NAME>/bin/. You're looking for scripts named 'postactivate' and 'postdeactivate', but you'll see there are also scripts run before activation and deactivation if you find a need for them.

Let's do a simple example first. One of the common tasks when starting work on a project is activating the virtualenv and then cd'ing to the repository checkout to start coding or working with the code. This is a very repetitive, often executed task we can automate. Assuming we have a project named 'santa-wan', in ~/.virtualenvs/santa-wan/bin/postactivate we would place the following code:

cd /Users/frank/work/src/santa-wan

Now any time we type workon santa-wan, wherever we happen to be on the file system, our virtualenv is activated and we immediately cd into the project directory to get down to business.

Often we want certain environment variables set on a per-project basis, and these hooks are a perfect way to make that happen. So let's do this:

export PROJECT_PATH="/Users/frank/work/src/santa-wan"
# The settings module and startup file below are example names
export DJANGO_SETTINGS_MODULE="santa.settings"
export PYTHONSTARTUP="$PROJECT_PATH/.startup.py"


... in our postactivate hook. We now not only cd to the right location on activation, but also define our Django settings module and the Python code to run any time we start up a Python shell. Now we need to clean up after ourselves, so these environment variables are unset when we deactivate. So in the postdeactivate hook we need:

unset PROJECT_PATH
unset DJANGO_SETTINGS_MODULE
unset PYTHONSTARTUP

Python virtual environments are really just a collection of shell hacks, so you can create a virtualenv for non-Python "projects" just to have these 'postactivate' and 'postdeactivate' hooks at your disposal.
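There's also nothing magical happening under the hood: after activating an environment, virtualenvwrapper simply sources the hook script if one exists. A rough sketch of the mechanism, with illustrative paths and variable values:

```shell
# Roughly what virtualenvwrapper does after `workon notes-demo`:
# source the env's postactivate hook if one is present.
VENV="$HOME/.virtualenvs/notes-demo"
mkdir -p "$VENV/bin"
printf 'export PROJECT_PATH="$HOME/notes"\n' > "$VENV/bin/postactivate"
[ -f "$VENV/bin/postactivate" ] && . "$VENV/bin/postactivate"
echo "$PROJECT_PATH"
```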


Fabric

Personally, if the task gets more complicated than the meager bash shell aliases we've been using above, I reach for Fabric. Fabric is a wonderful Python library for automating your command line. It's geared toward issuing commands over ssh to several hosts based on roles and tasks, but it can easily be used entirely locally.

One thing that comes up frequently in Django development is blowing away your local database and recreating it. With PostgreSQL and Django 1.7, it's a simple three command process, but it pays to be lazy, so let's automate it with Fabric. If we put our fabfile.py in the root of our repo next to our manage.py, it would look like this:

import os
from fabric.api import task, lcd, local
from fabric.contrib.console import confirm

BASE_DIR = os.path.dirname(__file__)  # Get our repo directory path

@task
def rebuild_db():
    msg = "Are you sure you want to blow away your local 'santa-wan' db?"

    if confirm(msg, default=False):
        local("dropdb santa-wan")
        local("createdb santa-wan")
        with lcd(BASE_DIR):
            local("./manage.py migrate")
Now we can recreate our db by issuing a simple fab rebuild_db, with the safety of having to confirm the action to avoid mistakes.

Fabric is perfect for these kinds of common tasks. Here are some ideas you may find useful:

  • fab deploy – Tag repo with version, push tag, ssh into various servers updating everything
  • fab test – Run your tests using your typical arguments
  • fab makedocs – cd into Sphinx docs directory, rebuild them and open index.html in your browser
  • fab copy_db – ssh into production system, snapshot the database and restore it into your local development database

Honcho / Foreman

Many of your projects likely require certain services to be running locally. While Vagrant is a popular way to manage these, if you want to run things on your local OS, Foreman is a great utility for starting and stopping services. It's used by systems like Heroku, and there is a compatible Python port called Honcho.

Let's assume that for our Santa-Wan project we need Redis, a Celery worker, and Flower to monitor our local Celery setup. In our project root we can set up a file named 'Procfile' like this:

redis: redis-server $PROJECT_PATH/configs/redis.conf
worker: celery -A santa worker -l info --concurrency=1
flower: celery -A santa flower
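Worth knowing: both Foreman and Honcho will also load environment variables from a .env file sitting next to the Procfile, which is a handy home for per-project settings. A sketch, with an example value:

```shell
# honcho/foreman automatically load variables from a .env file in the
# same directory as the Procfile; the path below is just an example.
cat > .env <<'EOF'
PROJECT_PATH=/Users/frank/work/src/santa-wan
EOF
```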

Now when we get to work each morning on santa-wan, we've cut down our startup from something like:

workon santa-wan
cd /Users/frank/work/src/santa-wan
redis-server ./configs/redis.conf &
celery -A santa worker -l info --concurrency=1 &
celery -A santa flower &

down to:

workon santa-wan
honcho start

And we're ready to start coding.


SSH Config

Tired of typing something like ssh dev-user-1@<bastion-ip> -p 4001 because your ops guy doesn't want the bastion host to have DNS and runs ssh on a non-standard port? Does your username differ between Project X and Project Y servers?

Quit wasting time trying to remember all of that, or cut-and-pasting from the notes you took, and use your ssh config! Instead of the full ssh invocation, just put this in ~/.ssh/config:

Host dev
    # Replace with your bastion host's real address
    HostName 203.0.113.10
    User dev-user-1
    Port 4001

Now you can just type ssh dev and get to work. You can set things like which identity file to use and anything else about your ssh setup on a per host basis or for ranges of IPs and hostnames with wildcards. It can be incredibly powerful. Just figure out what/how you want things to happen once and reap the rewards every day.
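For example, a single wildcard entry can cover a whole fleet of machines at once (the hostname pattern, user, and key file below are made up):

```
Host *.prod.example.com
    User deploy
    IdentityFile ~/.ssh/prod_key
```

With that in place, ssh web3.prod.example.com picks up the right user and key automatically.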

OS X Tools

I used to spend a bunch of time as I switched between projects or when getting started for the day launching apps and adjusting them on my monitors the way I like. This was repetitive and wasteful, so I automated it.

Look into tools like these to help automate non-Python/terminal based workflows you may have:

  • Divvy – Quickly resize and restore windows to certain sizes/locations
  • Keyboard Maestro – Automate nearly anything on OS X without having to write AppleScript
  • Alfred App – Also worth checking out; it has a rich plugin ecosystem you can take advantage of
Christmas socks

Reap the rewards

Hopefully you picked up a few new tricks for your stocking this year. Now lean back and kick your feet up with all that extra time you have! Happy Holidays!