node.js database migration

When you make your career using FOSS software, you’re always itching to someday give back to this awesome community of individuals that gave you that software. Here at spriteCloud, we’re no different.

As an individual, all you can do is spend your spare time learning and improving FOSS software. Arguably that is the most valuable contribution one can make to the community, but as a company, you can do just a little more: you can create a culture of giving back.

We’ve started something we’re really excited about: a framework for added automation functionality called Lapis Lazuli. In doing so, we decided that we’ll open source every part of it that we can. We’ll do so bit by bit, and on no fixed schedule, but we have a few things in mind already.

For today, let’s make a small start with node-any-db-migrate.


node.js is an exciting platform to build software on, because its toolkit approach avoids locking you into a framework’s restrictions. As a case in point, there are about a gazillion modules out there that let you talk to databases, and a few more that try to be agnostic about the exact database you’re talking to.

We’ve evaluated a bunch of those for our use case, and ended up choosing node-any-db. It appears to live up to its promise of being “thin and less-opinionated”.

For development and maintenance reasons, we also like the convenience of database migrations as popularized by Ruby on Rails. The basic concept is that instead of writing and maintaining a single file with your database schema, each change you make to the schema, whether an addition, modification or deletion, gets its own “migration” file. A migration file contains code to apply the changes, and code to remove the changes again. With that format, it’s possible to switch (migrate) between one set of changes and another pretty seamlessly.
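As a toy sketch (not the tool’s actual code), you can think of a migration as an object pairing an “apply” step with an “undo” step:

```javascript
// Conceptual sketch only: a migration pairs code that applies a change
// with code that removes it again, so you can move between schema states.
var schema = new Set();

var addUsersTable = {
  up:   function () { schema.add('users'); },
  down: function () { schema.delete('users'); }
};

addUsersTable.up();
console.log(schema.has('users'));  // true: the change is applied
addUsersTable.down();
console.log(schema.has('users'));  // false: the change is rolled back
```

Chaining many such pairs, in order, is all a migration tool really does.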

Again, there are tons of modules out there that provide database migrations for node.js. However, none of them use our favourite node-any-db, leading to all sorts of inconsistencies and complications when you try to combine them.

Our node-any-db-migrate is a very simple migration script that uses node-any-db.

Example Usage

Get started with installing the package:

$ npm install any-db-migrate

The package contains a migrate script you’re going to use to start producing and applying migrations. But first, a JSON file is required that describes how to connect to your database. By default, it’s called database.json.

{
  "test": "sqlite3://test.sqlite3"
}

The file has a very simple structure. Each key represents an environment, such as a test, development or production environment. For each key, the value is a connection string such as one you would pass to anyDB.createConnection(). You specify which environment/database you want to use with the --environment command line parameter, or -e for short.
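For illustration, a database.json with several environments might look like this (the development and production entries are hypothetical examples we made up, not something the tool requires):

```json
{
  "test": "sqlite3://test.sqlite3",
  "development": "sqlite3://development.sqlite3",
  "production": "postgres://user:password@localhost/mydb"
}
```

Running with -e production would then pick the postgres connection string.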

So let’s create our first migration.

$ migrate -e test create "my very first table"
Migration "1398849672905-my very first table" created.

Well that was simple. So what happened? Turns out the script created a migrations subdirectory, and in it a file called 1398849672905-my very first table.js. The first part of the file (and migration) name is just a timestamp, and the second part is the string you supplied on the command line.
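The naming scheme is easy to reproduce; something along these lines (migrationFileName is a hypothetical helper for illustration, not part of the package):

```javascript
// Hypothetical sketch of how such a file name is composed: a millisecond
// timestamp prefix, a dash, then the name you gave on the command line.
function migrationFileName(name, timestamp) {
  return timestamp + '-' + name + '.js';
}

console.log(migrationFileName('my very first table', 1398849672905));
// → 1398849672905-my very first table.js
```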

Let’s take a look at the file contents.

exports.up = function(db, callback) {
  // db.query('CREATE TABLE IF NOT EXISTS users (id integer, name text);', function(err, res) {
  //   callback(err);
  // });
  callback(null);
};

exports.down = function(db, callback) {
  // db.query('DROP TABLE users;', function(err, res) {
  //   callback(err);
  // });
  callback(null);
};

As you can see, the file is a tiny node.js module that exports two functions up() and down(), which each accept a database transaction db and a callback. Any errors you generate need to be passed to this callback, or you pass null if everything went well, in typical node.js fashion.

For now, let’s not bother with a real database schema, and just replace those callback(null); lines with the commented-out lines above.

The next step is to apply this migration. To illustrate what happens a bit better, let’s add the -v flag to make the script tell you what it’s doing.

$ migrate -e test -v up
Options: { migrations_dir: './migrations',
  databases_file: './database.json',
  verbose: true,
  environment: 'test',
  arguments: [],
  command: 'up' }
Applying "up" operation on migration "1398849672905-my very first table"...
Registering changes...
Committing transaction.

As you can see, the up command will apply any unapplied migrations, and that means calling the up() function in each of those migration files. There’s a corresponding down command, and commands that let you apply and revert individual migration files you specify.

The power of migrate lies in the fact that it records in the database which migration scripts it has run, so it’ll only apply those not yet run, in the order given by the file names. Since the file names are timestamped, that is the order in which they were created.
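The selection logic can be sketched roughly like this (pendingMigrations is an illustrative helper, not the module’s actual code):

```javascript
// Illustrative only: skip migrations already recorded as applied, and
// run the rest in file-name order, which is creation order thanks to
// the timestamp prefix.
function pendingMigrations(allFiles, applied) {
  return allFiles
    .filter(function (file) { return applied.indexOf(file) === -1; })
    .sort();
}

var files = [
  '1398850000000-add email column.js',
  '1398849672905-my very first table.js'
];
var applied = ['1398849672905-my very first table.js'];

console.log(pendingMigrations(files, applied));
// → [ '1398850000000-add email column.js' ]
```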

That’s all the magic there is. As we said at the beginning, it’s a small start.

But we find this module useful, and hope you do as well!

Written by: Mark Barzilay

Graduated with honors from TU Delft in 2007, studying Electrical Engineering and Media & Knowledge Engineering. Founded spriteCloud in 2009 and has worked on test automation ever since, helping small and large companies with their test automation strategy and infrastructure. Mark is also leading the development of an online platform for all your automated test results.
