We recently needed to move a few of our Django sites from their existing Postgres databases to MySQL.

We didn’t use any Postgres-specific database features, so our initial approach was to use MySQL Workbench to migrate the databases directly. After much tweaking and several failed attempts, however, we realised there was a more straightforward solution: use JSON as an intermediary format, and let Django’s fixture system do the work.

The management command we’re after is dumpdata - this extracts data to a neutral JSON representation that we can then import into a new MySQL database using loaddata.
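For context, the fixture format dumpdata produces is a JSON array of records, each naming its model as "app_label.modelname", its primary key, and a dict of field values. A minimal sketch (the user and post records here are hypothetical, not from a real dump):

```python
import json

# Two hypothetical fixture records in the shape dumpdata emits.
# Foreign keys (blog.post -> auth.user) are serialized as the
# related row's primary key by default.
fixture = json.loads("""
[
  {
    "model": "auth.user",
    "pk": 1,
    "fields": {"username": "alice", "is_active": true}
  },
  {
    "model": "blog.post",
    "pk": 1,
    "fields": {"author": 1, "title": "Hello"}
  }
]
""")

for record in fixture:
    print(record["model"], record["pk"])
```

loaddata simply walks this list and inserts each record into the target database, which is why the format is database-agnostic.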

Though this is a fairly reliable process (certainly much more reliable than converting between SQL formats), it does have a few notable bugs, and it is slow. Django loads every row in your database into memory, creates the JSON representation, and then writes it to disk, so this approach takes a lot longer than a normal database insert operation, especially if you are dealing with large data sets.

Enough warnings over, on to the process.

First, we need to get our Django settings in place. In your settings file you’re likely going to have something like this:

    DATABASES = {
        'default': dj_database_url.config(),
    }

You need to add a new MySQL settings entry alongside it:

    DATABASES = {
        'default': dj_database_url.config(),
        'mysql': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'my-database',
            'USER': 'my-user',
            'PASSWORD': 'my-password',
            'HOST': 'localhost',
        },
    }
Once that’s complete, we create the tables in the new MySQL database (note that for older versions of Django you’ll need to use syncdb instead):

manage.py migrate --database=mysql --no-initial-data

Now, despite the no-initial-data argument, it’s worth manually checking that the new tables are empty, otherwise you might hit some foreign key conflicts on import. If you want a quick set of SQL commands to truncate all your tables, use:

manage.py sqlflush --database=mysql
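If you do need to run those statements, one way is to pipe them straight into the database client via dbshell (this assumes the mysql alias defined above, and that your MySQL client accepts SQL on stdin):

```shell
# Generate the flush statements and feed them to the MySQL client
manage.py sqlflush --database=mysql | manage.py dbshell --database=mysql
```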

Next up, exporting our data from Postgres. First the command:

manage.py dumpdata --natural --all --indent=2 --exclude=sessions > /tmp/data.json 
  • --natural tells our command to use natural keys.
  • --all ensures we use Django’s base manager, to avoid conflicts with custom managers.
  • --indent is optional, but helps make the file a little more human-readable.
  • --exclude is again optional, but it can be handy to reduce the export time.
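The natural keys flag matters most for foreign keys. Without it, a FK is serialized as the related row’s auto-increment pk; with it, the FK becomes the related object’s natural key (for auth.User, the username). A hypothetical before/after for a post’s author field:

```python
import json

# Default serialization: the FK is the related row's pk, which may
# not match after rows are re-inserted into a fresh database.
with_pks = json.loads(
    '{"model": "blog.post", "pk": 1, "fields": {"author": 3}}'
)

# With natural keys: the FK is looked up by username on import,
# so it survives pk renumbering in the target database.
with_natural = json.loads(
    '{"model": "blog.post", "pk": 1, "fields": {"author": ["alice"]}}'
)

print(with_pks["fields"]["author"], with_natural["fields"]["author"])
```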

Now that we have a JSON file containing all our data, it’s time to get it into our newly prepared database:

manage.py loaddata data.json --database=mysql

Now, all of your data should be available in your new MySQL database exactly as it was before. To make this your new default database, it’s just a case of switching the default database in your settings over, and you’re finished.

The above process tends to work fine for straightforward database schemas, but you might find you have some problems at the loaddata stage if you have some complex foreign key relationships. The most likely cause of these errors is the data restore order, and this is where the exclude argument comes in.

It is simple enough to figure out from the error messages which apps/models need to be imported first; you can then pass the relevant arguments when dumping your data to export those first, and follow up with a second export of the rest of the data.
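Working out a safe order is essentially a topological sort over the FK graph: each model must be loaded after every model its foreign keys point at. A small sketch using Python’s standard library (the model names and dependencies here are hypothetical):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Map each model to the models its foreign keys reference,
# i.e. the models that must be loaded before it.
depends_on = {
    "test.Foo": {"users.UserProfile"},
    "users.UserProfile": {"auth.User"},
    "auth.User": set(),
}

load_order = list(TopologicalSorter(depends_on).static_order())
print(load_order)  # dependencies first, dependants last
```

In practice you rarely need to formalise it like this; the loaddata error messages point at the offending models, and two or three passes of exclude/include usually suffice.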

As a guide to the arguments, the below example outputs all model data from the test app other than the Foo model data as well as the UserProfile model from the users app:

manage.py dumpdata --natural --all --indent=2 test users.UserProfile --exclude=test.Foo > data.json 

The Django JSON importer works by creating blank entries in required FK tables when importing, with the relevant foreign key data filled but all other fields blank. This means that the PK will be automatically generated and you’ll likely end up with broken foreign keys. The solution:

  • export all the models that don’t cause conflicts on import.
  • import them into the new database.
  • manually truncate the offending auto-populated table(s) using something like:

        TRUNCATE `profile_userprofile`;

  • export the remaining models.
  • import those into the new database.
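Put together, a single pass of that workaround looks something like the following (the app and model names are placeholders for whichever models conflict in your schema):

```shell
# 1. Dump everything except the conflicting model, and load it
manage.py dumpdata --natural --all --indent=2 --exclude=users.UserProfile > /tmp/safe.json
manage.py loaddata /tmp/safe.json --database=mysql

# 2. Truncate the auto-populated table(s) in the MySQL shell,
#    e.g. TRUNCATE `profile_userprofile`;

# 3. Dump the remaining model and load it on its own
manage.py dumpdata --natural --all --indent=2 users.UserProfile > /tmp/rest.json
manage.py loaddata /tmp/rest.json --database=mysql
```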

This will be a process of trial and error and might need a couple of extra load/delete steps for more complex setups. Of course, this can be helped by having a good understanding of the underlying models and their relationships, which should give you a good idea of what’s likely to break.