Multi-DB migrations are applied but no tables are created when running tests

We’re having a problem with migrations being applied during construction of test databases, but no tables being created. Unfortunately our use case is a tad complicated, so please bear with me. I’ll shorten the code, hopefully without cutting out anything important. We’re using Django 3.2.9 on Python 3.7.6.

We have a model Project that acts, in a sense, as a tenant: data belonging to a given tenant is stored in a database exclusively for that project.

class Project(models.Model):
    short_name = models.CharField('Machine-friendly short name', max_length=50)


class DataPoint(models.Model):
    is_external = True
    # a bunch of other fields

    class Meta:
        db_table = 'data_point'

Some models are “external”, meaning that they are replicated across each project database, but not the default databases where the usual Django stuff is stored. To make that work we have a router handling this:

class ProjectDBRouter:
    """The router handling the task of sending requests to the project databases
    to the right place.

    This is necessary for three reasons:
        1.) The project databases have a completely different structure than
            the database used by Django and should only contain project data
        2.) The project databases in production are not to be managed by Django
        3.) The project test databases for local tests have to be constructed
            and populated completely by the testing code.

    Therefore this router has two tasks:
        1.) Sending data for reading and writing to the right databases based
            on the model (is it internal to Django or external?) and on whether
            we are currently performing tests or running in production mode
        2.) Allowing or disallowing migrations based on which model they
            concern, whether that model is internal/external, and whether the
            migration is part of creating/populating a test database
    """
    # pylint: disable=unused-argument,invalid-name

    def check_db(self, model):
        """Internal helper to check which DB to read/write from/to.

        Args:
            model (django.models.Model): The model class to check

        Returns:
            str: The name of the database to use, or None if the router is
                indifferent about it

        Raises:
            LookupError: If no tenant was set. This can happen when a function
                in the call stack does not use the project context manager or
                a view requiring a project is called on a non-project route
        """
        # external models ought to have an is_external attribute set to True
        if hasattr(model, "is_external") and model.is_external:
            db_name = get_tenant()
            if db_name is None or db_name == "":
                raise LookupError("No tenant was set in ProjectDBRouter for external model")

            if "test" in sys.argv:
                return TEST_PREFIX + PROJECT_PREFIX + db_name

            return PROJECT_PREFIX + db_name

        return None

    def db_for_read(self, model, **hints):
        """@see routers.check_db"""
        return self.check_db(model)

    def db_for_write(self, model, **hints):
        """@see routers.check_db"""
        return self.check_db(model)

    def allow_relation(self, obj1, obj2, **hints):
        # returning None means that the router is indifferent
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        externals = get_external_models()

        # if there are hints, we can possibly use these directly
        if "target_db" in hints:
            return db == hints["target_db"]

        if "is_external" in hints and hints["is_external"]:
            if db == "default":
                return False
            if "is_test_data" in hints and hints["is_test_data"]:
                return True

        # the default database only accepts internal models or migrations
        # that are not model specific
        if db == "default":
            return model_name is None or model_name not in externals

        # the project databases only accept migrations for the external models,
        # and only if the target is a test database
        if model_name is not None and model_name in externals:
            return db.startswith(TEST_PREFIX + PROJECT_PREFIX)

        # none of the other checks caught anything, so better safe than sorry
        return False
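For context, here is a minimal sketch of what a get_tenant helper backed by a context manager could look like. Our real implementation works as intended, so the names and details below are assumptions for illustration only:

```python
import contextvars
from contextlib import contextmanager

# Hypothetical sketch: a tenant lookup backed by a context variable, set via a
# context manager. Names are assumptions; our actual get_tenant/project
# context manager differs but behaves equivalently from the router's view.
_current_tenant = contextvars.ContextVar("current_tenant", default=None)


def get_tenant():
    """Return the short name of the active project, or None if unset."""
    return _current_tenant.get()


@contextmanager
def project_context(short_name):
    """Make `short_name` the active tenant for the duration of the block."""
    token = _current_tenant.set(short_name)
    try:
        yield
    finally:
        _current_tenant.reset(token)
```

With this shape, any code path that touches external models has to be wrapped in `with project_context("some-project"):`, otherwise check_db raises the LookupError above.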

Don’t worry about the functions get_tenant and get_external_models; they seem to work as intended. Of note is that this router implementation means that the project databases do not accept migrations unless the target is a test database. The project database connections are dynamically constructed in our settings module:

cache_filename = ("test_" if IS_TEST_CMD else "") + "project_list.dat"
with open(os.path.join(BASE_DIR, "caches", cache_filename), "rb") as file:
    projects = pickle.load(file)

    for project in projects:
        ALLOWED_HOSTS.append("{}.{}".format(project["short_name"], SITE_DOMAIN))

        if IS_TEST_CMD:
            conn_name = TEST_PREFIX + PROJECT_PREFIX + project["short_name"]
            DATABASES[conn_name] = {
                'ENGINE': DATABASE_BACKENDS["sqlite"],
                'NAME': conn_name,
                'TEST': {
                    'NAME': "{}.sqlite".format(conn_name),
                },
            }
        else:
            DATABASES[PROJECT_PREFIX + project["short_name"]] = {
                'ENGINE': DATABASE_BACKENDS[project["backend"]],
                'NAME': project["db_name"],
                'HOST': project["host"],
                'PORT': project["port"],
                'USER': project["user"],
                'PASSWORD': project["password_crypt"],
            }
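For completeness, a hypothetical sketch of how such a cache file might be written (the function name is made up; the dict keys mirror the ones the loading loop above accesses):

```python
import os
import pickle


def write_project_cache(base_dir, projects, is_test_cmd=False):
    """Hypothetical helper: pickle the project list to the cache file that the
    settings code reads. `projects` is a list of dicts with keys such as
    short_name, backend, db_name, host, port, user, password_crypt."""
    cache_filename = ("test_" if is_test_cmd else "") + "project_list.dat"
    path = os.path.join(base_dir, "caches", cache_filename)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as file:
        pickle.dump(projects, file)
    return path
```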

Reading a list from a file cache is not ideal, and we hope to replace this workaround with something better eventually. However, it at least allows us to specify exactly which project database connections are defined, which is useful for testing. The external model DataPoint above is created as part of a normal migration:

operations = [
    migrations.CreateModel(
        name='DataPoint',
        fields=[
            # a bunch of fields
        ],
        options={
            'db_table': 'data_point',
        },
    ),
]

Now, let’s say we have a relatively normal test:

class ViewTests(TestCase):
    fixtures = ["test_users.json"]
    databases = [
        # ... the two project test database connections ...
    ]

    @classmethod
    def setUpClass(cls):
        super().setUpClass()
        # ... shortened ...

    def test_index_not_logged_in(self):
        response = self.client.get("/projects/")
        self.assertEqual(response.status_code, 302)

The two project connections mentioned are the same two specified in the project list cache for tests and the fixture also contains data to create the Project objects in the default database after migrations are done (which is also why we need to specify which projects exist before the databases are populated).

Now, if we run the tests with python manage.py test, the test fails with sqlite3.OperationalError: no such table: data_point. To investigate we set a breakpoint and run the debugger (VS Code, but it shouldn’t matter). While stopped we can look at the project test databases. They contain only two tables, django_migrations and sqlite_sequence, but not data_point. In django_migrations we can see that the migration that should create the table was applied, yet no table was created.
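Incidentally, the same check can be done directly against the generated sqlite files, outside the debugger. A small helper (assuming nothing beyond the standard library sqlite3 module):

```python
import sqlite3


def table_names(db_path):
    """Return the sorted table names in an sqlite database file, by querying
    the sqlite_master catalog table."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        ).fetchall()
    finally:
        conn.close()
    return sorted(name for (name,) in rows)
```

Running this against one of the project test databases confirms what we saw in the debugger: django_migrations and sqlite_sequence are there, data_point is not.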

We also set a breakpoint in ProjectDBRouter.allow_migrate and cycled through the calls until we got to the call that checked a migration on the model DataPoint against the database tst_prj_aperture-science-labs and the router returned True as expected.

So in summary, we can confirm that the migration is allowed on the test database and is indeed applied, but no table is created. I must be overlooking something simple.

In searching for similar problems I came upon this thread and initially thought it not relevant. However, upon closer inspection I noticed that the migration in my question is applied to the default test database, when it should not be, and indeed is not applied during a normal run of python manage.py migrate. This suggests that something causes the migration process during construction of test databases to be confused about whether or not to apply a migration to a given database; a confusion that doesn’t seem to happen in manual execution of the migration process.

In addition, the presence of a migration in table django_migrations seems to only indicate that the migration as a whole was executed/applied, it does not indicate if any/all of the operations in it passed allow_migrate of all routers. That makes sense since a migration could have any number of different operations in it and it might be more helpful to think about allow_migrate in the context of operations, not migrations.
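That distinction can be illustrated with a plain-Python sketch (this is not Django’s actual code, just the shape of the behavior): the migration as a whole gets recorded, while each operation is gated individually by allow_migrate.

```python
# Sketch of the recording-vs-gating distinction: a migration is recorded in
# the recorder (think django_migrations) even if the per-operation gate
# rejected every one of its operations for that database.
def apply_migration(db, model_names, allow_migrate, applied_ops, recorder):
    for model_name in model_names:
        if allow_migrate(db, model_name):
            applied_ops.append((db, model_name))  # operation actually ran
    recorder.append(db)  # recorded regardless of what the operations did
```

So a row in django_migrations only tells you the migration was executed against that connection, not that any of its operations touched the schema.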

Since writing the post above we have managed to solve the problem. Remember when I said “Don’t worry about the function get_external_models”? There is a saying about what you’re making when you assume…

Here’s how that function works:

# per-request cache for a dict of external models. Use get_external_models
# to access the dict.
# this dict MUST be lazy-loaded, since trying to load it during module loading
# would run into cyclical dependencies between modules
EXTERNAL_MODELS = {}

def get_external_models():
    if EXTERNAL_MODELS == {}:
        for app in APPS_EXTERNALS:
            config = apps.get_app_config(app)
            models = config.get_models()
            for model in models:
                if hasattr(model, "is_external") and model.is_external:
                    EXTERNAL_MODELS[str(model.__name__).lower()] = True

    return EXTERNAL_MODELS


Since we’re still in the process of making tests work in this software project, this method was tested only manually, and it did what was expected. However, no tests means no protection against regression, and the external models were recently split into their own file without being imported into the app’s models module. This works fine when importing the models somewhere else (for example in the API), as they’re registered upon use, but it doesn’t work if you try to iterate over the models of an app to determine which are external…
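The fix was correspondingly small. A sketch of the idea (app and module names here are hypothetical, not our real layout): re-export the split-out models from the app’s models module so the app registry sees them when get_models() is called.

```python
# myapp/models/__init__.py  (hypothetical layout)
# Re-export the split-out external models so that
# apps.get_app_config("myapp").get_models() includes them, which is what
# get_external_models iterates over.
from .external import DataPoint  # noqa: F401
```

After that, get_external_models returns a non-empty dict, allow_migrate recognizes data_point as external, and the table is created in the project test databases.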

Moral of the story: Don’t assume. Check.
