Why won't Django look for my Postgres schema?!

Until a few days ago, I never had any problem connecting to my DB service provider. Everything changed after my provider migrated from IPv4 to IPv6 and gave my project a new host and username.

Simply changing the host and user “should” have been enough to restore the connection, but since then Django has refused to connect to the relevant schema.

# settings.py
SUPABASE_SEARCH_PATHS = [
  'project-schema',
  #'public', 
] 

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        # 'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # "ENGINE": f"{PROJECT_NAME}.custom_db_backends.CustomDatabaseWrapper",
       
        "NAME": "postgres",
        "USER": os.getenv('supabase_user'), #"postgres",
        "PASSWORD": os.getenv('supabase_project_pw'),
        "HOST": os.getenv('supabase_project_host_v2'), # supabase has migrated from ipv4 to ipv6
        "PORT": "5432",
        "OPTIONS": {
          'options': f"-c search_path={','.join(SUPABASE_SEARCH_PATHS)}",
        },
    },
}

If I run python manage.py migrate, it creates the new tables in “public”, even though I explicitly told it not to in settings.

From python manage.py shell, I tried querying the models, and it complains that a table that should exist in that schema is missing.

>>> from app_user.models import User
>>> users = User.objects.all()
>>> for user in users:
...     print(user)

django.db.utils.ProgrammingError: relation "app_user_user" does not exist
LINE 1: ..."."is_staff", "app_user_user"."is_superuser" FROM "app_user_...

However, if I explicitly query that schema and table:

from django.db import connection

with connection.cursor() as cursor:
    cursor.execute('SELECT * FROM "project-schema".app_user_user')
    rows = cursor.fetchall()
for row in rows:
    print(row)


('pbkdf2_sha256$600000$uVSZ1UIKjQ0wWFmhTHWLap$f/of2NcHSZ2QtwnvqF/EOsvcIXKRvmst3ELrdTGvDWo=', datetime.datetime(2024, 1, 31, 18, 7, 53, 526892, tzinfo=datetime.timezone.utc), '', '', True, datetime.datetime(2024, 1, 19, 19, 2, 14, 239681, 
tzinfo=datetime.timezone.utc), UUID('cd977349-0c5e-4383-ac98-0bbbe7ef803d'), 'example_user', 'example@gmail.com', datetime.datetime(2024, 1, 19, 19, 2, 14, 764524, tzinfo=datetime.timezone.utc), False, False, '01HMHHMBFCG6E51YGHP1JR8S6E'

I get back the results I’m looking for, so it can’t be my DB provider’s problem.

Something in Django just refuses to search the schema I tell it to, which oddly never happened before I changed the host to the new one.

Side topic: I also tried using a custom database wrapper (the parts I’ve commented out above), and I could never get the engine to load, even though importing it by its path works fine:

# python manage.py shell
>>> from db_backend import custom_backend
>>> from db_backend.custom_backend import CustomDatabaseWrapper
>>> CustomDatabaseWrapper
<class 'db_backend.custom_backend.CustomDatabaseWrapper'>


    raise ImproperlyConfigured(
django.core.exceptions.ImproperlyConfigured: 'db_backend.custom_backend.CustomDatabaseWrapper' isn't an available database backend or couldn't be imported. Check the above exception. To use one of the built-in backends, use 'django.db.backends.XXX', where XXX is one of:
    'mysql', 'oracle', 'postgresql', 'sqlite3'
# project/db_backend/custom_backend.py
# (I have __init__.py)
# (I also have registered db_backend in INSTALLED_APPS)

from django.db.backends.postgresql.base import DatabaseWrapper as Psycopg2DatabaseWrapper

class CustomDatabaseWrapper(Psycopg2DatabaseWrapper):
    def get_new_connection(self, conn_params):
        conn = super().get_new_connection(conn_params)
        search_path = "not-public, another_schema"
        with conn.cursor() as cursor:
            cursor.execute(f"SET search_path TO {search_path}")
        return conn
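
For reference, Django resolves ENGINE by importing "<ENGINE>.base" and instantiating the DatabaseWrapper class it finds there, so ENGINE has to point at a package (e.g. "db_backend"), not at a class path like "db_backend.custom_backend.CustomDatabaseWrapper"; that is what the ImproperlyConfigured error above is complaining about. A minimal sketch of a layout that would satisfy the loader (module names here are assumptions, and the backend package only needs to be importable, not listed in INSTALLED_APPS):

# project/db_backend/base.py  (hypothetical; ENGINE would then be "db_backend")
from django.db.backends.postgresql.base import DatabaseWrapper as PostgresDatabaseWrapper

class DatabaseWrapper(PostgresDatabaseWrapper):
    """Force a search_path on every new connection."""

    def get_new_connection(self, conn_params):
        conn = super().get_new_connection(conn_params)
        with conn.cursor() as cursor:
            # The schema name is quoted because it contains a hyphen
            cursor.execute('SET search_path TO "project-schema"')
        return conn

# settings.py
# DATABASES["default"]["ENGINE"] = "db_backend"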

Let’s start by looking at some of the basics:

Please show what versions of Django, Python, and psycopg2 you’re using here. Also, what version of PostgreSQL are you connecting to?

As far as I’m aware, an identifier containing a hyphen must be quoted. Using not-public as a schema name is likely to cause problems in any number of areas.

For example: SET search_path to non-public; throws a syntax error in psql, while SET search_path to "non-public"; works.

(We avoid schema names with hyphens as a result of effects like this.)

I’d try changing the schema names from things like not-public to not_public.
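
If the hyphenated schema names have to stay, another option is to let the driver quote the identifiers when setting the path. A small sketch, assuming the psycopg2 driver used elsewhere in this thread and the schema names from the snippets above:

from django.db import connection
from psycopg2 import sql

schemas = ["project-schema", "public"]  # hyphenated names need quoting

with connection.cursor() as cursor:
    # sql.Identifier() double-quotes each name safely
    stmt = sql.SQL("SET search_path TO {}").format(
        sql.SQL(", ").join([sql.Identifier(s) for s in schemas])
    )
    cursor.execute(stmt)

This only affects the current session, so it would have to run on each new connection rather than once.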

I should post an update on this problem. The fix was to update the schemas exposed by my DB provider, ordered by priority.

Still, this fix doesn’t explain why Django can’t find the schema I tell it to.

SUPABASE_SEARCH_PATHS = [
  #'public', 
  'test',
]  # doesn't respect the search path here

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "postgres",
        "USER": os.environ.get('supabase_user', 'postgres'), #"postgres",
        "PASSWORD": os.environ.get('supabase_project_pw', 'PASSWORD'),
        "HOST": os.environ.get('supabase_project_host', 'HOST'),
        "PORT": "5432",
        "OPTIONS": {
          'options': f"-c search_path={','.join(SUPABASE_SEARCH_PATHS)}",
          # schema search path doesn't work. You need to go to the Supabase SQL editor and do ALTER USER "name" SET search_path TO "schema"
        },
    },
}

SQL query run in the Supabase SQL editor:

ALTER USER "postgres" SET search_path TO public, test;

from django.db import connection
with connection.cursor() as cursor:
    cursor.execute('SHOW search_path')
    search_path = cursor.fetchone()
    print(search_path)

>>> ('public, test',) # I expected only the 'test' schema, since that is the path written in my Django settings, yet it shows the value I set with the SQL editor query
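
One way to see where that value is coming from is to list the per-role defaults that ALTER USER ... SET stores in the pg_db_role_setting catalog. A read-only sketch, run from the same Django shell:

from django.db import connection

with connection.cursor() as cursor:
    # Per-role / per-database defaults created by ALTER ROLE ... SET
    cursor.execute(
        "SELECT COALESCE(r.rolname, '(all roles)'), s.setconfig "
        "FROM pg_db_role_setting s "
        "LEFT JOIN pg_roles r ON r.oid = s.setrole"
    )
    for role, setconfig in cursor.fetchall():
        print(role, setconfig)

If the search_path only shows up here, the value you are seeing would be the role-level default rather than anything Django sent.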

I can’t recreate the symptoms you are describing in my test lab.

With my database settings including this:

        'OPTIONS': {
            'options': '-c search_path=tskww'
        },

A shell_plus session yields this:

In [1]: from django.db import connection

In [2]: cur = connection.cursor()

In [3]: cur.execute('show search_path')

In [4]: cur.fetchall()
Out[4]: [('tskww',)]

Also, working from a straight (non-Django) Python shell, the following works:

>>> conn = psycopg2.connect("<connection info redacted> options='-c search_path=tskww'")
>>> cur = conn.cursor()
>>> cur.execute('show search_path;')
>>> cur.fetchall()
[('tskww',)]

So, can you be more specific about what exactly is failing and how?

Side note: Django doesn’t do anything with the search_path that is set in the OPTIONS setting. It’s passed straight through to psycopg2.
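
If it helps to confirm that, you can ask the connection wrapper what it is about to hand to psycopg2. A small sketch (get_connection_params() just builds the keyword arguments for psycopg2.connect from your settings, so the exact keys may vary by Django version):

from django.db import connection

params = connection.get_connection_params()
params.pop("password", None)  # don't print credentials
print(params)  # should contain 'options': '-c search_path=test'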

Thanks for going through this problem with me.

Weird. I am also using the django shell_plus notebook environment:

import os
os.environ['DJANGO_ALLOW_ASYNC_UNSAFE'] = 'true'

import django
from django.db import connection
django.setup()

django.conf.settings.DATABASES
{'default': {'ENGINE': 'django.db.backends.postgresql',
  'NAME': 'postgres',
  'USER': 'username',
  'PASSWORD': pw,
  'HOST': host,
  'PORT': '5432',
  'OPTIONS': {'options': '-c search_path=test'}, # <<< explicit search path
  'ATOMIC_REQUESTS': False,
  'AUTOCOMMIT': True,
  'CONN_MAX_AGE': 0,
  'CONN_HEALTH_CHECKS': False,
  'TIME_ZONE': None,
  'TEST': {'CHARSET': None,
   'COLLATION': None,
   'MIGRATE': True,
   'MIRROR': None,
   'NAME': None}}}
#
with connection.cursor() as cursor:
    cursor.execute('SHOW search_path')
    search_path = cursor.fetchone()
    print(search_path)

# ('"\\$user", public, extensions',) # << where django is gonna migrate
# explicitly setting search path from shell
DATABASE_SCHEMAS = ['test']

with connection.cursor() as cursor:
    cursor.execute("SET search_path TO %s;" % ','.join(DATABASE_SCHEMAS))

with connection.cursor() as cursor:
    cursor.execute('SHOW search_path')
    search_path = cursor.fetchone()
    print(search_path)

# ('test',) # django now properly finds the schema!

Given this quirk, is there a way to set up a pre_migrate signal so that the path is set correctly before migrations run? Would such a signal cause inconsistency between migrations in the dev environment (db.sqlite3) and the production environment (Postgres)?

from django.conf import settings
from django.db import connection
from django.db.models.signals import pre_migrate
from django.dispatch import receiver

@receiver(pre_migrate)
def set_search_path(sender, **kwargs):
    with connection.cursor() as cursor:
        search_path = settings.DATABASE_SCHEMAS[0]
        cursor.execute(f"SET search_path TO {search_path};")
        print(f"Search path set to: {search_path}")

I came up with a solution using the pre_migrate signal.

The problem is that while it does the trick, it introduces yet another fatal bug.

If ‘public’ (which is the default schema) doesn’t contain a django_migrations table, then migrating to ‘test’ always results in a duplicate-table error.

Conversely, if ‘public’ does contain django_migrations, Django never migrates to ‘test’, regardless of schema changes, because it respects the table in ‘public’ first.

The same issue keeps coming back: Django always looks in ‘public’ first before doing anything else. The pre-migration signal is a one-time trick that hides that fundamental problem.
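
A quick way to see which schema(s) actually hold the bookkeeping table is to ask the pg_tables catalog from manage.py shell; a small sketch:

from django.db import connection

with connection.cursor() as cursor:
    cursor.execute(
        "SELECT schemaname FROM pg_tables WHERE tablename = 'django_migrations'"
    )
    print([row[0] for row in cursor.fetchall()])  # e.g. ['public'] or ['public', 'test']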

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'pgtest',  # Your database name
        'USER': 'postgres',  # Your database user
        'PASSWORD': 'password',  # Your database password
        'HOST': 'localhost',  # Your database host
        'PORT': '5432',  # Your database port
        "OPTIONS": {
          'options': f"-c search_path={','.join(DATABASE_SCHEMAS)}",
        },
    },
}

I confirmed the results from your test lab using a local Postgres server. Migrations land in different schemas without any issues.

Something about my DB provider is not respecting the database OPTIONS from the Django settings.

It seems there is hardly a way to avoid passing this option from Django without things getting troublesome, so I need to understand what is causing the provider to ignore it.

        "OPTIONS": {
          'options': f"-c search_path={','.join(DATABASE_SCHEMAS)}",
        },
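
One way to narrow that down would be to repeat the plain-psycopg2 test from earlier in the thread, but against the provider host instead of a local server. A sketch using the environment variable names from the settings above; if SHOW search_path still comes back wrong here, the option is being dropped somewhere between the client and the database rather than by Django:

import os
import psycopg2

# Same connection details as the Django settings, bypassing Django entirely
conn = psycopg2.connect(
    dbname="postgres",
    user=os.environ["supabase_user"],
    password=os.environ["supabase_project_pw"],
    host=os.environ["supabase_project_host"],
    port=5432,
    options="-c search_path=test",
)
with conn.cursor() as cur:
    cur.execute("SHOW search_path")
    print(cur.fetchall())  # expect [('test',)] if the option reaches the server
conn.close()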