Table does not exist when testing

I am getting an error when running unit tests:

psycopg2.errors.UndefinedTable: relation "generic_sample_meta_data" does not exist
LINE 1: INSERT INTO "generic_sample_meta_data" ("name", "prefix", "c...

My situation is that I am writing a Django project that connects to an existing database. I am using Python 3.11.0 and Django 5.0.6, and the database is PostgreSQL, on Windows 11.

The lines causing the failure are:

        meta = GenericSampleMetaDatum(name='Scrubber', status=0, prefix='Sc')
        meta.save()
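
For context, the failing lines sit inside an ordinary test case; stripped down, it looks roughly like this (the class and method names here are placeholders, not my real test):

from django.test import TestCase

from samples.models import GenericSampleMetaDatum


class GenericSampleMetaDataTests(TestCase):  # placeholder name
    def test_create_meta_datum(self):  # placeholder name
        meta = GenericSampleMetaDatum(name='Scrubber', status=0, prefix='Sc')
        meta.save()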

My model is set up like this:

class GenericSampleMetaDatum(models.Model):
    name = models.CharField(max_length=255)
    prefix = models.CharField(max_length=5, blank=True, null=True)
    columns = models.CharField(max_length=255, blank=True, null=True)
    #...

    class Meta:
        managed = False  # Django should not create, modify, or delete this table
        db_table = 'generic_sample_meta_data'

If I run makemigrations and migrate, Django tells me everything is up to date. There are several tables in the migration, but this is the relevant bit:

class Migration(migrations.Migration):

    initial = True

    dependencies = [
    ]

    operations = [
        migrations.CreateModel(
            name='GenericBatchMetaDatum',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=255)),
                ('prefix', models.CharField(max_length=5)),
                ('columns', models.CharField(blank=True, max_length=255, null=True)),
                # ...
            ],
            options={
                'db_table': 'generic_batch_meta_data',
                'managed': False,
            },
        ),
    ]

I have yet to get the website up and running, but if I go into the Django shell, I can confirm that the database table is there, with data (I can also see it in pgAdmin).

>>> from samples.models import *
>>> GenericSampleMetaDatum.objects.count()
33
>>> GenericSampleMetaDatum.objects.first()
<GenericSampleMetaDatum: GenericSampleMetaDatum object (2)>

Of course, that is the development database, not the test database. The problem is that the table is not in the test database, and I wonder if that is because the model is flagged as not managed.

I found this article, which has two solutions. I tried the first, modified for a more recent version of Django.

So my test runner is this:

from django.apps import apps
from django.test.runner import DiscoverRunner


class ManagedModelTestRunner(DiscoverRunner):
    def setup_test_environment(self, *args, **kwargs):
        # Temporarily flag every unmanaged model as managed, in the hope
        # that the test database setup will then create their tables.
        self.unmanaged_models = [m for m in apps.get_models()
                                 if not m._meta.managed]
        for m in self.unmanaged_models:
            print(m)
            m._meta.managed = True
        super().setup_test_environment(*args, **kwargs)

    def teardown_test_environment(self, *args, **kwargs):
        super().teardown_test_environment(*args, **kwargs)

        # Reset the unmanaged models.
        for m in self.unmanaged_models:
            m._meta.managed = False

But that made no difference. It does print out the unmanaged models, so the runner is definitely being used, but either it is not actually marking the models as managed, it is doing so too late, or this is not the issue at all.

At this point I am at a loss as to how to proceed!

Edited to add:

I also tried setting the model to managed in the model definition, and that did not fix it. Changing the setting in the migration itself was also to no avail. Note that I did not run makemigrations or migrate.

I tried running the tests with --keepdb and inspected the database to confirm that the table is indeed missing.

All this is making me think that the database schema, or some summary of the migrations, is maintained somewhere else, and Django is using that?
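
One thing I have not tried yet, and am only sketching here (untested against this project), is to stop toggling the managed flag and instead have the test runner create the missing table itself once the test database has been set up:

from django.db import connection
from django.test.runner import DiscoverRunner


class UnmanagedModelTestRunner(DiscoverRunner):
    def setup_databases(self, **kwargs):
        # Let Django create the test database and run migrations as usual...
        old_config = super().setup_databases(**kwargs)
        # ...then create the table for the unmanaged model by hand.
        from samples.models import GenericSampleMetaDatum
        with connection.schema_editor() as schema_editor:
            schema_editor.create_model(GenericSampleMetaDatum)
        return old_config

Since the test database is destroyed again in teardown_databases, I assume nothing needs dropping afterwards.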

Why use managed=False?
I think setting managed=True and running a migration is the solution.

Because this is an existing database, and I do not want Django to mess around with the tables full of data.

I have resolved the issue by deleting the tables from the development database, in the hope that I can import them again later. Django offers “managed” as an option, so what I am doing is well established; there must be a better approach.

Do you have a foreign key or a many-to-many field in the GenericSampleMetaDatum model?
If you do, please post the full model code and error message.

If not, I recommend setting managed=True and running python manage.py migrate --fake {GenericSampleMetaDatum app name}.

Using --fake records the migration as applied without modifying the database.
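
Roughly like this (assuming your app is called samples, as in your shell session; adjust the names if not):

# samples/models.py -- let Django manage the existing table
from django.db import models


class GenericSampleMetaDatum(models.Model):
    name = models.CharField(max_length=255)
    prefix = models.CharField(max_length=5, blank=True, null=True)
    columns = models.CharField(max_length=255, blank=True, null=True)
    # ...

    class Meta:
        managed = True  # or simply remove the line; True is the default
        db_table = 'generic_sample_meta_data'

# Then record the change without touching the existing table:
#   python manage.py makemigrations samples
#   python manage.py migrate samples --fake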

Or, how about using a proxy model?