Unsigned Int Auto Increment Primary Keys

I’m migrating a legacy project with a large MySQL (MariaDB) database containing around 200 tables.

The existing database has unsigned int auto increment primary keys.

Foreign keys are also unsigned int.

How do I replicate this in Django? AutoField and BigAutoField seem to be signed. Is this going to cause me issues?

I can set it like this, but there is no auto increment.

id = models.PositiveIntegerField(primary_key=True)

No, this isn’t going to create any issues.

So how do I represent these unsigned auto increment primary key fields in a model?

You don’t. You treat them as signed when you migrate them to the new database and don’t worry about it at that point.

So the code would think they are signed and I would lose half the capacity?

Would foreign key references I add to existing models go in as the wrong type in managed migrations?

The standard Django default for an id field is a BigAutoField. As a signed 64-bit field, its maximum value is 9,223,372,036,854,775,807. I don’t think that’s going to be a problem.

I’m not following what you’re asking here.

You’re building a new application with new models. What you define in your new models are what’s going to be used.

All of the data will be saved as the new types when you migrate the data from the old tables to the new ones.

I have migrated the existing database, with --fake-initial.

All my primary keys are unsigned ints.

If I try to make any foreign key references to the existing tables, it fails: Django creates the foreign key column as an AutoField (i.e. signed), but the primary key it points to is still unsigned.

The only way out of this I can see is to change all existing unsigned PKs to signed. I’d either lose half the numeric capacity, or switch to BigInt and double the storage requirements, plus cause myself a lot of hassle. Not ideal. Am I missing something?

It depends upon how much data you’re talking about.

Half the range of an AutoField is still > 2,000,000,000 rows.

I find it highly unlikely that you’re going to end up with more than 2,147,483,647 rows but fewer than twice that - which is the only case where this distinction matters.
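(The ranges being compared here are easy to check directly:)

```python
# Maximum values for the MySQL integer types under discussion.
signed_int_max = 2**31 - 1     # AutoField: signed INT
unsigned_int_max = 2**32 - 1   # the legacy unsigned INT primary keys
signed_bigint_max = 2**63 - 1  # BigAutoField: signed BIGINT

print(signed_int_max)     # 2147483647
print(unsigned_int_max)   # 4294967295
print(signed_bigint_max)  # 9223372036854775807
```

So "losing half the capacity" means dropping from ~4.29 billion ids to ~2.15 billion, while BigAutoField raises the ceiling by nine orders of magnitude.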

I see no additional hassle here either way.

If you’re creating a new Django project, you’re going to want to create new models anyway. This is just part of the upgrade process.