I’m writing a custom model field and would like it to come with a database CHECK constraint. The actual use case is database-level validation of Django’s choices, to reject invalid values (since the data can also be edited from outside of Django).
What is your question? Post your code
In my experience, the simplest way to define a database constraint is at the model level, rather than at the (custom) field level.
from django.db.models import CheckConstraint, Model, PositiveSmallIntegerField, Q
from django_stubs_ext.db.models import TypedModelMeta  # optional typing helper from django-stubs

class Format(Model):
    right_column = PositiveSmallIntegerField(default=12)

    class Meta(TypedModelMeta):
        constraints = [
            CheckConstraint(
                condition=Q(right_column__lte=24),  # the keyword was `check` before Django 5.1
                name='right_column <= 24',
            ),
        ]
If you’re very interested in keeping the constraint(s) close to the custom field definition, contribute_to_class() has some gotchas but can be made to work.
To avoid duplicate constraints during migrations, I had to bail out when cls.__module__ == '__fake__':
def contribute_to_class(self, cls: type[Model], name: str, private_only: bool = False) -> None:  # noqa: FBT001, FBT002
    super().contribute_to_class(cls, name, private_only=private_only)
    if cls.__module__ == '__fake__':
        return  # Avoid duplicate constraints when migrating
    # Ensure ModelState.from_model() considers constraints
    cls._meta.original_attrs['constraints'] = cls._meta.original_attrs.get('constraints', [])
    cls._meta.constraints.append(
        CheckConstraint(
            condition=Q(**{f'{name}__lte': 24}),  # the lookup keyword must be built dynamically
            name=f'{cls._meta.db_table}.{name} <= 24',
        )
    )
In general, database constraints can be column-specific or apply to the whole table. A constraint behaves identically no matter where it’s attached, so the choice probably only affects the readability of the raw SQL schema. If you’re very interested in a column constraint, the upper section of the above link explains how to implement the constraint in a way that ends up looking like the constraint of a PositiveIntegerField, but in a more database-independent way than the backend-specific data_type_check_constraints dictionary: by overriding the db_check method.