Batching updates in `ManyToManyField` `add`

Hi, I have a ManyToManyField linking one model to another with a custom through model, as follows:

from django.db.models import CASCADE, ForeignKey, ManyToManyField, Model

class ModelA(Model):
    models_b = ManyToManyField("ModelB", through="ModelAModelBMapping")

    class Meta:
        default_related_name = "models_a"

class ModelB(Model):
    pass

class ModelAModelBMapping(Model):
    # Explicit through model so the mapping rows can carry history tracking.
    model_a = ForeignKey("ModelA", on_delete=CASCADE)
    model_b = ForeignKey("ModelB", on_delete=CASCADE)

(I recognize that in most circumstances I don’t need to define ModelAModelBMapping; in my app I’m using django-simple-history on that mapping table to track the history of associations between the two models.)

As far as I can tell, the correct way to link one object to many others using the ManyToManyField is as follows:

from typing import Iterable

a_instance: ModelA = ...  # an existing instance of ModelA
models_to_link: Iterable[ModelB] = ...  # ModelB instances that already exist in the DB
a_instance.models_b.add(*models_to_link)

I’m using Postgres as my database backend with Django 3.2. Looking at the generated SQL, it seems that, regardless of how many instances are passed to the add call, Django inserts all of the rows for models_to_link in a single statement, without any batching.

Is there a way to specify a batch size for these queries, or do I need to manually divide the add calls into batches?
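
For concreteness, this is roughly the manual chunking I have in mind; the helper name add_in_batches and the chunk size of 500 are just placeholders, not anything Django provides:

from itertools import islice
from typing import Iterable

def add_in_batches(a_instance: ModelA, models_to_link: Iterable[ModelB], batch_size: int = 500) -> None:
    # Pull fixed-size chunks off the iterable and issue one add() per chunk,
    # so each underlying INSERT on the through table stays bounded.
    it = iter(models_to_link)
    while True:
        chunk = list(islice(it, batch_size))
        if not chunk:
            break
        a_instance.models_b.add(*chunk)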

You are not required to use the .add() method for ManyToMany relationships. Since you already have an explicit through model, you can bulk-create its rows yourself, creating those entries explicitly.

It is more work than calling add, but bulk_create takes a batch_size argument, so you wouldn’t need to manage the batching yourself.

Oh, that’s a really good point - looks like bulk_create (with ignore_conflicts) will work fine and should even let me specify a batch size. Thanks!
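
For completeness, this is the shape of what I ended up with; batch_size=500 is just an example value, and ignore_conflicts=True only deduplicates if the through model has a unique constraint across (model_a, model_b):

# Build the through-model rows directly instead of calling add().
mappings = [
    ModelAModelBMapping(model_a=a_instance, model_b=model_b)
    for model_b in models_to_link
]

# bulk_create splits the INSERTs into chunks of batch_size, and
# ignore_conflicts=True skips rows that would violate a unique constraint
# on the through table (so re-running this won't raise on existing links).
ModelAModelBMapping.objects.bulk_create(
    mappings, batch_size=500, ignore_conflicts=True
)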