Django: create clone objects and keep them in a replica database

After reading the Django multiple databases documentation, we can use multiple databases, but we can't keep clone objects in another database. Let me explain a little: assume you have two models, Teacher and Student. You can use two separate databases for Teacher and Student, but you can't use another database to keep clone objects of your Teacher and Student models. So here we will use Django signals to keep clone objects in our replica database. A signal will be triggered and will create a clone object whenever an object is created in our model. Here is my code:

```python
DATABASES = {
    'default': {
        'NAME': 'primary_database',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'USER': 'root',
        'PASSWORD': '',
    },
    'replica1': {
        'NAME': 'replica1_database',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'USER': 'root',
        'PASSWORD': '',
    },
}
```

```python
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver


class Contact(models.Model):
    name = models.CharField(blank=True, null=True, max_length=100)


@receiver(post_save, sender=Contact, dispatch_uid="clone_objects")
def clone_to_replica1(sender, instance, created, **kwargs):
    if instance._state.db == 'replica1':
        return  # the clone's own save must not trigger another clone
    if created:  # create the clone object
        Contact.objects.using('replica1').create(pk=instance.pk, name=instance.name)
    else:  # update the clone object
        Contact.objects.using('replica1').filter(pk=instance.pk).update(name=instance.name)
```

Here I am triggering a signal to create a clone object in my replica1_database.

Now run `python manage.py makemigrations contact` and `python manage.py migrate contact`; these two commands apply the migrations to your default database. This is the most important step: you have to run `python manage.py migrate --database=replica1`. This applies the migrations to your replica1 database.

I think it is also a good idea to keep a backup database to avoid any unexpected situations, such as the server going down.

First, there are no guarantees that the two databases will remain identical. There are any number of race conditions existing that would cause the primary keys to get out of sync with each other.

This really isn’t something that should be handled within an application. Both PostgreSQL and MySQL/MariaDB have database-layer replication options that are going to be far more robust than what can be done within Django.

Thanks for your suggestions. I really didn't know that those databases have their own replication methods.

Yes, nothing anyone does at the application level will work as well as database-level replication (master/slave, master/master, source/replica; use the new parlance or the old one, the principle is the same).

One master, n slaves is very often used and, as Ken pointed out, will ensure consistency across all instances.

Now this will enable you to scale for read operations.
Scaling for write operations will be a little more involved and can (should?) come later.

In either case, you can start by using database routers at the Django level, or the equivalent provided by your database vendor (whichever it may be).

At python/django level, it is a surprisingly easy thing to do: Multiple databases | Django documentation | Django

db_for_write will most likely be your master;
db_for_read will most likely be one of your slaves.
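As a sketch of that routing split, here is a minimal router following the pattern in the Django docs; the class name and the replica alias list are assumptions:

```python
import random


class PrimaryReplicaRouter:
    """Minimal sketch: writes go to the primary ('default'),
    reads are spread across the replica aliases."""

    replicas = ['replica1']  # add more replica aliases here as they come online

    def db_for_read(self, model, **hints):
        # Pick a replica at random to spread read load.
        return random.choice(self.replicas)

    def db_for_write(self, model, **hints):
        # All writes go to the master.
        return 'default'

    def allow_relation(self, obj1, obj2, **hints):
        return True  # every database holds the same data

    def allow_migrate(self, db, app_label, **hints):
        return True  # migrate every database, as in the steps above
```

It would be registered in settings with `DATABASE_ROUTERS = ['myproject.routers.PrimaryReplicaRouter']` (the module path is hypothetical).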

Setting up a MySQL master/slave is not too hard, and it is easy to test on a local installation. If you intend to go that route (pun intended, haha), though that is beyond Django in general, and you use MySQL:

  1. Use GTIDs.
  2. Use semi-synchronous replication.
  3. Do not store sessions in the database; use another storage backend, or deactivate their replication.

Have fun! :slight_smile: