Signal does not pick up update_fields if the model's Manager filters objects

So I have a Model that has some sensitive objects labeled with is_private=True.
Furthermore, all private objects should be filtered out by default, unless there is a User, in which case we can check whether they own the object.

from django.db import models
from django.db.models import Q


class SensitiveObjectManager(models.Manager):
    def __init__(self):
        super().__init__()
        self.user = None

    def for_user(self, user):
        """Create a new manager instance with the user context"""
        manager = SensitiveObjectManager()
        manager.model = self.model
        manager.user = user
        if hasattr(self, "core_filters"):
            manager.core_filters = self.core_filters
        return manager

    def get_queryset(self):
        qs = super().get_queryset()

        if hasattr(self, "core_filters"):
            qs = qs.filter(**self.core_filters)

        if self.user is None:
            return qs.filter(is_private=False)
        return qs.filter(Q(is_private=False) | Q(owner=self.user.id))


class SensitiveObject(models.Model):
    objects = SensitiveObjectManager()
    all_objects = models.Manager()

    is_private = models.BooleanField(default=True)
    owner = models.ForeignKey(User, on_delete=models.CASCADE)
    is_leaked = models.BooleanField(default=False)

This is designed because SensitiveObject.objects.all() is commonly used throughout our code, but with this Manager, we can always filter out the private objects. To include objects that the user owns, we can use SensitiveObject.objects.for_user(User).all().
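
For example (a quick sketch, where `user` is some authenticated user instance):

SensitiveObject.objects.all()                  # private objects are always hidden
SensitiveObject.objects.for_user(user).all()   # public objects plus the ones `user` owns
SensitiveObject.all_objects.all()              # completely unfiltered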

Everything works fine with it so far except for a very odd bug using django.db.models.signals.post_save. We want to catch when is_leaked is updated so that we can update a few other Models. So we have

post_save.connect(sensitive_object_updated, sender=SensitiveObject)

And then finally:

def sensitive_object_updated(sender, instance, created, update_fields, **kwargs):
    print(instance)
    print(instance.is_leaked)
    print(update_fields)

If SensitiveObject.is_private=True, then update_fields will NOT be included, which is very annoying. But when SensitiveObject.is_private=False, update_fields works fine. I tested this, and I know it is because of the SensitiveObjectManager. Furthermore, if you test it, you will see that instance and instance.is_leaked always have the correct values inside sensitive_object_updated (whether is_private is True or False). I'm pretty confident that this is a bug with Django.

Perhaps you could try: post_save.connect(sensitive_object_updated, sender=SensitiveObject._meta.get_field('all_objects').model)

I am not sure if it is the best solution, but all the fields could be updated this way - at least in theory :slight_smile:


Thanks for the reply. That's a pretty good idea, but unfortunately it does not work. I like the idea of trying to force the Signal to check SensitiveObject.all_objects instead of SensitiveObject.objects, so I tried changing the signal registration to:

post_save.connect(sensitive_object_updated, sender=SensitiveObject.all_objects.model)

But the result is the same: update_fields is still None when SensitiveObject.is_private=True.

(Note that SensitiveObject._meta.get_field('all_objects') and SensitiveObject._meta.get_field('objects') do not work, as they raise the error:

django.core.exceptions.FieldDoesNotExist: SensitiveObject has no field named 'all_objects'

)
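
For what it's worth, a manager's .model attribute is just the model class itself, which can be checked in a shell:

>>> SensitiveObject.all_objects.model is SensitiveObject
True

And post_save is always sent with the model class as sender, regardless of which manager performed the save, so both connect() calls register exactly the same thing.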

hmm…
Perhaps you could try to change the default manager:

class SensitiveObject(models.Model):
    # Define all_objects first to make it the default
    all_objects = models.Manager()
    objects = SensitiveObjectManager()

    is_private = models.BooleanField(default=True)
    owner = models.ForeignKey(User, on_delete=models.CASCADE)
    is_leaked = models.BooleanField(default=False)

    class Meta:
        default_manager_name = 'all_objects'  # Explicitly set default manager

# Then simple signal connection
post_save.connect(sensitive_object_updated, sender=SensitiveObject)

Surprisingly, this still did not work. After testing, the result is the same: saving an object with SensitiveObject.is_private=True does not pass update_fields to sensitive_object_updated, but with SensitiveObject.is_private=False, update_fields works fine.

To make things more difficult, another perk of the SensitiveObjectManager is that it also automatically filters sensitive objects when they are accessed through related models. For example:

class Groups(models.Model):
    sensitive_objects = models.ManyToManyField(
        SensitiveObject, related_name="sensitive_object_groups", blank=True
    )

If I have a group object

group = Groups.objects.create()
group.sensitive_objects.set(sensitive_objects_list)
group.sensitive_objects.all() # Returns only sensitive_objects where is_private is false
group.sensitive_objects.for_user(User).all() # Returns sensitive_objects the user has access to

The flexibility of this is great and already works with the current set-up. But I just wanted to mention this because when using

class SensitiveObject(models.Model):
    class Meta:
        default_manager_name = 'all_objects'

Then group.sensitive_objects.for_user(User) will actually raise the error: AttributeError: 'ManyRelatedManager' object has no attribute 'for_user'
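
Side note on why this happens: Django builds the dynamic ManyRelatedManager class from the related model's default manager class, so once the plain all_objects manager becomes the default, the generated related manager no longer has for_user. A common restructuring, sketched below purely as an illustration (visible_to is a made-up method name and this is not a drop-in replacement for our design), is to define the logic on a QuerySet and build the manager with Manager.from_queryset so the method also exists on related managers:

# Sketch only, not from our codebase.
from django.conf import settings
from django.db import models
from django.db.models import Q


class SensitiveObjectQuerySet(models.QuerySet):
    def visible_to(self, user=None):
        # Public objects, plus the given user's own objects.
        if user is None:
            return self.filter(is_private=False)
        return self.filter(Q(is_private=False) | Q(owner=user))


class SensitiveObject(models.Model):
    objects = models.Manager.from_queryset(SensitiveObjectQuerySet)()

    is_private = models.BooleanField(default=True)
    owner = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    is_leaked = models.BooleanField(default=False)

With that layout, group.sensitive_objects.visible_to(user) works on the related manager as well. The trade-off is that plain .all() is no longer filtered by default, which is exactly the behaviour we rely on, so this is only a sketch of the mechanism, not a fix.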

I’m about to create a ticket for this as it does seem like a bug with Django. I just wanted to add to this forum the test case I am using.

# tests.py
from django.contrib.auth import get_user_model
from django.test import TestCase

from sensitive_object.models import SensitiveObject


class SensitiveObjectTest(TestCase):
    def setUp(self):
        # Minimal setup so that self.user exists for the test below.
        self.user = get_user_model().objects.create_user(username="test-user")

    def test_example(self):
        private_object = SensitiveObject.objects.create(
            owner=self.user,
            is_private=True,
            is_leaked=False,
        )

        self.assertTrue(SensitiveObject.all_objects.filter(id=private_object.id).exists())
        # Thanks to the SensitiveObjectManager, private objects are filtered out of objects by default.
        self.assertFalse(SensitiveObject.objects.filter(id=private_object.id).exists())
        self.assertTrue(SensitiveObject.objects.for_user(self.user).filter(id=private_object.id).exists())

        private_object.is_leaked = True
        private_object.save()
        # Should see the print statements from `sensitive_object_updated`.
        # update_fields will be None

        public_object = SensitiveObject.objects.create(
            owner=self.user,
            is_private=False,
            is_leaked=False,
        )
        self.assertTrue(SensitiveObject.all_objects.filter(id=public_object.id).exists())
        self.assertTrue(SensitiveObject.objects.filter(id=public_object.id).exists())

        public_object.is_leaked = True
        public_object.save()
        # Should see the print statements from `sensitive_object_updated`.
        # update_fields will be correct

      
# apps.py
from django.apps import AppConfig


def sensitive_object_updated(sender, instance, created, update_fields, **kwargs):
    print("Instance:")
    print(instance)
    print("Instance.is_private:")
    print(instance.is_private)
    print("Instance.is_leaked:")
    print(instance.is_leaked)
    print("update_fields:")
    print(update_fields)


class SensitiveObjectConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "sensitive_object"

    def ready(self):
        from django.db.models.signals import post_save
        from sensitive_object.models import SensitiveObject
        post_save.connect(sensitive_object_updated, sender=SensitiveObject)

As pointed out on Trac, there is a misunderstanding of what update_fields is here.

It's not the set of model fields that were set to a different value since the last save call (Django doesn't keep track of that), but simply the value passed to Model.save(update_fields=...) to denote which fields should be part of the resulting UPDATE query.

Since no explicit update_fields is passed to save calls in the above code it will always be None as expected.
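
Concretely, with the private_object from the test above (a minimal sketch, not code from the project):

private_object.is_leaked = True

private_object.save()                             # handler receives update_fields=None
private_object.save(update_fields=["is_leaked"])  # handler receives frozenset({'is_leaked'})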


Ah, thanks for that. charettes was spot on; all it needed was `Model.save(update_fields=["is_leaked"])`.

For those wondering (or for myself in case I forget in the future), here is why that explanation might not seem to add up at first:

Since no explicit update_fields is passed to save calls in the above code it will always be None as expected.

If you read the full thread, this might not seem accurate, since update_fields does somehow show up when Model.is_private=False. If you take a look at the example test file (full example is above):

        public_object = SensitiveObject.objects.create(
            owner=self.user,
            is_private=False,
            is_leaked=False,
        )
        self.assertTrue(SensitiveObject.all_objects.filter(id=public_object.id).exists())
        self.assertTrue(SensitiveObject.objects.filter(id=public_object.id).exists())

        public_object.is_leaked = True
        public_object.save()
        # Should see the print statements from `sensitive_object_updated`.
        # update_fields:
        # frozenset({'is_leaked'})

The issue here is that I was not fully accurate in describing the code. The statement "since no explicit update_fields is passed to save calls in the above code it will always be None as expected" assumes that the model uses Django's default save() method, which can of course be overridden by models inheriting from models.Model.

I oversimplified the definition of class SensitiveObject(models.Model). In our code, it's actually more like:

class SensitiveObject(CustomModel):
    ...

class CustomModel(models.Model):
    def save(self, *args, **kwargs):
        # ... tracks the updates to the object and appends them to update_fields
        old_object = SensitiveObject.objects.filter(pk=self.pk)  # This had to be updated to SensitiveObject.all_objects.filter(...)
        # ...
        super().save(*args_with_tracked_update_fields)
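
For future readers, here is roughly what such a base class has to do (an approximation under the assumptions above, not our exact implementation; type(self)._base_manager plays the role of all_objects here, since the base manager is unfiltered by default):

from django.db import models


class CustomModel(models.Model):
    class Meta:
        abstract = True

    def save(self, *args, **kwargs):
        # Only compute update_fields for plain saves of existing rows where the
        # caller did not pass update_fields explicitly. This costs one extra
        # query per save.
        if self.pk is not None and "update_fields" not in kwargs:
            # The base manager is unfiltered, so private rows are still found;
            # using the filtered default manager here was the original mistake.
            old = type(self)._base_manager.filter(pk=self.pk).first()
            if old is not None:
                changed = [
                    field.name
                    for field in self._meta.concrete_fields
                    if not field.primary_key
                    and getattr(old, field.attname) != getattr(self, field.attname)
                ]
                if changed:
                    kwargs["update_fields"] = changed
        super().save(*args, **kwargs)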