from django.db import models

class PublisherManager(models.Manager):
    def get(self, *args, **kwargs):
        ...

class Publisher(models.Model):
    name = models.CharField(max_length=200)
    objects = PublisherManager()

class Book(models.Model):
    publisher = models.ForeignKey(Publisher, on_delete=models.PROTECT)
    name = models.CharField(max_length=255)
There is a DRF serializer that returns each book together with its associated publisher. There are thousands of books but only about a hundred publishers (roughly a 1000:1 ratio), so it makes sense to keep the Publisher objects in an in-process Python cache after the first load, until the next app restart.
How can I cache the publishers using lru_cache, or any other cache that lives in the Python process?
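As a starting point, the process-local caching idea can be sketched with functools.lru_cache around a lookup helper. This is a minimal sketch: get_publisher and FAKE_DB are hypothetical stand-ins for Publisher.objects.get(pk=...) and the real publishers table.

```python
from functools import lru_cache

# Hypothetical stand-in for the publishers table (~100 rows in the real app).
FAKE_DB = {1: "Penguin", 2: "O'Reilly"}

@lru_cache(maxsize=256)
def get_publisher(pk):
    # In the real app this line would be Publisher.objects.get(pk=pk);
    # after the first call per pk, the result is served from process memory.
    return FAKE_DB[pk]

first = get_publisher(1)   # cache miss: reads the "database"
second = get_publisher(1)  # cache hit: no lookup at all
```

The usual caveats apply: entries live until the process restarts (or the LRU evicts them) and are never invalidated when the underlying row changes.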
Constraints and other info:
- I'm already using the Django cache framework with a Redis backend for other purposes, but I don't think Redis would give a significant boost here (it's the same network round trip plus a lookup).
- Adding a custom manager to the Publisher model with get, filter, and all methods doesn't help: the all method gets called while the foreign key is loaded, but no SQL parameters are available on the manager instance.
- The actual object lookup happens in the related descriptor's get.
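To illustrate where such a hook would sit, here is a plain-Python analogue of the forward descriptor's get, with the lookup memoized (no Django involved). CachedForwardDescriptor and fetch_publisher are hypothetical names; in Django itself the corresponding class is ForwardManyToOneDescriptor in django.db.models.fields.related_descriptors.

```python
from functools import lru_cache

# Hypothetical stand-in for the publishers table.
FAKE_PUBLISHERS = {1: "Penguin", 2: "O'Reilly"}

@lru_cache(maxsize=None)
def fetch_publisher(pk):
    # Stand-in for the SQL query Django would normally run per access.
    return FAKE_PUBLISHERS[pk]

class CachedForwardDescriptor:
    """Plain-Python analogue of the forward FK descriptor's __get__."""
    def __set_name__(self, owner, name):
        # Mirror Django's "<field>_id" attname convention.
        self.attname = name + "_id"

    def __get__(self, instance, owner=None):
        if instance is None:
            return self
        # The memoized lookup replaces the per-access database query.
        return fetch_publisher(getattr(instance, self.attname))

class Book:
    publisher = CachedForwardDescriptor()
    def __init__(self, publisher_id):
        self.publisher_id = publisher_id

b1, b2 = Book(1), Book(1)
b1.publisher  # first access: fetches
b2.publisher  # second access: served from the process cache
```

This only shows the shape of the idea; intercepting Django's real descriptor would affect foreign-key loading process-wide and carries the same staleness caveats as any process-local cache.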
Is there a way to change this behavior without writing a custom Field? The idea is to serve subsequent lookups from a cache for a period of time.