Hi community,
I’m not familiar with Django’s ways of working, but perhaps this might be relevant or interesting to some of the developers hanging around here or maybe for the Django project.
Our project has a scenario where data is stored in large JSON blobs in the database. The data doesn’t need to be searched dynamically (one of Django’s cool ORM abilities) but is just stored as an intermediate for later processing. Storing it in a plain JSONField doesn’t compress the data at all, yet it still needs to be accessed quickly.
To support this scenario we’ve written a field that lets you transparently read and write JSON, but saves the data gzip-compressed at compression level 4. This saves us 130 gigabytes of data without having to change any of the other logic in the program. That is absolutely neat, and it might be very useful for other developers out there.
The field is a subclass of JSONField that just adds compression; the rest is handled by the superclass. Note that this is our first custom field. The process of writing it, and making sure it works on SQLite, MySQL and PostgreSQL, was pretty smooth.
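For readers curious what the compression side of such a field boils down to, here is a minimal stdlib-only sketch of the round trip (the helper names are mine, not from the actual field; in a real Django field this logic would sit in the value-conversion methods). Level 4 is the compression level mentioned above, and the gzip magic-byte check illustrates how existing uncompressed rows can still be read, matching the auto-upgrade behaviour described below:

```python
import gzip
import json

# Compression level used by the field described in this post.
COMPRESSION_LEVEL = 4

def compress_json(value):
    """Serialize a Python object to gzip-compressed JSON bytes."""
    raw = json.dumps(value).encode("utf-8")
    return gzip.compress(raw, compresslevel=COMPRESSION_LEVEL)

def decompress_json(blob):
    """Gunzip and parse back into a Python object.

    Falls back to plain JSON when the blob is not gzip-compressed,
    so rows written before the field existed still load.
    """
    if blob[:2] == b"\x1f\x8b":  # gzip magic bytes
        blob = gzip.decompress(blob)
    return json.loads(blob)

# Round trip: repetitive JSON compresses well at level 4.
data = {"scans": [{"id": i, "state": "finished"} for i in range(500)]}
packed = compress_json(data)
assert decompress_json(packed) == data
assert len(packed) < len(json.dumps(data).encode("utf-8"))
```

Whether the space savings are this dramatic depends on how repetitive your JSON is, of course; our 130 GB figure comes from large, highly regular blobs.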
It auto-upgrades existing values. We’ve saved 130 gigabytes of disk space, and we don’t have to work with random files on disk that may or may not get deleted when the record is deleted.
Code: websecmap/app/fields.py · master · Internet Cleanup Foundation / web security map · GitLab
Hopefully it’s useful to someone, or even makes it into the Django project. As this is the first custom field we’ve ever written, there might be things that are ‘off’. It worked seamlessly for us after just a few hours of developing and testing. Thank you for creating Django!
Regards,
Elger Jonker
Programmer on Web Security Map