I posted a comment on ticket 24306 about possibly using unlogged tables for the database cache. Would there be any interest in adding that functionality to the createcachetable command?
createcachetable is database backend agnostic. I don't think we can reasonably add a PostgreSQL-specific option there.
It's possible to take the cache table SQL and put it into a migration. Then you can customize it as you like, including adding the UNLOGGED option.
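To illustrate, here's a sketch of what that could look like. The schema below is roughly what createcachetable emits for PostgreSQL, with UNLOGGED added; the table name my_cache_table is a placeholder, and the RunSQL wrapper shown in the comment is the standard way to run raw SQL from a migration:

```python
# Hypothetical sketch: the cache table SQL, adapted with the UNLOGGED keyword,
# ready to drop into a migrations.RunSQL() operation. Names are placeholders.
FORWARD_SQL = """
CREATE UNLOGGED TABLE my_cache_table (
    cache_key varchar(255) NOT NULL PRIMARY KEY,
    value text NOT NULL,
    expires timestamp with time zone NOT NULL
);
CREATE INDEX my_cache_table_expires ON my_cache_table (expires);
"""
REVERSE_SQL = "DROP TABLE my_cache_table;"

# In a migration file you would wrap these, e.g.:
#
# from django.db import migrations
#
# class Migration(migrations.Migration):
#     dependencies = []
#     operations = [
#         migrations.RunSQL(sql=FORWARD_SQL, reverse_sql=REVERSE_SQL),
#     ]
```

You'd still point CACHES at the same table name in settings, so the backend itself needs no changes.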
In Django-MySQL, I created a MySQL/MariaDB-specific cache backend. It provides a command for creating such a migration for its table (see "Cache" in the Django-MySQL 4.12.0 documentation). Django could take this approach.
Thanks - I like the idea of putting the cache table SQL into a migration.
Is anyone interested in having a PostgreSQL UNLOGGED database cache backend in Django, or is that too specific?
Not trying to be rude, but have you done the homework to measure whether there are any real performance gains with an UNLOGGED cache table? From my own testing and other folks' anecdotes, it's really not that noticeable.
If we're talking about a read-heavy workload, then all you'd need to do is increase the memory & adjust the settings, and Postgres will already be as fast as it's going to get. Unfortunately there's no secret performance knob beyond tuning memory and settings to the workload.
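For concreteness, these are the usual postgresql.conf knobs people mean by "up the memory & adjust the settings"; the values below are purely illustrative and depend entirely on your hardware and workload:

```
# Illustrative starting points for a read-heavy workload on a ~16 GB RAM box
shared_buffers = 4GB            # ~25% of RAM is a common rule of thumb
effective_cache_size = 12GB     # hint to the planner about OS-level caching
work_mem = 32MB                 # per-sort / per-hash-node memory
```

None of these are cache-table-specific; they just let Postgres keep a hot, read-heavy table in memory.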
If Christophe Pettus (Postgres/Django expert on the old mailing list) were active on the forum he could chime in with some expert advice. Failing that, I'd recommend watching some of his presentations & reading his articles; they're quite good!
If you do end up measuring I’d be keen to hear what the results are!
Yeah, that's a good point. We saw an article that reported good performance gains for updates, but for a read-heavy cache I guess it doesn't make as much of a difference. I did some very rough performance checking, and the regular DB table performed about the same as the UNLOGGED table.