My server infrastructure should consist of two physical servers.
This would allow me to take one server offline for extended maintenance, or to use one of the two for testing or staging.
Server1 with an active HAProxy for load balancing to Nginx on Server1 and Server2. I know Nginx can do that too, but I prefer HAProxy for other reasons.
Server2 with a passive HAProxy in case Server1 goes down.
Both Server1 and Server2 run Nginx, Gunicorn and Django.
Server1 hosts the active MariaDB database “Customer”; Server2 keeps a daily backup. So the application normally uses the database on Server1.
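For context, this is roughly how I would point both app servers at the shared MariaDB on Server1 — a minimal sketch, where the hostname, credentials and database name are placeholders, not my actual values:

```python
# settings.py on both Server1 and Server2 -- placeholder values,
# just to illustrate that both app servers talk to the same MariaDB.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",  # MariaDB uses the MySQL backend
        "NAME": "customer",                    # placeholder DB name
        "HOST": "server1.internal",            # placeholder hostname for Server1
        "PORT": "3306",
        "USER": "django",
        "PASSWORD": "change-me",
    }
}
```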
So far it shouldn’t be a problem. But what about the internal Django tables, like auth_user, django_session, django_admin_log and so on?
At first I had thought of simply managing these internal Django tables separately in local databases on both servers.
But I store my Ajax visitor input in Django sessions, so it would be bad if one of the servers couldn’t access them. I don’t trust sticky sessions based on header data and load balancing. Therefore, I would like both servers to always have access to the sessions.
Can I also use the shared database on Server1 for the “internal Django tables”? Then the sessions for both servers would live on Server1. Or is it more advisable to run a Redis cluster across both servers?
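To make the two options concrete, these are the standard Django session backends I have in mind — the Redis location is a placeholder, not my actual setup:

```python
# Option A: database-backed sessions (Django's default engine), stored in
# the django_session table inside the shared DB on Server1.
SESSION_ENGINE = "django.contrib.sessions.backends.db"

# Option B: cache-backed sessions in Redis shared by both servers.
# "redis://server1.internal:6379/1" is a placeholder location.
# SESSION_ENGINE = "django.contrib.sessions.backends.cache"
# CACHES = {
#     "default": {
#         "BACKEND": "django.core.cache.backends.redis.RedisCache",
#         "LOCATION": "redis://server1.internal:6379/1",
#     }
# }
```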
Alternatively, I could store the Ajax data in a “Visitors” database on Server1 instead of in the session. Then the internal Django tables could again live separately on each server.
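If I went that route, I assume a Django database router would handle the split — something like this sketch, where "visitors" is a hypothetical app label and database alias, not something from my current project:

```python
class VisitorRouter:
    """Route the hypothetical 'visitors' app to its own database alias;
    everything else (including Django's internal tables such as
    auth_user and django_session) stays on 'default'."""

    route_app_labels = {"visitors"}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "visitors"
        return "default"

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "visitors"
        return "default"

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Only create 'visitors' tables in the 'visitors' DB, and
        # everything else only in 'default'.
        if app_label in self.route_app_labels:
            return db == "visitors"
        return db == "default"
```

This would be registered in settings via `DATABASE_ROUTERS = ["myproject.routers.VisitorRouter"]` (path is a placeholder).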
What is the usual way to have data available on both servers?
Thanks for your advice.