I have a project with 250,000 users and a traffic load of 100,000 requests per second.
The project consists of four microservices, each implemented as a separate Django project with its own Dockerfile.
I’m currently facing challenges related to handling users and requests at this scale.
Can Django effectively handle 100,000 requests per second in this setup, or are there specific optimizations or changes I need to consider?
Additionally, should I use four separate databases for the microservices, or would it be better to use a single shared database?
If these are real requirements in a real situation, and not just optimistic hopes for a very popular web site, then I suggest you hire expertise capable of walking you through everything that may need to be done to address this.
Finding the best answers for you will require a much deeper understanding of both the environment and the application.
There is no standard boilerplate or cookie-cutter solution for this, because your issues will go far beyond what Django itself is responsible for. (You touch on this briefly with your comment regarding the database, but the issues extend a lot farther than that.)
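For a sense of why this is bigger than Django tuning, a back-of-the-envelope Little's law estimate is useful: with synchronous workers, the concurrency you must sustain is roughly throughput times mean latency. The latency and per-host worker figures below are illustrative assumptions, not measurements from your system:

```python
import math

# Rough capacity estimate via Little's law: concurrency = throughput * latency.
# Only TARGET_RPS comes from the question; the rest are assumed for illustration.

TARGET_RPS = 100_000       # stated target load
MEAN_LATENCY_S = 0.05      # assumed 50 ms mean response time per request
WORKERS_PER_HOST = 16      # assumed synchronous workers per application host

# Concurrent in-flight requests the whole fleet must sustain:
concurrent = TARGET_RPS * MEAN_LATENCY_S

# Hosts needed if each sync worker handles one request at a time:
hosts_needed = math.ceil(concurrent / WORKERS_PER_HOST)

print(f"concurrent requests in flight: {concurrent:.0f}")
print(f"hosts at {WORKERS_PER_HOST} sync workers each: {hosts_needed}")
```

Under those assumptions you are looking at thousands of in-flight requests and hundreds of application hosts before you even consider the database, the load balancers, or the network, which is why this is an architecture and operations problem as much as a Django one.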