One request at a time from user

Hi! I’m making a project in Django. I would like to prevent a user from making simultaneous requests. I’m making calls to other APIs, and I’m afraid that if a user’s requests are processed simultaneously, the server could make more API calls than needed, costing me more. What’s the best tool for this? My first idea was adding a field to the user table that acts as a flag marking when the server is busy processing a request for that user, but I don’t think this is the best idea. I think this could be done from middleware, but I’m lost there. Should I use the server cache? Should I use PostgreSQL locks? I don’t know what to do. Greetings.

Assuming all Django instances are sharing the same cache, this may be your best bet.

This has the advantage that you can set an expiration on the cache key, so that if the cache isn’t released by the other process for some reason, it will still expire in a reasonable time.
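To sketch that pattern without a running Django project, the stand-in below mimics the relevant part of Django’s cache API (an atomic `add()` that only sets a key if it is absent or expired, and `delete()`); the key name and helper functions are illustrative, not part of any library:

```python
import time
import threading


class FakeCache:
    """In-process stand-in for a shared cache (e.g. Django's cache API).

    Illustrates the lock-with-expiry pattern only; in production you
    would call cache.add()/cache.delete() on a shared backend instead.
    """

    def __init__(self):
        self._data = {}          # key -> (value, expires_at)
        self._lock = threading.Lock()

    def add(self, key, value, timeout):
        """Set key only if absent or expired; return True on success.

        Mirrors the atomic add() of django.core.cache backends.
        """
        with self._lock:
            entry = self._data.get(key)
            if entry is not None and entry[1] > time.monotonic():
                return False
            self._data[key] = (value, time.monotonic() + timeout)
            return True

    def delete(self, key):
        with self._lock:
            self._data.pop(key, None)


cache = FakeCache()


def try_acquire_user_lock(user_id, timeout=300):
    # The timeout guarantees the lock eventually expires even if
    # release_user_lock() is never called (crash, lost worker, etc.).
    return cache.add(f'user_request_{user_id}_in_progress', True, timeout)


def release_user_lock(user_id):
    cache.delete(f'user_request_{user_id}_in_progress')
```

The important property is that acquiring the lock is a single atomic operation, so two concurrent requests cannot both see “no lock” and proceed.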

Thanks, I’ll try the cache approach and post back if I’m successful. Sorry if this sounds too dumb, but is it a bad idea to define a boolean flag on the user model that is set while they’re making a request (either by modifying it directly or through middleware) and cleared when the request finishes? From what I’ve understood, the cache can also be stored in the database. What’s the difference between simply defining a flag and using the cache for this problem?
Also, you mentioned ‘Assuming all Django instances are sharing the same cache’. Would that assumption be violated if I’m running the application behind a cloud load balancer?

It can be, but shouldn’t be. See your options at Django’s cache framework | Django documentation | Django
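One way to make the assumption hold is to point every instance at the same networked backend. A sketch of such a `settings.py` entry, using Django’s built-in Redis backend (available since Django 4.0); the `LOCATION` host is a placeholder:

```python
# settings.py -- every instance behind the load balancer must use the
# same backend, or each process gets its own private "lock" and the
# check becomes meaningless. Host/port below are placeholders.
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.redis.RedisCache',
        'LOCATION': 'redis://cache.example.com:6379/0',
    }
}
```

By contrast, the default `LocMemCache` is per-process, which is exactly the configuration that would break a cross-instance lock.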

Possibly. It depends upon how the cache is configured.

Here’s the code that I made for the middleware in order to lock the user:

from django.core.cache import cache
from django.http import JsonResponse


class RequestFlagMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if not request.user.is_authenticated:
            return self.get_response(request)

        lock_key = f'user_request_{request.user.id}_in_progress'

        # cache.add() only sets the key if it doesn't already exist, and is
        # atomic on shared backends, avoiding the race between a separate
        # get() and set(). The timeout makes the lock self-expiring.
        if not cache.add(lock_key, True, timeout=300):
            return JsonResponse(
                {'message': 'Another request is already in progress'},
                status=409,  # Conflict
            )

        try:
            response = self.get_response(request)
        finally:
            # Clear the flag even if the view raised an exception.
            cache.delete(lock_key)

        return response

This is written for sync views. I’m in the process of understanding how middleware works with async. Thank you very much, Ken, for the effort you put into this forum; it’s appreciated.
Greetings.