Is there a best practice for using Celery in Django with async?

I implemented a sync API with DRF (hello_world), and I use Celery for my own tasks (sub_task). When the hello_world API view is called, I want sub_task to run in the background; I don't want to wait for a broker connection or anything else. I used the apply_async function, but if sub_task can't connect to the broker (e.g. a network timeout), an exception is raised.
I want to call sub_task in another thread. Is that a good idea?

import threading

from celery import shared_task
from rest_framework.response import Response
from rest_framework.views import APIView

@shared_task
def sub_task():
    print("Hello Sub task")

class HelloWorldView(APIView):

    def get(self, request, *args, **kwargs):
        # ................................
        # sub_task is a celery task
        sub_thread = threading.Thread(target=sub_task.apply_async)
        sub_thread.start()

        # ................................

        return Response(data={})

You don’t want to start threads or processes from within your Django process. That is always a bad idea.

Django should not be in a position of trying to manage other processes - use Celery the way it’s intended to be used, as a separately run process.

Thanks for your response.
Do you have any idea about my problem?
When the HelloWorld API is called, sub_task should not affect the response time; the result of sub_task is not important. The sub_task function writes a log entry to the database. If my broker is down, sub_task can't connect to the broker, or any other exception occurs, the response time should not be affected.
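One way to get that behaviour without spawning a thread is to wrap the publish call in a try/except so a dead broker can't crash the view. This is a minimal sketch; `enqueue_safely` and `fallback` are hypothetical helper names, not Celery API:

```python
import logging

logger = logging.getLogger(__name__)

def enqueue_safely(enqueue, fallback=None):
    # Hypothetical helper (not part of Celery): call `enqueue` -- e.g. a
    # lambda wrapping sub_task.apply_async -- and swallow any broker or
    # connection error so the HTTP response is never delayed by a failure.
    try:
        enqueue()
    except Exception:
        logger.exception("could not enqueue background task")
        if fallback is not None:
            fallback()  # e.g. write the log row synchronously instead
```

In the view you would then call something like `enqueue_safely(lambda: sub_task.apply_async(retry=False))`; `retry=False` is a real `apply_async` option that disables publish retries, so an unreachable broker raises immediately instead of retrying in the request cycle.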

If your broker is unavailable for more than a negligible amount of time in a year, then you've got a much bigger set of issues regarding your architecture and deployment than just response time.
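That said, if you still want publishing to fail fast rather than hang while the broker is briefly unreachable, Celery exposes settings for that. A sketch, assuming Celery 5.x setting names; the values are illustrative, not recommendations:

```python
# Illustrative Celery configuration: make a dead broker fail fast
# instead of blocking the request while the client retries.
from celery import Celery

app = Celery("proj")
app.conf.broker_connection_timeout = 2  # max seconds to wait for a broker connection
app.conf.task_publish_retry = False     # don't retry publishing; raise immediately
```

Combined with catching the exception at the call site, the view returns promptly whether or not the publish succeeded.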


Thanks @KenWhitesell