I built my application using Celery to run periodic scripts and asynchronous functions. I would like some tips on deploying Celery on GCP. I am using Cloud Run to host my application.
My high-level understanding is that Cloud Run requires a “web” process, so you have two options:
- Spawn Celery as a separate background process, then spin up a dummy web process (see the sketch below).
- Switch to Google Cloud Scheduler / Google Cloud Tasks
I think it’s typically not difficult to do the latter, and that is probably the “recommended” way to run background tasks on Cloud Run.
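If you do go the first route, here’s a minimal sketch of what the entrypoint could look like: start the Celery worker as a subprocess and serve a trivial health-check endpoint on the port Cloud Run injects. The module name `myapp` and the port fallback are placeholders, not anything Cloud Run prescribes:

```python
# entrypoint.py -- minimal sketch: run a Celery worker next to a "dummy"
# web process so Cloud Run sees something listening on $PORT.
import os
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

# Start the Celery worker in the background; "myapp" is a placeholder
# for your actual Celery application module.
worker = subprocess.Popen(["celery", "-A", "myapp", "worker", "--loglevel=info"])

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Report healthy only while the worker subprocess is still alive.
        self.send_response(200 if worker.poll() is None else 503)
        self.end_headers()

# Cloud Run tells the container which port to listen on via $PORT.
port = int(os.environ.get("PORT", "8080"))
HTTPServer(("", port), HealthHandler).serve_forever()
```

One caveat: with the default request-based CPU allocation, Cloud Run may throttle the container’s CPU between requests, so as far as I know this pattern really needs the “CPU always allocated” setting to keep the worker running reliably.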
+1 to Google Cloud Tasks. I’ve recently created a Django integration, involving the following steps:
- Enqueue tasks using the google-cloud-tasks official client library, optionally serializing the task arguments into the JSON body of the task request (see the enqueue sketch after this list).
- Add a view to process tasks as POST requests, receiving the task arguments in the JSON request body (see the view sketch below).
- Make sure to authorize the incoming requests (see “Authenticating service-to-service” in the Cloud Run documentation).
- Optionally, deploy the “worker” as a separate Cloud Run service so that it can be scaled and monitored separately from the public-facing web part of the Django application.
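For anyone trying the same approach, here’s a rough sketch of the enqueue side using the official client library (`pip install google-cloud-tasks`). The project, location, queue, URL, and service account values are placeholders you’d replace with your own:

```python
# enqueue.py -- sketch of enqueueing a task with the official client.
import json
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()

def enqueue_task(payload: dict) -> None:
    # Placeholder project/location/queue -- substitute your own.
    parent = client.queue_path("my-project", "us-central1", "my-queue")
    task = {
        "http_request": {
            "http_method": tasks_v2.HttpMethod.POST,
            # URL of the task-handler view on the worker service.
            "url": "https://worker-xyz.a.run.app/tasks/handle",
            "headers": {"Content-Type": "application/json"},
            # Task arguments serialized into the JSON body of the request.
            "body": json.dumps(payload).encode(),
            # Ask Cloud Tasks to attach an OIDC token so the worker can
            # require authenticated service-to-service requests.
            "oidc_token": {
                "service_account_email": "invoker@my-project.iam.gserviceaccount.com"
            },
        }
    }
    client.create_task(request={"parent": parent, "task": task})
```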
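And a sketch of the worker-side Django view. Verifying the OIDC token inside the view is optional if Cloud Run itself is configured to require authenticated invocations, but it doesn’t hurt as defense in depth; the audience URL is again a placeholder:

```python
# views.py -- sketch of the Django view that processes task requests.
import json

from django.http import HttpResponse, HttpResponseForbidden
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_POST
from google.auth import exceptions as google_auth_exceptions
from google.auth.transport import requests as google_requests
from google.oauth2 import id_token

@csrf_exempt  # Cloud Tasks requests carry no CSRF token
@require_POST
def handle_task(request):
    # Verify the OIDC token that Cloud Tasks attached to the request.
    auth = request.headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return HttpResponseForbidden()
    try:
        id_token.verify_oauth2_token(
            auth.removeprefix("Bearer "),
            google_requests.Request(),
            audience="https://worker-xyz.a.run.app/tasks/handle",
        )
    except (ValueError, google_auth_exceptions.GoogleAuthError):
        return HttpResponseForbidden()  # invalid or expired token

    args = json.loads(request.body)  # task arguments from the JSON body
    # ... do the actual work with `args` here ...
    return HttpResponse(status=200)  # a non-2xx response makes Cloud Tasks retry
```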
I ended up creating a separate VM to run Celery: I deployed my project there and started the Celery worker on it. I connected both my Cloud Run service and the VM to the same Redis and Postgres instances, and it’s working. I haven’t ventured into Google Cloud Tasks yet for lack of time and familiarity, but I will definitely look into them in the future. Thank you very much for your help and for the valuable tips!