Django and Celery. My little headache

G’day All,

I’ve been running a bit of a dirty hack to have celery running in my Django project’s docker container.

I’m currently playing with celerybeat and setting up its script in /etc/init.d/celerybeat and its configuration file in /etc/default/celerybeat but it is giving me a wee headache.

I’ve two questions; perhaps people can answer or share experiences on one or both:

  1. If you’re running Celery and Django today, how are you running and deploying it?
  2. If you’re running Celery with init.d, would you happen to know why I’m getting the exception below, even though my CELERYBEAT_CHDIR is correct, as is the DJANGO_SETTINGS_MODULE in my Celery configuration file? The rather deceptive error message is

django.core.exceptions.ImproperlyConfigured: The SECRET_KEY setting must not be empty.

which tells me that Celery can’t find my Django settings. I could of course be wrong. It has happened before.



Current usage - we use supervisord to run uwsgi (Django), Celery, and CeleryBeat. The new system we’re working on runs each of these in separate docker containers.

The only time I’ve ever seen that error message is when Celery can’t find the Django settings. For us, it has usually been one of two causes: either not setting the environment variable in the Celery script, or running it as a user that doesn’t have permission to read the settings file.

Something I’ve done in the past to get a quick idea about things is to throw a print statement or two into the settings file. If the output isn’t generated, Celery isn’t finding the file. If it is generated, then the file has been found and the problem is elsewhere. (supervisord captures stdout and stderr, making this type of thing easy to spot.)
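Taking that a step further, a small stdlib-only helper (illustrative only; the function name and default value here are my own invention, not anything from Django or Celery) can confirm whether the settings module the process actually sees is importable:

```python
import importlib
import os


def check_settings_module(default="config.settings"):
    """Return the settings module path this process would use, or raise if it can't be imported."""
    # Same lookup Celery/Django effectively perform: env var first, then the default.
    mod = os.environ.get("DJANGO_SETTINGS_MODULE", default)
    try:
        importlib.import_module(mod)
    except ModuleNotFoundError as exc:
        raise RuntimeError(f"cannot import {mod!r}: {exc}") from exc
    return mod
```

Run it from the same user account and working directory the init.d script uses; if it raises, the SECRET_KEY error is almost certainly the settings lookup failing rather than anything in the settings themselves.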

Sorry I couldn’t be of more help.


We do not use Celery anymore; we use cloud-provider managed queues, Google Cloud Tasks in particular.

But on previous projects we used supervisord.
If I remember correctly, you have to make sure your path is correct.
We had a line like this in our supervisord config file:
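Something along these lines (the paths, project name, and user here are illustrative, not our actual config):

```ini
[program:celerybeat]
; full path to the celery binary inside the virtualenv, so PATH issues can't bite
command=/opt/project/venv/bin/celery -A config beat --loglevel=INFO
directory=/opt/project
; make sure the settings module is visible to the process supervisord spawns
environment=DJANGO_SETTINGS_MODULE="config.settings"
user=celery
autostart=true
autorestart=true
```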


Hi Ken, sww314,

Thank you both for your replies. As always, people here are often very generous with their time and answers.

So, my problem was a dumb one: simply a misconfiguration in my celerybeat configuration. I had DJANGO_SETTINGS_MODULE="config"
instead of
DJANGO_SETTINGS_MODULE="config.settings"
After jumping over that hurdle, I managed to fire up celery in my Django container without fuss using init.d.

After reading Ken’s post, I was curious how Django in one container could use Celery in another container. I admit I still do not know the full answer and will have to dig deeper, but I did find a cheat sheet for Dokku (which I use) showing how to use the Procfile to launch other containers from the same codebase.
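From what I can gather so far, the usual mechanism is that both containers simply point at the same message broker; the Django container only needs the broker URL, not the worker itself. A sketch, assuming Redis with a `redis` hostname resolving to the broker container:

```python
# settings.py (sketch): the Django container enqueues tasks to this broker,
# and the Celery worker container, configured with the same URL, consumes them.
CELERY_BROKER_URL = "redis://redis:6379/0"
CELERY_RESULT_BACKEND = "redis://redis:6379/1"
```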

The link which provided the answer is

In essence, if I add the following to my Procfile, Dokku will fire up Docker instances of Celery Beat and Celery workers. From what I understand, it is better practice, and something Ken alluded to, to have a single service running in each Docker container rather than trying to fit multiple services into one.
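Something like this (the `config` app module name is an assumption; substitute your own project):

```
worker: celery -A config worker --loglevel=INFO
beat: celery -A config beat --loglevel=INFO
```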

If you search Google or DuckDuckGo for help with Django + Celery + Docker, you’ll find many articles showing how it is done with Docker Compose, but none addressing the concept of deployment. The above link addresses deployment, but only with Dokku.

Should someone stumble across this post looking to fire up Celery with Django in a single container, then adding the following to the Dockerfile, along with the celerybeat file below, should help them on their way.

Dockerfile additions for celery beat

# init.d script and its /etc/default configuration for celerybeat
COPY misc/celery/celerybeat-init /etc/init.d/celerybeat
COPY misc/celery/celerybeat /etc/default/celerybeat
# the stock celerybeat init.d script also sources /etc/default/celeryd,
# so the same configuration is copied there as well
COPY misc/celery/celerybeat /etc/default/celeryd
RUN chmod 755 /etc/default/celerybeat
RUN chmod 755 /etc/init.d/celerybeat
# run celery beat as an unprivileged user
RUN adduser --disabled-password --gecos "" celery

Celerybeat config file

Be sure to replace my config.settings with your project name and settings file. The values below are placeholders in the style of the Celery daemonizing docs; adjust the paths for your own project.

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/bin/celery"

# App instance to use
# comment out this line if you don't use an app
CELERY_APP="config"

# Django settings module for the app
export DJANGO_SETTINGS_MODULE="config.settings"

# Where to chdir at start.
CELERYBEAT_CHDIR="/opt/project/"

# Extra arguments to celerybeat
CELERYBEAT_OPTS="--schedule=/var/run/celery/celerybeat-schedule"


I can tell you how we do it, YMMV.

Yes, we use Docker Compose. In addition to the Django, Celery, and Celery Beat instances, we have PostgreSQL, Redis, and nginx instances running as part of the same bundle. We have a “code” volume shared by all the instances containing the project, along with a couple of other volumes shared among multiple instances. Communication between the instances (Django/Celery/Beat) is through the common Redis instance.
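A compose file along those lines might look like this (a sketch with assumed image, command, and volume names, not our actual file):

```yaml
version: "3.8"

services:
  redis:
    image: redis:7        # common broker for Django, the worker, and beat
  web:
    build: .
    command: uwsgi --ini uwsgi.ini
    volumes: ["code:/app"]
    depends_on: [redis]
  worker:
    build: .
    command: celery -A config worker --loglevel=INFO
    volumes: ["code:/app"]
    depends_on: [redis]
  beat:
    build: .
    command: celery -A config beat --loglevel=INFO
    volumes: ["code:/app"]
    depends_on: [redis]

volumes:
  code:                   # the shared "code" volume containing the project
```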

We deploy and run on internal equipment, so deployment for us is just a matter of copying the containers from one host to another.


Interesting, Ken. I didn’t really think Docker Compose was for deployment, but it makes complete sense if you are copying your containers to whatever system you use to run them.

Can I ask which system/tools you use to deploy your containers? I was looking at Kubernetes, and whilst it has some really nice features, I feel it is a bit too time-consuming to build the entire system at this stage of my project. If it starts to take off, I’ll have more of an excuse to spend the time necessary to build it to an acceptable level of redundancy and performance for production.



We’re not a large system / site, these are internal systems I’m talking about.
We don’t do any CI/CD. Everything is manual. (I mean, we do have scripts that simplify the process, but everything is initiated and verified manually.)



As always Ken, thank you.