django apscheduler in docker

Hello everybody!

I’m trying to set up a cron job using django-apscheduler. I have it working properly through the command python manage.py runapscheduler, but the project runs in a Docker container, and I can’t start the apscheduler with my container. Does anyone know how to start the job in a container?
I’ve already tried this in my Dockerfile, but it’s not working:

FROM python:3.11-alpine

WORKDIR /app

COPY requirements.txt .

RUN pip install -r requirements.txt

COPY . .

ENV DEBUG=False

# Open port 3000 to access the application (adjust as needed)

EXPOSE 3000

# Command to start the Django server

CMD ["python", "manage.py", "makemigrations", "--settings=app.settings.production"]

CMD ["python", "manage.py", "migrate", "--settings=app.settings.production"]

CMD ["python", "manage.py", "runapscheduler", "--settings=app.settings.production"]

CMD ["python", "manage.py", "runserver", "0.0.0.0:3000", "--settings=app.settings.production"]
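Note that only the last CMD in a Dockerfile takes effect, so the three earlier commands above never run. One common way to chain them is a small entrypoint script; a sketch, assuming it is copied into /app alongside the rest of the project:

    #!/bin/sh
    # entrypoint.sh (sketch): run migrations first, then hand the
    # process over to the scheduler so it receives container signals
    set -e
    python manage.py migrate --settings=app.settings.production
    exec python manage.py runapscheduler --settings=app.settings.production

The Dockerfile would then end with a single CMD ["/app/entrypoint.sh"] (after a RUN chmod +x entrypoint.sh), so migrations always run before the scheduler starts.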

We don’t use django-apscheduler, we use celery, but I think the idea would be the same.

Set up a separate container to run your apscheduler process. (Our production deployments generally consist of at least 6 different containers. Celery and Celery Beat are two of them.)


All right, thanks KenWhitesell, I’ll do that.

The ideal way to do it is to have the CMD point to Django (e.g. gunicorn), and then use whatever container orchestration tool you prefer (docker-compose, Kubernetes, etc.) to change the command for the scheduler container to apscheduler.

services:
    django:
        image: mycontainer:latest
    scheduler:
        image: mycontainer:latest
        command: "python manage.py runapscheduler"

The biggest benefit of this is it means you only need to build, deploy and pull the container once, and it’s just the command which changes at runtime.
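To make the override concrete, the shared image’s default command could be the gunicorn line mentioned above; a sketch, assuming the Django project module is named app and gunicorn is listed in requirements.txt:

    # Dockerfile (sketch): default command runs Django under gunicorn;
    # the scheduler service in docker-compose replaces this command at runtime
    CMD ["gunicorn", "app.wsgi:application", "--bind", "0.0.0.0:8000"]

With that in place, docker compose up starts both services from the same image: the django service keeps the default CMD, while the scheduler service runs python manage.py runapscheduler instead.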

Thank you, friend! It worked just fine. As I have different containers running different services, this was the best option.