How do I set up a second Celery instance for a second, unrelated Django web app on the same server?

I successfully installed Celery for one Django web app simply by adding a tasks.py file
and starting it with “python3 -m celery -A tasks worker -l INFO”.

I want to start another Celery instance for another separate unrelated Django web app
on the same server….

I presume the second tasks.py won’t need any special adjustments. However,
do I need to adjust how I start the second Celery instance? I presume
I cannot simply run the same command again (“python3 -m celery -A tasks worker -l INFO”)
but must perhaps specify a new port or something different?

Thanks,

Chris

Well, you can set up as many instances as you wish (and the host machine is able to handle).

Yeah, you’re on the right track. If you think about it, you’re running this command from a working directory that is most likely inside a Django project directory (the one with manage.py). You’ll need to do the same, or something similar, in the other project you want to run.

You won’t need another port, since Celery is not a web server and doesn’t bind to any port. What you’ll probably need is a different broker (the one Celery connects to in order to receive and dispatch tasks) from the one you’re using on the first project.

Hope that helps.

This actually depends upon what broker you’re using.

Celery itself doesn’t listen on any ports, but the brokers do.

If you’re using Redis, you will need to specify a different database for each if the two Celery instances are going to share the same Redis instance. Or, if you want each Celery instance to use its own Redis instance, then those two Redis instances need to listen on different ports.
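For the second option, the port is chosen when the Redis instance is started; a rough sketch (the port number 6380 here is just an arbitrary example):

```shell
# Start a second Redis instance on its own port (6380 is arbitrary)
redis-server --port 6380
```

The second project’s broker URL would then point at that port, e.g. redis://localhost:6380/0.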

Similar requirements apply if you’re using RabbitMQ as the broker.

Ken

Thanks. I’m using RabbitMQ. When you say database, are you referring to the regular model databases of each Django app?

If every separate Django app has a different database, can I just run the same command “python3 -m celery -A tasks worker -l INFO” for each separate Django app and it’ll sort everything out?

Not in this context, no.

When you are using Redis, it uses the term “database” to identify different areas of storage. They’re numbered (by default) 0 - 15, with 0 being the default.

If you’re sharing Redis across multiple Django projects, you can isolate the activity of each of those projects by assigning a different Redis database to each project.

So for example, in the case of Celery, you might have the following setting in your project:

CELERY_BROKER_URL = 'redis://localhost:6379/0'
                This is the database number ^

Your second project might then have:

CELERY_BROKER_URL = 'redis://localhost:6379/1'

to use a different database.
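As a quick sanity check, the database number is just the trailing path segment of the broker URL; a minimal stdlib sketch (the helper name redis_db is mine):

```python
from urllib.parse import urlparse

def redis_db(broker_url):
    """Return the Redis database number encoded in a broker URL."""
    path = urlparse(broker_url).path  # e.g. '/1'
    return int(path.lstrip('/') or 0)

print(redis_db('redis://localhost:6379/0'))  # 0
print(redis_db('redis://localhost:6379/1'))  # 1
```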

Using different databases (as in PostgreSQL) is a separate topic, and not directly related to Celery. Each Django project should already be using a separate database.

I haven’t used RabbitMQ for more than 7 years now, so I’m not up-to-date with how it should be configured - but I’m sure there’s something in the configuration allowing you to segregate traffic among the projects.
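From what I remember, the usual mechanism is virtual hosts; a rough sketch, assuming a vhost named app2 (the name is arbitrary):

```shell
# Create a dedicated virtual host for the second project (name is arbitrary)
rabbitmqctl add_vhost app2
rabbitmqctl set_permissions -p app2 guest ".*" ".*" ".*"
```

The second project’s settings.py would then point its broker at that vhost, e.g. amqp://guest:guest@localhost:5672/app2.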

Ken

So far this seems to be working… It appears you can start an unlimited
number of Celery instances for different Django apps all with the command

“python3 -m celery -A tasks worker -l INFO”

As long as you add the following in each settings.py with proper adjustments!?..

import kombu

CELERY_DEFAULT_QUEUE = "app_name"
CELERY_QUEUES = (
    kombu.Queue("app_name",
                kombu.Exchange("app_name"),
                routing_key="app_name"),
)
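If the two apps end up sharing one broker, you can also pin each worker to its own queue explicitly on the command line; a sketch, where app_name matches the queue name in the settings above:

```shell
# -Q restricts this worker to consuming only the named queue
python3 -m celery -A tasks worker -l INFO -Q app_name
```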