Celery does not register tasks

So I have followed these celery docs to prepare my Django project: First steps with Django — Celery 5.2.7 documentation

I have added these files to my project:

curated/landscape/celery.py:

from __future__ import absolute_import, unicode_literals
from celery import Celery
import os

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'landscape.settings')
app = Celery('landscape')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

@app.task(bind=True, ignore_result=True)
def debug_task(self):
    print(f'Request: {self.request!r}')

I have edited the curated/landscape/__init__.py:

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)

and added a task to my app here: curated/landscape/self_hosted/tasks.py

from celery import shared_task

# app = Celery('tasks', broker='pyamqp://guest@localhost:9001//')

@shared_task
def add(x, y):
    return x + y
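
As a sanity check (just a sketch; it assumes the shell process ends up on the same settings module the worker uses), the app's task registry can be inspected from a Django shell:

python manage.py shell

from landscape.celery import app
# Accessing app.tasks finalizes the app, which triggers autodiscovery;
# built-in tasks are named celery.*, everything else was discovered.
sorted(name for name in app.tasks if not name.startswith('celery.'))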

I have rabbitmq running, which also works, since I get the following output from my celery worker:

celery -A landscape worker -l INFO
[2022-06-30 14:48:36,670: WARNING/MainProcess] No hostname was supplied. Reverting to default 'localhost'
 
 -------------- celery@devbox v5.2.7 (dawn-chorus)
--- ***** ----- 
-- ******* ---- Linux-5.15.0-40-generic-x86_64-with-glibc2.35 2022-06-30 14:48:36
- *** --- * --- 
- ** ---------- [config]
- ** ---------- .> app:         landscape:0x7f0b3861f160
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 16 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
                

[tasks]


[2022-06-30 14:48:36,889: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2022-06-30 14:48:36,890: INFO/MainProcess] mingle: searching for neighbors
[2022-06-30 14:48:36,893: WARNING/MainProcess] No hostname was supplied. Reverting to default 'localhost'
[2022-06-30 14:48:37,900: INFO/MainProcess] mingle: all alone
[2022-06-30 14:48:37,912: INFO/MainProcess] celery@devbox ready.

You can see that no tasks are registered in celery, which is weird.
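
For what it's worth, the registered list can also be queried from a running worker with celery's built-in inspect command:

celery -A landscape inspect registered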

I tried to call the task manually:

python manage.py shell
from self_hosted.tasks import add
add.delay(4,4)

No hostname was supplied. Reverting to default 'localhost'
<AsyncResult: 610baac9-1c72-42d9-8ae0-8762684e325f>

but then I get the following output from the celery worker:

[2022-06-30 14:49:45,964: ERROR/MainProcess] Received unregistered task of type 'self_hosted.tasks.add'.
The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you're using relative imports?

Please see
http://docs.celeryq.org/en/latest/internals/protocol.html
for more information.

The full contents of the message body was:
'[[4, 4], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (81b)

The full contents of the message headers:
{'lang': 'py', 'task': 'self_hosted.tasks.add', 'id': '610baac9-1c72-42d9-8ae0-8762684e325f', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, None], 'root_id': '610baac9-1c72-42d9-8ae0-8762684e325f', 'parent_id': None, 'argsrepr': '(4, 4)', 'kwargsrepr': '{}', 'origin': 'gen1013307@devbox', 'ignore_result': False}

The delivery info for this task is:
{'consumer_tag': 'None4', 'delivery_tag': 1, 'redelivered': False, 'exchange': '', 'routing_key': 'celery'}
Traceback (most recent call last):
  File "/home/andrej/.pyenv/versions/3.10.5/envs/curated/lib/python3.10/site-packages/celery/worker/consumer/consumer.py", line 591, in on_task_received
    strategy = strategies[type_]
KeyError: 'self_hosted.tasks.add'

I think I followed the docs closely, but I am missing something.
If it is any help here is the repo with the current commit: https://github.com/ajfriesen/curated/tree/4a0f33233fdf1d4e493dacbe66569fad1275cbba

Maybe it is because of my settings module, but I am kind of lost now, have wasted enough time, and hope someone can push me in the right direction 🙂

I don’t see where you have added landscape to your INSTALLED_APPS setting.

Do I need to?
The landscape name is a leftover from a rename.
Before, everything was called landscape/landscape.

Now it’s curated/landscape.

I still need to move it to curated/curated.

Even with 'landscape' in the INSTALLED_APPS list, it does not work.

Generally speaking (there may be an exception or two), apps need to be in INSTALLED_APPS if Django, or in this case Celery, is going to import code from them.
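A minimal sketch of what that would look like (the exact contents of your INSTALLED_APPS are an assumption on my side; the relevant part is the self_hosted entry):

INSTALLED_APPS = [
    'django.contrib.contenttypes',
    'django.contrib.auth',
    'self_hosted',  # holds tasks.py, so app.autodiscover_tasks() can find it
]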

So you’re saying that with landscape in INSTALLED_APPS, you are still not seeing the debug_task in the list of tasks when you run celery?

Yes, here is the output with landscape in the list of INSTALLED_APPS:

celery -A landscape worker -l INFO
[2022-07-01 17:54:25,349: WARNING/MainProcess] No hostname was supplied. Reverting to default 'localhost'
 
 -------------- celery@starlite v5.2.7 (dawn-chorus)
--- ***** ----- 
-- ******* ---- Linux-5.15.0-40-generic-x86_64-with-glibc2.35 2022-07-01 17:54:25
- *** --- * --- 
- ** ---------- [config]
- ** ---------- .> app:         landscape:0x7fb51a2230a0
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
                

[tasks]


[2022-07-01 17:54:25,578: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2022-07-01 17:54:25,585: INFO/MainProcess] mingle: searching for neighbors
[2022-07-01 17:54:25,598: WARNING/MainProcess] No hostname was supplied. Reverting to default 'localhost'
[2022-07-01 17:54:26,624: INFO/MainProcess] mingle: all alone
[2022-07-01 17:54:26,657: INFO/MainProcess] celery@starlite ready.

That’s interesting, because I downloaded your repo, made a number of adjustments to your settings (primarily the database settings, but I also use redis as the broker rather than rabbitmq), and this is what happens for me …

(c40) curated tskww$ celery -A landscape worker -l INFO
/Users/tskww/git
 
 -------------- celery@Kendalls-MacBook-Air-2.local v5.2.7 (dawn-chorus)
--- ***** ----- 
-- ******* ---- macOS-11.6.7-x86_64-i386-64bit 2022-07-02 00:26:46
- *** --- * --- 
- ** ---------- [config]
- ** ---------- .> app:         landscape:0x1118aa8c0
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
                

[tasks]
  . self_hosted.tasks.add

[2022-07-02 00:26:46,572: INFO/MainProcess] Connected to redis://localhost:6379/0
[2022-07-02 00:26:46,581: INFO/MainProcess] mingle: searching for neighbors
[2022-07-02 00:26:47,600: INFO/MainProcess] mingle: all alone
[2022-07-02 00:26:47,616: WARNING/MainProcess] /Users/tskww/virtualenvs/c40/lib/python3.10/site-packages/celery/fixups/django.py:203: UserWarning: Using settings.DEBUG leads to a memory
            leak, never use this setting in production environments!
  warnings.warn('''Using settings.DEBUG leads to a memory

[2022-07-02 00:26:47,616: INFO/MainProcess] celery@Kendalls-MacBook-Air-2.local ready.

Interesting 🤔

I did not set up any authentication for rabbitmq; maybe that’s the error.
But the error message did not make me think of that.

I just picked rabbitmq because I had read somewhere that redis is not stable with celery. It seems that info was outdated.

I do not have a preference and therefore will try redis as well.
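
If I understand the namespace='CELERY' setup correctly, pointing at redis should only need something like this in the settings file (redis on its default port is an assumption):

CELERY_BROKER_URL = 'redis://localhost:6379/0'

The CELERY_ prefix means this gets mapped to celery's broker_url option by config_from_object.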

Still, I am curious what the problem is.
I guess the Django setup with celery is correct then, since the tasks were picked up when you tried with redis.

Would you mind sharing your changes to my repo with redis?

I think I have a problem with the settings in general, since I cannot get it to work with redis either by playing around.

A diff would be fine.

I think I found it.

I was preparing a diff when I noticed…

your wsgi.py file has:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'landscape.settings.prod')

your celery.py file has:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'landscape.settings')

which in your current structure is a package, but not a settings file. (Or at least, your __init__.py file doesn’t import a set of settings that would allow the package to be used as a settings module directly.)

If I only change the celery.py file to read:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'landscape.settings.prod')
then the task is registered.

I didn’t notice this last night, because what I had done was copy your settings/local.py file to settings.py to make my edits.
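
An alternative to editing celery.py would be to make 'landscape.settings' usable as a settings module itself by re-exporting one of the concrete settings files from the package's __init__.py; a sketch, assuming settings/local.py is the one you develop against:

# landscape/settings/__init__.py
from .local import *  # noqa: F401,F403 -- lets 'landscape.settings' be used directly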


Oh my that might be it.

I was wondering why I could change the port or anything for redis and rabbitmq in my settings and nothing would happen.

I might not have time for testing today, but will look into it tomorrow.

Thank you so far. Will come back here and report when I have it up and running.


Edit:

This was it.

Thank you very much for your time and help!