Celery stopped working and I can't find good documentation

I’m having a problem with Celery, which I use to send emails. The code was working, and from one moment to the next it stopped, and I can’t find out why.

Another thing I’ve noticed is that the Celery documentation is not very good, so here is my first question: is it advisable to continue with Celery, or is it better to start from scratch with something else?

This is what I get from Celery:

[2022-10-08 00:02:31,138: ERROR/MainProcess] Received unregistered task of type 'callserviceapp.tasks.send_user_mail'.
The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see http://bit.ly/gLye1c for more information.

The URL http://bit.ly/gLye1c that the message points to for more information is not working.

And this is the code:


# __init__.py
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from celery import app as celery_app

__all__ = ('celery_app',)


# celery.py
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'callservices.settings')
app = Celery('callservices')
app.config_from_object('django.conf:settings', namespace='CELERY')

def debug_task():
    print("hi all")


# tasks.py
from celery import shared_task
from django.core.mail import EmailMultiAlternatives, send_mail

def send_user_mail(randomNumber, email):
    subject = 'xxxxxxxx'
    body = "xxxxxx: " + str(randomNumber)

    send_mail(subject, body, 'xxxxxxx@pxxxsandbox.com', [email], fail_silently=False)
    return 1

I also added these lines in settings.py:

BROKER_URL = 'amqp://'

CELERY_TIMEZONE = 'America/Los_Angeles'

CELERY_IMPORTS = ("tasks",)

That’s your call - I don’t know enough about your runtime / deployment environment to comment.

What I can say is that we rely on Celery heavily in about 6 different projects and have yet to have any failure caused by Celery. (This includes one project that was handling, at peak, more than 12 million tasks/day.)

How are you running celery on your server?
Did anything happen that would cause the worker process to be restarted?
Have you tried running the worker process with a more detailed loglevel?
What are you using for amqp? RabbitMQ? If so, are you monitoring queue depth?
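For the loglevel question above, a typical invocation would look like the following (the `callservices` module name is assumed from the posted code; adjust to your project):

```shell
# Start the worker with verbose logging (shows task discovery at startup)
celery -A callservices worker --loglevel=DEBUG

# Ask running workers which tasks they have actually registered;
# 'callserviceapp.tasks.send_user_mail' should appear here if discovery worked
celery -A callservices inspect registered
```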

(Note: It seems odd to me that you’ve got the result_backend set to amqp - I don’t see that as one of the identified options for results. See Tasks — Celery 5.2.7 documentation and Configuration and defaults — Celery 5.2.7 documentation; you may also want to see Using the Django ORM/Cache as a result backend.)

Did you review the sample project as referenced in the docs to see how they show things being configured and done?
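For comparison, the “First steps with Django” example in the Celery docs wires things up roughly like this (module names here follow the docs’ generic `proj` layout, not the poster’s project):

```python
# proj/__init__.py
from .celery import app as celery_app   # note the relative import

__all__ = ('celery_app',)

# proj/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY')
# Without this call the worker never scans the installed Django apps
# for their tasks.py modules
app.autodiscover_tasks()

# someapp/tasks.py
from celery import shared_task

@shared_task
def add(x, y):
    return x + y
```

Comparing that layout against your own three files is a good way to spot where a task might be falling out of registration.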


I’m using RabbitMQ.

The only function that runs is debug_task(), which is defined in celery.py.

Could this problem be related to
CELERY_IMPORTS = ("tasks",)?