Memory leak when running Celery with Django using the gevent pool and multiple workers

I use Celery with RabbitMQ in Django. My tasks are I/O-bound, so I use the gevent pool. I want to run Celery tasks across multiple processes (because gevent runs in a single process), but memory keeps growing for no reason, even when no tasks are running. What is happening?
This is my Celery command:

celery multi start 10 -P gevent -c 500 -A my-app -l info -E --without-gossip --without-mingle --without-heartbeat --pidfile='my-path' --logfile='my-path'

My Django Celery config:

CELERY_IGNORE_RESULT = True
CELERY_WORKER_PREFETCH_MULTIPLIER = 100
CELERY_WORKER_MAX_TASKS_PER_CHILD = 400
CELERYD_TASK_SOFT_TIME_LIMIT = 60 * 60 * 12
CELERYD_TASK_TIME_LIMIT = 60 * 60 * 13
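
(For context, a minimal sketch of how the CELERY_-prefixed settings above would typically be wired in, assuming the standard Django integration; the module name my_app is a placeholder and must match whatever -A points at:)

import os

from celery import Celery

# Placeholder module path; adjust to the real project package.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_app.settings')

app = Celery('my_app')
# Picks up the CELERY_-prefixed settings shown above from Django settings.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()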

[Screenshot: 2022-10-16 19-30-56]

Hey there!
There can be many causes of this. Since not all settings are shown, the most obvious one I can think of is: is DEBUG set to True?
Celery warns about this at worker startup: running with DEBUG enabled causes a memory leak.
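
For context on why that leaks, a minimal sketch (this reflects Django's documented query logging, nothing specific to your setup): with DEBUG = True, Django records every executed SQL query on the connection, and a long-lived worker never clears that list.

# With settings.DEBUG = True, Django appends every SQL query to
# connection.queries. The web request cycle resets this list at the
# start of each request, but a Celery worker has no request cycle,
# so the list (and memory) grows without bound.
from django.conf import settings
from django.db import connection, reset_queries

if settings.DEBUG:
    print(len(connection.queries))  # keeps climbing as tasks run queries
    reset_queries()                 # clears the accumulated query log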

Thanks.
Which settings do you need to see? DEBUG mode is off.

Actually, I may not be the best one to answer this. There are so many things that can be involved in this issue, and I have never faced it personally. The statements below are shots in the dark that may point you toward more research on your problem.

Some thoughts:

  • I see that you're starting Celery using celery multi start. From the docs:

For production deployments you should be using init-scripts or a process supervision system (see Daemonization).

  • You’re starting the workers with the -E flag, which enables task events for monitoring. Are you using this feature with something like Flower? If you’re not doing any monitoring, it can probably be unset (see the sketch after this list). docs
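
Putting those two points together: under a process supervisor you would typically run each worker directly instead of through celery multi. A hedged sketch of one such invocation, reusing your flags minus -E (the node name worker1 is a placeholder, and the supervisor would launch one of these per desired process):

celery -A my-app worker -P gevent -c 500 -l info -n worker1@%h --without-gossip --without-mingle --without-heartbeat --pidfile='my-path' --logfile='my-path'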

Thanks leandrodesouzadev.
I found the problem.

Hi, what was the problem?