I'm using Celery with RabbitMQ in Django. My tasks are IO-bound, so I use the gevent pool. I want to run Celery tasks across multiple processes (because gevent runs in a single process), but memory keeps growing even when no tasks are running. What is happening?
This is my Celery command:
celery multi start 10 -P gevent -c 500 -A my-app -l info -E --without-gossip --without-mingle --without-heartbeat --pidfile='my-path' --logfile='my-path'
Hey there!
There can be many causes for this. Since you haven't shown all of your settings, the most obvious one I can think of is: is DEBUG set to True?
Celery warns about this on worker startup: running with DEBUG enabled causes a memory leak.
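To see why that leaks: when DEBUG is True, Django appends every executed SQL query to `connection.queries`, and that list is never cleared in a long-lived process such as a Celery worker. Here's a minimal stand-in (plain Python, not Django itself; the class and names are just for illustration) of that pattern:

```python
# Simulation of Django's DEBUG query log: a long-lived process that
# records every "query" into an ever-growing list.
class Connection:
    def __init__(self, debug):
        self.debug = debug
        self.queries = []  # Django keeps a list like this when DEBUG=True

    def execute(self, sql):
        if self.debug:
            # Each entry is retained forever in a long-running worker.
            self.queries.append({"sql": sql})

conn = Connection(debug=True)
for i in range(10_000):  # e.g. 10k tasks, one query each
    conn.execute(f"SELECT * FROM t WHERE id = {i}")

print(len(conn.queries))  # 10000 entries still held -> memory growth
```

In a web request Django clears this per request, but a worker process never ends the "request", so the list only grows. With `debug=False`, nothing is retained.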
Actually, I may not be the best person to answer this. There are so many things that can be involved in this issue, and I've never faced it personally. The statements below are shots in the dark that may point you toward more research on your problem.
Some thoughts:
I see that you’re starting celery using celery multi start. From the docs:
For production deployments you should be using init-scripts or a process supervision system (see Daemonization).
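As an example of such a supervision setup, a minimal systemd unit for a worker might look like this (the user, paths, and app name here are placeholders for your own environment, not values from your question):

```ini
[Unit]
Description=Celery worker
After=network.target

[Service]
Type=simple
User=celery
WorkingDirectory=/path/to/project
ExecStart=/path/to/venv/bin/celery -A my-app worker -P gevent -c 500 -l info
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

The point is that systemd (or supervisord, etc.) handles restarts and process lifecycle, which `celery multi` does not do for you.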
You're starting the workers with the -E flag, which enables task events for monitoring. Are you using this with something like Flower? If you're not using monitoring, this flag can probably be dropped. docs
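If you don't need the events, the same invocation without -E would be (all other flags copied from your command):

```shell
celery multi start 10 -P gevent -c 500 -A my-app -l info \
  --without-gossip --without-mingle --without-heartbeat \
  --pidfile='my-path' --logfile='my-path'
```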