Store Result of task-queue in Django Model

I want to store a task in a Django model.

I am still unsure whether I will use django-q, django-rq, or something else.

But if I understand it correctly, it is the same for all task queues: it is my job to store the task in the DB.

I would like to show some kind of progress bar so that the user can see which tasks are done and which are still open.

There should be a DB row for every task, and the row should have a column where I can see the state: waiting, success, failed. I don’t need the “running” state.

I could code this myself - no question. But somehow this feels like re-inventing the wheel.

I don’t care about performance at all: there will be 200 tasks once every month, and the system has enough time to handle them one by one.

Should I develop this myself, or is there already a library to store tasks in a Django model?

Exception handling is important. If a task fails, I want to have the stack trace in a database column.
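
If I end up coding it myself, I imagine it would be little more than a model like this (just a sketch, all names made up):

```python
from django.db import models


class BackgroundTask(models.Model):
    """One row per task; only a sketch of what I have in mind."""

    STATE_CHOICES = [
        ("waiting", "waiting"),
        ("success", "success"),
        ("failed", "failed"),
    ]

    name = models.CharField(max_length=200)
    state = models.CharField(max_length=10, choices=STATE_CHOICES, default="waiting")
    traceback = models.TextField(blank=True, default="")  # filled only when the task fails
    created = models.DateTimeField(auto_now_add=True)
    finished = models.DateTimeField(null=True, blank=True)
```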

PS: I guess I won’t use celery since it is too big.

django-q does this by default:

You can also build a template and display all the information nicely by accessing the models and handing them over to the view.


According to the docs, only successful jobs are in this table. Docs: Admin pages — Django Q 1.3.5 documentation

Yes, I opened the successful tasks table - there are also a failed tasks table and a scheduled tasks table.
You can access all three of them via the ORM.
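
For example, something roughly like this hands all three tables over to a template (the view and template names are just examples):

```python
from django.shortcuts import render

from django_q.models import Failure, Schedule, Success


def task_overview(request):
    # Hand the three django-q tables over to a (hypothetical) template.
    context = {
        "succeeded": Success.objects.all(),
        "failed": Failure.objects.all(),
        "scheduled": Schedule.objects.all(),
    }
    return render(request, "tasks/overview.html", context)
```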

I don’t understand the reluctance to use Celery. Can you quantify what you mean by it being too big?

It does provide facilities for everything you’re looking to do, including event handlers for custom handling of a variety of events, such as capturing the tracebacks from failed tasks.
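
For example, a handler hooked to Celery’s task_failure signal could store the traceback in your own table (FailedTask here is a made-up model, not something Celery ships):

```python
from celery.signals import task_failure


@task_failure.connect
def store_failure(sender=None, task_id=None, exception=None, einfo=None, **kwargs):
    # FailedTask is a hypothetical model with task_id / error / traceback columns.
    from myapp.models import FailedTask

    FailedTask.objects.create(
        task_id=task_id,
        error=repr(exception),
        traceback=str(einfo),  # einfo carries the formatted traceback of the failed task
    )
```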


Good idea to quantify this:

django-q · PyPI: tgz 58 kB
celery · PyPI: tgz 1.4 MB, plus
django-celery · PyPI: 86 kB

→ celery plus django-celery is roughly 25x bigger than django-q.

My application runs on a super cheap VPS (3 euros per month). So far it works fine.

Apples to oranges comparison. The celery source (which is what you’re quoting for the file size from the tar) includes all the documentation and supporting “stuff” (my not-so-technical term for it). You don’t need to install that to use it.

I would submit that most people are going to install the wheel, which is ~ 400K, whereas the django-q wheel is ~ 70K. (That’s what you’re downloading with a pip install.)

(Also, django-celery is no longer needed as an integration package. All the functionality provided by it is now included in the base package.)

Still larger, but I would also submit that given the different degrees of functionality provided, not tremendously so. (I can envision situations where 300K of storage can make a difference, but if you’re in that type of situation, I’d be wondering what you’re doing with Django in that environment to begin with.)

This is not to disparage django-q - it’s a fine package that works extremely well for a lot of people, and there are some good reasons to select django-q over celery. If it’s going to do everything you need it to do, great! I would never discourage anyone from selecting it as their queue-of-choice.

However, I also fail to see where package size - in the amounts we’re talking about here - would be a valid criterion for this decision under any reasonable set of circumstances. And I’m running Django with Celery on some pretty limited gear in the Raspberry Pi class of SBCs, so I’m familiar with working in limited environments.

I compared how long it takes to import celery and rq.

rq takes three times longer. Rule of thumb: don’t follow your gut feeling; measure instead.
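
If you want to reproduce the measurement, something like this works (each import runs in a fresh interpreter so nothing is cached from a previous import):

```python
import subprocess
import sys
import time


def import_time(module: str) -> float:
    # Time "import <module>" in a fresh interpreter; interpreter startup is included for both.
    start = time.perf_counter()
    subprocess.run([sys.executable, "-c", f"import {module}"], check=True)
    return time.perf_counter() - start


for mod in ("celery", "rq"):
    print(f"import {mod}: {import_time(mod):.3f}s")
```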