Celery – Limit of task_kwargs column


Does anyone know if there is a de facto limit to the length of string that can be stored in the task_kwargs column of the TaskResult table? My understanding is that the column is an nvarchar(max), which can hold up to 2 GB of data (roughly 1 billion characters).

We’re using the TaskResult table to store task calls that will need to be picked up later by a periodic beat task. These tasks originate as service calls. We’ve observed that, in cases where the passed JSON is very large, the string representation of the JSON that is stored in task_kwargs is “abbreviated” (as opposed to truncated).


Instead of task_kwargs being like this:

`{'key1': 'value1', <many key/value pairs>, 'keyN': 'valueN'}`

It looks like this:

`{'key1': 'value1', <many key/value pairs>, 'keyX': 'valueX', 'key...', ...}` (it literally contains the `...` before the closing curly brace)

Is this intentional, and do we need to re-think our approach?


That’s very much a database-specific question - you’d have to go check with your database vendor (which, given you mentioned nvarchar, is probably Microsoft?).

In addition, Celery isn’t Django, so you may get better answers from a Celery-based community :slight_smile:


If you’re storing large amounts of data, I’d suggest using Celery’s compression. There are a number of supported compression libraries for v5.
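As a rough illustration of why compression helps here, a stdlib-only sketch (the payload shape is made up; in Celery itself this is just a config setting such as `app.conf.task_compression = "gzip"`):

```python
import gzip
import json

# Hypothetical large, repetitive kwargs payload -- the kind that compresses well.
payload = {f"key{i}": "value" * 50 for i in range(1000)}
raw = json.dumps(payload).encode("utf-8")

# gzip is one of the compression schemes Celery supports out of the box.
compressed = gzip.compress(raw)
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```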

That said, I suspect you’re hitting an issue similar to the one here. If so, it means Celery is calling saferepr with a limit before storing the kwargs.
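You don’t need Celery installed to see the pattern: the stdlib’s reprlib does the same kind of abbreviation, eliding tail entries with an ellipsis (the dict below is just a stand-in for the large kwargs):

```python
import reprlib

r = reprlib.Repr()
r.maxdict = 4  # elide everything past the first few dict entries

big = {f"key{i}": f"value{i}" for i in range(100)}
abbreviated = r.repr(big)
print(abbreviated)  # ends with ", ...}" -- the same shape as the stored task_kwargs
```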


Thanks, Tim! saferepr is indeed being called with a limit of 1024. Since that value doesn’t appear to be configurable, we’re looking to refactor.
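For what it’s worth, one sketch of a refactor along those lines (names and store are hypothetical, not a Celery API): persist the full payload yourselves and pass only a short key in the task’s kwargs, so the 1024-character limit never matters. The in-memory dict below stands in for a real table or cache:

```python
import json
import uuid

# Stand-in for a durable store (a dedicated table, Redis, etc.).
PAYLOAD_STORE: dict[str, str] = {}

def stash_payload(payload: dict) -> str:
    """Persist the large payload; return a short key to pass as the task kwarg."""
    key = uuid.uuid4().hex
    PAYLOAD_STORE[key] = json.dumps(payload)
    return key

def fetch_payload(key: str) -> dict:
    """Called inside the task body to recover the full payload."""
    return json.loads(PAYLOAD_STORE.pop(key))

# The task kwargs now contain only a 32-character key, however big the payload is.
key = stash_payload({f"key{i}": f"value{i}" for i in range(10_000)})
restored = fetch_payload(key)
```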

Thanks again!