While trying to manipulate some data using the json_script template filter, I stumbled upon a situation that was especially unpleasant to debug: after parsing the generated JSON in JS and trying to compare Decimal values, I noticed they were encoded as strings, which caused "10" < "2" to be true. It took me a while to notice that they were strings; wrapping the values in float(...) before adding them to the encoded dictionary solved the issue.
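For context, here is a minimal sketch of the situation in plain Python, using json.dumps with default=str to mimic how DjangoJSONEncoder serializes Decimals (the keys are made up):

```python
import json
from decimal import Decimal

data = {"a": Decimal("10"), "b": Decimal("2")}

# Decimals end up as JSON strings, the same way DjangoJSONEncoder uses str():
encoded = json.dumps(data, default=str)
print(encoded)  # {"a": "10", "b": "2"}

# String comparison is lexicographic, so "10" sorts before "2" —
# this is what made the comparison in JS return the wrong result:
assert "10" < "2"

# The workaround: convert to float before encoding (at the cost of exactness):
encoded_floats = json.dumps({k: float(v) for k, v in data.items()})
print(encoded_floats)  # {"a": 10.0, "b": 2.0}
```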
Digging into Django’s code base, I found the following lines, which make me think this was done on purpose.
Does anyone know the original reasoning behind this behavior? Should I open a ticket for this to be changed in a future major version?
The decimal module provides support for fast correctly rounded decimal floating-point arithmetic. It offers several advantages over the float datatype: […]
Representing a Decimal as a float would cause it to lose precision, which is the reason that class/module exists.
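The precision loss is easy to demonstrate in plain Python:

```python
from decimal import Decimal

# Decimal arithmetic is exact in base 10:
assert Decimal("0.1") + Decimal("0.2") == Decimal("0.3")

# Binary floats cannot represent 0.1 exactly, so the same sum drifts:
assert 0.1 + 0.2 != 0.3
print(0.1 + 0.2)  # 0.30000000000000004

# Converting a Decimal to float rounds it to the nearest representable
# double; printing it back as a Decimal reveals the true binary value:
print(Decimal(float(Decimal("0.1"))))
```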
The only way to represent a Decimal in JSON without losing precision is to convert it to a string. It is up to you to convert it back to a Decimal, because it is not a native JSON type.
The reason that DjangoJSONEncoder exists is to support serialization of more field types, as Python’s built-in serializer will error for Decimals, dates, and other types common to Django.
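A simplified, illustrative sketch of that approach (the real DjangoJSONEncoder also handles times, durations, UUIDs, and more, but the Decimal-to-string and date-to-isoformat conversions are the core idea):

```python
import datetime
import json
from decimal import Decimal

class SimpleDjangoStyleEncoder(json.JSONEncoder):
    """Stripped-down illustration of what DjangoJSONEncoder does."""

    def default(self, o):
        if isinstance(o, Decimal):
            return str(o)  # encode as a string, so no precision is lost
        if isinstance(o, (datetime.datetime, datetime.date)):
            return o.isoformat()
        return super().default(o)  # everything else raises TypeError

print(json.dumps(
    {"price": Decimal("9.99"), "day": datetime.date(2024, 1, 2)},
    cls=SimpleDjangoStyleEncoder,
))  # {"price": "9.99", "day": "2024-01-02"}
```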
If you load an arbitrary Decimal value as a JavaScript number, it can lose data, as JS only supports floating point numbers, which have much less precision. It’s possible to load a decimal library in JS to retain precision. Otherwise, if you can live with the imprecision of floats, you can try JS numbers.
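Python floats are the same IEEE 754 doubles that JS numbers are, so the loss is easy to see without leaving Python: above 2**53, doubles can no longer represent every integer.

```python
from decimal import Decimal

# 2**53 is the last point at which doubles can represent every integer:
big = Decimal(2**53 + 1)  # 9007199254740993

# Converting to a double (i.e. what a JS number would hold) silently rounds:
assert float(big) == 9007199254740992.0
assert Decimal(float(big)) != big
```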