pydantic version: 1.5.1
pydantic compiled: True
install path:
python version: 3.8.2 (default, May 6 2020, 02:49:43) [Clang 4.0.1 (tags/RELEASE_401/final)]
platform: macOS-10.15.4-x86_64-i386-64bit
optional deps. installed: ['typing-extensions', 'email-validator']
from decimal import Decimal

from pydantic import BaseModel

class MyModel(BaseModel):
    foo: Decimal

m = MyModel(foo="31231239131093781902381093810298310983123.12000000")
m.json()  # {"foo": 3.1231239131093784e+40} is lossy
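The loss is inherent to the float round-trip, not specific to pydantic; a minimal stdlib-only sketch of the same effect:

```python
import json
from decimal import Decimal

d = Decimal("31231239131093781902381093810298310983123.12000000")

# Going through float (pydantic v1's default encoder) loses digits:
lossy = json.dumps({"foo": float(d)})
# Encoding as str keeps every digit:
exact = json.dumps({"foo": str(d)})

print(lossy)  # exponent notation, trailing digits gone
print(exact)
```

Round-tripping the string form recovers the original Decimal exactly; the float form does not.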
Replace the current Decimal: float JSON encoder [1] with Decimal: str.
[1] https://github.com/samuelcolvin/pydantic/blob/master/pydantic/json.py#L27
I think we used to have it like that, but changed it since the current solution is more user-friendly in most scenarios.
You can change this yourself on a model (and use a custom base model to extend the functionality to all your models):
from decimal import Decimal

from devtools import debug
from pydantic import BaseModel

class MyModel(BaseModel):
    foo: Decimal

    class Config:
        json_encoders = {Decimal: str}

m = MyModel(foo='31231239131093781902381093810298310983123.12000000')
debug(m.json())
"""
> test.py:14 <module>
    m.json(): '{"foo": "31231239131093781902381093810298310983123.12000000"}' (str) len=61
"""
If I may challenge that decision: the primary use case for Decimals is that you need to store and work with arbitrary-precision decimals. Serialising these as floats causes loss of information. If you don't care about storing all digits, then you might as well use float instead of Decimal.
In other words: if all you need is convenience, we already have float. If you need arbitrary precision, you need Decimal. It would be somewhat disappointing that users who need arbitrary precision also need a workaround to actually store those Decimals without loss of precision.
I agree with @lsorber here, the whole idea of using an uncommon datatype like Decimal is to get full precision like when dealing with monetary calculations. So having the default encoding be a JSON Number defeats that purpose.
Using Decimal is already a hassle, one that you even have to pass off to people consuming your API. I expect a convenient library like pydantic to not make my life harder here.
The solution suggested with the Config class is less elegant when using dataclass. I'm contemplating just putting ENCODERS_BY_TYPE[Decimal] = str somewhere in my code and consider this solved for all models and dataclasses.
Note: Django REST Framework's Serializers by default converts Decimal to str, and it's a default that I was glad to have, because back then I wasn't sure whether the JSON Number type would preserve the precision or not.
@samuelcolvin would you consider reopening this discussion, or otherwise providing some insight as to why the current behaviour is the desired behaviour?
As it stands, there is little value in using Decimal fields because they are encoded to float, in which case you might as well annotate those fields as float to begin with.
Additionally, the workaround is somewhat fragile in that you'd need to be diligent about including it in every model that has Decimal fields, which can be non-obvious when subclassing models with custom Configs for example.
As it stands, there is little value in using Decimal fields
Even if we accept your point, this is incorrect since many people are not using JSON.
The workaround is not fragile if you make consistent use of a custom base class.
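The base-class idea can be illustrated without pydantic at all; a hypothetical stdlib-only sketch of centralising the encoding policy in one place (all names here are invented for the example):

```python
import json
from decimal import Decimal

class SerializableBase:
    """Hypothetical base class that owns the JSON encoding policy."""
    json_encoders = {Decimal: str}  # one place to declare type overrides

    def to_json(self) -> str:
        def default(obj):
            for typ, encode in self.json_encoders.items():
                if isinstance(obj, typ):
                    return encode(obj)
            raise TypeError(f"{type(obj).__name__} is not JSON serializable")
        return json.dumps(vars(self), default=default)

class Invoice(SerializableBase):
    # Subclasses inherit the Decimal handling; no per-model config needed.
    def __init__(self, total: Decimal) -> None:
        self.total = total

print(Invoice(Decimal("19.99")).to_json())  # {"total": "19.99"}
```

Subclasses only ever declare fields; the encoding rule lives in exactly one place, which is what makes the pattern robust.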
I did it this way because I think floats are the most obvious analogue for decimals in JSON. In most situations this "just works".
All that said, I'll consider changing this in V2 if no one has any powerful counter arguments.