There are several cases where it would be useful to return a Python dictionary that has already been passed through the JSON encoders, so that it can easily be added to a REST response or similar.
from datetime import datetime
from pydantic import BaseModel
import json

class User(BaseModel):
    id: int
    signup_ts: datetime = datetime.now()

u1 = User(id=1)
u2 = User(id=2)

ret = {"users": [u1.dict(), u2.dict()]}
json.dumps(ret)  # Failure due to datetime object

# Potential feature
ret = {"users": [u1.json(as_dict=True), u2.json(as_dict=True)]}
json.dumps(ret)
I do understand that the primary use case for pydantic is a response that is an entire pydantic model, which recursively serializes itself to JSON correctly, but we have still found several cases where this would be useful. We're happy to add this feature ourselves.
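For reference, the workaround we currently use is essentially a round trip through .json() (the to_jsonable name below is just ours, not part of any library):

import json
from pydantic import BaseModel

def to_jsonable(model: BaseModel) -> dict:
    # Serialize with the model's JSON encoders, then parse back to a plain dict,
    # so datetimes etc. come back as JSON-compatible values (ISO strings, numbers, ...).
    return json.loads(model.json())

ret = {"users": [to_jsonable(u1), to_jsonable(u2)]}
json.dumps(ret)  # works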
There was a PR for this before: https://github.com/samuelcolvin/pydantic/pull/317
I think the main point of rejection was:
End users are likely to want to customise this kind of thing a lot
As I needed a jsonable_encoder for FastAPI, to receive Pydantic models and convert them to Python JSONable data automatically, I created a custom version (that I kept updating).
You might find it useful as a workaround, it's here: https://github.com/tiangolo/fastapi/blob/master/fastapi/encoders.py#L9
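Basic usage looks roughly like this (using the User models from the example above; it needs FastAPI installed):

import json
from fastapi.encoders import jsonable_encoder

# jsonable_encoder recurses into dicts/lists and converts Pydantic models
# into plain JSON-compatible Python data.
data = jsonable_encoder({"users": [u1, u2]})
json.dumps(data)  # succeeds: datetimes are already ISO strings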
Funny enough, I have ended up installing FastAPI in other non-FastAPI projects (Flask-based projects that I haven't migrated) just to use that jsonable_encoder.
I plan to also make it compatible with SQLAlchemy models and others (without depending on them being installed). It's already compatible with objects that can be converted with dict(obj) and with data from JSON-based DBs like Couchbase.
As the jsonable_encoder in FastAPI is a bit more generic (it supports other types too, not only Pydantic), I guess it wouldn't make sense to move it here as is.
But if @samuelcolvin thinks that's something that might now seem suitable for Pydantic, I'm happy to add it in a PR, or a slimmed down version of it.
Ah, I missed that PR, and thanks for the workaround. It seems much better than our current hack, the very simple json.loads(model.json()) round trip mentioned above. Isn't this customisation analogous to adding custom encoders to Config.json_encoders and patching as desired with model.json(encoder=...)?
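To illustrate what I mean by that customisation, here is a rough pydantic v1-style sketch (the Event model is just an example; the exact hooks may differ by version):

from datetime import datetime
import json
from pydantic import BaseModel

class Event(BaseModel):
    when: datetime = datetime(2019, 1, 1)

    class Config:
        # type -> encoder callable, consulted by .json()
        json_encoders = {datetime: lambda dt: dt.isoformat()}

e = Event()
e.json()              # '{"when": "2019-01-01T00:00:00"}'
json.loads(e.json())  # the "hack": a dict that json.dumps can already handle
e.json(encoder=str)   # one-off override of the encoder, as mentioned above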
My initial thought is that your implementation may be slow due to the chain of if isinstance checks, rather than an encoder[type(obj)](obj)-style lookup. It looks like json.dumps implements its type lookup in C, so a pure-Python encoder is unlikely to catch it without JIT/Cython.
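To be concrete, by an encoder[type(obj)](obj)-style implementation I mean roughly the following (the table below is illustrative, not pydantic's actual one):

from datetime import date, datetime
from decimal import Decimal
from uuid import UUID

# Illustrative dispatch table: a single dict lookup instead of a chain of isinstance checks.
ENCODERS_BY_TYPE = {
    datetime: lambda o: o.isoformat(),
    date: lambda o: o.isoformat(),
    UUID: str,
    Decimal: float,
}

def jsonable(obj):
    try:
        return ENCODERS_BY_TYPE[type(obj)](obj)
    except KeyError:
        raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

The obvious trade-off is that an exact type(obj) lookup misses subclasses, which is precisely what the isinstance chain handles.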
Our current implementation works well, and I do see the downsides, so I'm somewhat agnostic about adding this. Happy to help on the PR side if this is something that's desired.
Ugly though it is, I doubt anything will beat ujson.loads(model.json()) for speed.
Given the points above, we should leave it out of pydantic.
But since this is the second independent question about this, maybe we should add a section to the documentation?
replaced by #951