I've pored over the documentation and issues and can't seem to find an example of serializing members other than class attributes.
class Rectangle(BaseModel):
    width: int
    length: int

    @property
    def area(self) -> int:
        return self.width * self.length

r = Rectangle(width=10, length=5)
print(r.json())
# What I'd like to see:
{ "width": 10, "length": 5, "area": 50 }
I'd also think that properties without setters would default to readOnly on the schema side as well.
It did seem there was a way to get something close to what I'm after via ORM mode and masking a property on the ORM model, but I'd hate to need to create a class just to contain properties.
Going over BaseModel._calculate_keys it seems BaseModel will not yield what I'm after.
Still, I'm hoping that it's just been a long day and I overlooked something :)
I think the only way you could serialize properties now would be to override the serialization-related methods to manually inject the property values. I don't think that should be hard, but I'm not sure. It's probably worth documenting if it is straightforward.
In general, it would be tough to do this "automatically" because properties are "stored" very differently than the model data. And for many people it would be a breaking change.
I think it could make sense to add support for this via config, but only if it didn't add more checks to the serialization process (which I think it might have to).
I wonder if implementing something like the schema/field as a decorator would be a good approach?
class Rectangle(BaseModel):
    width: int
    length: int

    @property
    @field(title='Area')
    def area(self) -> int:
        return self.width * self.length
And on init, the internal fields dict could grab attributes that have the decorator and act accordingly.
I'd say the property above would be set to readOnly in the schema output.
Could this possibly be used on methods to support writeOnly?
I'd be happy to get a PR going with some guidance if this feature is desired by others (especially the property serialization, not so much the method).
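To make the idea concrete, here is a minimal, untested sketch of how such a decorator could cooperate with a dict() override. The names computed_field, ComputedFieldModel, and __computed_field__ are hypothetical, not pydantic API, and the readOnly/writeOnly schema side is not covered here:

from typing import Any, Dict, Optional

from pydantic import BaseModel


def computed_field(title: Optional[str] = None):
    """Mark the decorated getter so the model can find it during serialization."""
    def wrapper(func):
        func.__computed_field__ = {"title": title or func.__name__}
        return func
    return wrapper


class ComputedFieldModel(BaseModel):
    def dict(self, **kwargs) -> Dict[str, Any]:
        data = super().dict(**kwargs)
        # inject every property whose getter carries the marker
        for name in dir(type(self)):
            attr = getattr(type(self), name, None)
            if isinstance(attr, property) and hasattr(attr.fget, "__computed_field__"):
                data[name] = getattr(self, name)
        return data


class Rectangle(ComputedFieldModel):
    width: int
    length: int

    @property
    @computed_field(title="Area")
    def area(self) -> int:
        return self.width * self.length


print(Rectangle(width=10, length=5).dict())  # {'width': 10, 'length': 5, 'area': 50}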
Happy to review a PR, hard to know how clean/performant it would be until someone tries it.
I think this is a really good idea, and like your proposed api @bradodarb, but adding as few checks as possible for non-users of the feature would be critical because of how the serialization functions are called recursively on each item in lists/dicts.
I was also looking for serialization of @property fields.
We do this in some of our models by overriding the dict method and injecting the properties to the resulting dict. I think the @field decorator is a really good idea to solve this in the framework in a stable way.
Example of how this can be done by overriding dict():
@classmethod
def get_properties(cls):
    return [prop for prop in cls.__dict__ if isinstance(cls.__dict__[prop], property)]

def dict(
    self,
    *,
    include: Union['AbstractSetIntStr', 'DictIntStrAny'] = None,
    exclude: Union['AbstractSetIntStr', 'DictIntStrAny'] = None,
    by_alias: bool = False,
    skip_defaults: bool = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
) -> Dict[str, Any]:
    """Override the dict function to include our properties"""
    attribs = super().dict(
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        skip_defaults=skip_defaults,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
    )
    props = self.get_properties()
    # Include and exclude properties
    if include:
        props = [prop for prop in props if prop in include]
    if exclude:
        props = [prop for prop in props if prop not in exclude]
    # Update the attribute dict with the properties
    if props:
        attribs.update({prop: getattr(self, prop) for prop in props})
    return attribs
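As a usage sketch, here is a condensed version of the two methods above on an example model (the include/exclude handling is dropped for brevity):

from typing import Any, Dict

from pydantic import BaseModel


class Rectangle(BaseModel):
    width: int
    length: int

    @property
    def area(self) -> int:
        return self.width * self.length

    @classmethod
    def get_properties(cls):
        return [prop for prop in cls.__dict__ if isinstance(cls.__dict__[prop], property)]

    def dict(self, **kwargs) -> Dict[str, Any]:
        attribs = super().dict(**kwargs)
        attribs.update({prop: getattr(self, prop) for prop in self.get_properties()})
        return attribs


r = Rectangle(width=10, length=5)
print(r.dict())                    # {'width': 10, 'length': 5, 'area': 50}
print(r.dict(exclude={'length'}))  # {'width': 10, 'area': 50}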
Interesting feature, I was looking for it. Is there a PR for review/testing? Thank you
Not yet.
I think we still need to decide on exactly how it would work.
Questions:
- Name of the decorator: computed_field, model_property?
- What arguments should it take: title, in_schema=True, alias?
- Should the value be cached in __dict__? But that would mean it wasn't updated on setting an attribute. Maybe a cache=False argument to the decorator.

What if instead of decorators we use a default value? A bit similar to the field factory from dataclass.
Ex:
class Rectangle(BaseModel):
    width: int
    length: int
    area: int = property_value('area')

    @property
    def area(self) -> int:
        return self.width * self.length
We could even add a default validator for those properties that blocks setting the attribute value.
I think that's uglier than the decorator approach outlined above.
Regarding the cache question, I'd go for no caching and use cached_property (from 3.8) if need be. People on 3.7 would need to handle it manually, but it'd be pretty straightforward in future Python versions.
Regarding mypy, the easiest way to deal with it would probably be to leverage the @property or @cached_property decorators, at least to "trick" mypy, if not to make direct use of them.
One possible approach -- I have no idea if the following would work, but something like:
# actual return type may or may not be the `property` type
def pydantic_property(*args, **kwargs) -> Type[property]:
    ...
Hopefully this would mean mypy would treat @pydantic_property(...) as equivalent to @property.
Another version to consider would be:
F = TypeVar("F")

def pydantic_property(...) -> Callable[[F], F]:
    ...

@pydantic_property(...)
@property
def blah(...):
    ...
This would allow the use of whichever property decorator you wanted (property, functools.cached_property in 3.8, cached_property.cached_property in 3.7, etc.), and I think would be fully mypy compatible.
I personally think it would be best to support both cached and un-cached versions, and it would be great if there was a way to leverage existing decorators so we didn't have to maintain that behavior ourselves.
If we want to move forward with this, I'm happy to try to look into the above more carefully.
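To make that concrete, here is a minimal, untested sketch of the second variant. The names (pydantic_property, __pydantic_property_meta__, collect_pydantic_properties) and the metadata arguments are purely illustrative, not an agreed API; the point is only that a decorator typed as Callable[[F], F] returns the wrapped descriptor unchanged, so mypy keeps treating the attribute as an ordinary property:

from typing import Any, Callable, Dict, Optional, TypeVar

from pydantic import BaseModel

F = TypeVar("F")


def pydantic_property(title: Optional[str] = None, in_schema: bool = True) -> Callable[[F], F]:
    """Attach metadata to the decorated descriptor's getter and return it unchanged."""
    def decorator(prop: F) -> F:
        # works for property (.fget), functools.cached_property (.func) or a bare function
        target = getattr(prop, "fget", None) or getattr(prop, "func", None) or prop
        target.__pydantic_property_meta__ = {"title": title, "in_schema": in_schema}
        return prop
    return decorator


def collect_pydantic_properties(cls: type) -> Dict[str, Dict[str, Any]]:
    """Find class attributes whose getter carries the marker set by pydantic_property."""
    found = {}
    for name in dir(cls):
        attr = getattr(cls, name, None)
        target = getattr(attr, "fget", None) or getattr(attr, "func", None)
        meta = getattr(target, "__pydantic_property_meta__", None)
        if meta is not None:
            found[name] = meta
    return found


class Rectangle(BaseModel):
    width: int
    length: int

    @pydantic_property(title="Area")
    @property
    def area(self) -> int:
        return self.width * self.length


print(collect_pydantic_properties(Rectangle))  # {'area': {'title': 'Area', 'in_schema': True}}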
Based on @MartinWallgren's solution I currently use:
class PropertyBaseModel(BaseModel):
    """
    Workaround for serializing properties with pydantic until
    https://github.com/samuelcolvin/pydantic/issues/935
    is solved
    """
    @classmethod
    def get_properties(cls):
        return [
            prop for prop in dir(cls)
            if isinstance(getattr(cls, prop), property) and prop not in ("__values__", "fields")
        ]

    def dict(
        self,
        *,
        include: Union['AbstractSetIntStr', 'MappingIntStrAny'] = None,
        exclude: Union['AbstractSetIntStr', 'MappingIntStrAny'] = None,
        by_alias: bool = False,
        skip_defaults: bool = None,
        exclude_unset: bool = False,
        exclude_defaults: bool = False,
        exclude_none: bool = False,
    ) -> 'DictStrAny':
        attribs = super().dict(
            include=include,
            exclude=exclude,
            by_alias=by_alias,
            skip_defaults=skip_defaults,
            exclude_unset=exclude_unset,
            exclude_defaults=exclude_defaults,
            exclude_none=exclude_none,
        )
        props = self.get_properties()
        # Include and exclude properties
        if include:
            props = [prop for prop in props if prop in include]
        if exclude:
            props = [prop for prop in props if prop not in exclude]
        # Update the attribute dict with the properties
        if props:
            attribs.update({prop: getattr(self, prop) for prop in props})
        return attribs
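As a quick usage sketch of the workaround above (Rectangle here is just an illustrative model):

class Rectangle(PropertyBaseModel):
    width: int
    length: int

    @property
    def area(self) -> int:
        return self.width * self.length


r = Rectangle(width=10, length=5)
print(r.dict())  # {'width': 10, 'length': 5, 'area': 50}

Whether r.json() also picks up the property depends on whether your pydantic version implements json() on top of dict(); if it doesn't, json() needs a similar override.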
What is the current status of this feature request, any ETA?
Is the proposed way forward to use the solution from @ludwig-weiss until this request is resolved?
Hello,
I am relatively new to pydantic, but I am wondering whether it would be legitimate to extend @ludwig-weiss's method to also support @my_property.setter and @my_property.deleter. I tried it but failed, because I am not deep enough into how the built-in setter mechanism works in detail. I'd really like this feature, because it would also allow accessing private variables safely through properties. Or did I miss something here? Maybe pydantic is not meant to do all these things, but it would really help to ensure valid classes not only for data-only classes but also for classes implementing more logic. I like the parsing and validation pydantic brings, but for classes with more logic I still cannot find the best way to make use of it. Having an additional pydantic model for each class seems like too much boilerplate to me.
Please correct me if I am totally wrong here.
What's the current status of this issue? Any timeline for when it will get done?
I agree with most of what has been said, but one thing I dislike about the enforced decorator syntax is that you lose the great code locality you get from all fields being lined up directly underneath one another. Whenever I read other people's code and see them using dataclasses, I'm a happy hippo, because I know the data type will be easy to read and understand, with the types themselves constituting the documentation.
Now if setter properties get introduced for serialization, it automatically forces you to scan the entire class to find the props when you're looking to construct or work with this object. It also can't be copy-pasted around anymore without special care.
Now I'm totally aware that type checkers are probably going to detect such issues fairly quickly, but I'd just like to propose an alternative (but not mutually exclusive), more compact syntax based on anonymous functions just to have exhausted all the options.
class Rectangle(BaseModel):
    width: int
    length: int
    area: int = ComputedProperty(lambda s: s.width * s.length)
I'm not sure if something like the above would completely go against Pydantic's design philosophy, but with its inherent advantage of producing extremely compact model definitions out of the box, this would absolutely fall in that spectrum (especially in combination with something like the lambdas library once it supports getattr access).
If the goal is to be able to have private backing fields (which I suppose is the case for many uses of decorators), maybe something like this can work?
class Triangle(BaseModel):
    a: int
    b: int
    c: int = CachedProperty(lambda s: pow(s.a, 2) + pow(s.b, 2))
I guess my point is to question why we need the bloat of an additional decorator, drawing us further and further away from regular POPOs...
P.S.: FastAPI with its dependency injection "framework" has already proven that things that previously were unthinkable in Python can become absolutely mainstream, so I didn't want to categorically reject something like @jsoucheiron's comment without giving it some more thought.
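For what it's worth, ComputedProperty doesn't exist in pydantic today; as a rough illustration of the idea only, a plain descriptor already gives this syntax on ordinary classes (wiring it into pydantic's dict()/json() and schema generation would still require the kind of changes discussed above):

from typing import Any, Callable


class ComputedProperty:
    """Illustrative descriptor that computes its value from the instance via a callable."""

    def __init__(self, func: Callable[[Any], Any]):
        self.func = func

    def __get__(self, instance, owner=None):
        if instance is None:
            return self
        return self.func(instance)


class Rectangle:
    def __init__(self, width: int, length: int):
        self.width = width
        self.length = length

    area = ComputedProperty(lambda s: s.width * s.length)


print(Rectangle(10, 5).area)  # 50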
In addition, I was wondering whether setter properties would be included in from_orm and from_obj (by default) as well?
Any update on this?
Based on @MartinWallgren's solution I currently use: (quoting the PropertyBaseModel workaround posted above)
In this way, I can get (read) the property, but I can't set it because of https://github.com/samuelcolvin/pydantic/issues/1577. Do you have a clear workaround to set it?
Based on @MartinWallgren's solution I currently use: (again quoting the PropertyBaseModel workaround from above)
If the property itself is a BaseModel, this does not serialize it recursively. But that can easily be fixed by replacing the final lines above with:

props = [(prop, getattr(self, prop)) for prop in props]
if props:
    attribs.update({
        # serialize nested models with their own dict()
        name: value.dict() if isinstance(value, BaseModel) else value
        for name, value in props
    })
I'm using a more compact version:
class PropertyBaseModel(BaseModel):
    """
    Workaround for serializing properties with pydantic until
    https://github.com/samuelcolvin/pydantic/issues/935
    is solved
    """
    @classmethod
    def get_properties(cls):
        return [
            prop for prop in dir(cls)
            if isinstance(getattr(cls, prop), property) and prop not in ("__values__", "fields")
        ]

    def dict(self, *args, **kwargs) -> 'DictStrAny':
        self.__dict__.update({prop: getattr(self, prop) for prop in self.get_properties()})
        return super().dict(*args, **kwargs)
This version has the disadvantage that the property getter is called even if it is on the exclude list.
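A small, untested variation on the compact version avoids that by filtering out explicitly excluded properties before calling their getters:

def dict(self, *args, **kwargs) -> 'DictStrAny':
    exclude = kwargs.get('exclude') or set()
    # only call the getters of properties that will actually be serialized
    props = [p for p in self.get_properties() if p not in exclude]
    self.__dict__.update({prop: getattr(self, prop) for prop in props})
    return super().dict(*args, **kwargs)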
I actually would prefer the solution proposed by @JosXa:
class Rectangle(BaseModel):
    width: int
    length: int
    area: int = ComputedProperty(lambda s: s.width * s.length)
This should be fairly easy to implement in pydantic. Probably mostly needs a change in main._get_value (besides the introduction of ComputedProperty).
This fits my use case better; the property setter is not needed there.
@iedmrc I also have a working solution so that the property setters work:
from typing import TYPE_CHECKING, List, Union, Any

from pydantic import BaseModel

if TYPE_CHECKING:
    from pydantic.typing import AbstractSetIntStr, MappingIntStrAny, DictStrAny


class PropertyBaseModel(BaseModel):
    """
    Workaround for serializing properties with pydantic until
    https://github.com/samuelcolvin/pydantic/issues/935
    is solved
    """
    def __init__(self, **data):
        super().__init__(**data)
        for getter, setter in self.get_properties():
            if getter in data and setter:
                getattr(type(self), setter).fset(self, data[getter])

    @classmethod
    def get_properties(cls):
        attributes = {prop: getattr(cls, prop) for prop in dir(cls)}
        properties = {
            name: attribute
            for name, attribute in attributes.items()
            if isinstance(attribute, property) and name not in ("__values__", "fields")
        }
        # map each getter function to the name of a property that also defines a setter
        setters = {prop.fget: name for name, prop in properties.items() if prop.fset}
        # return (read-only property name, name of the matching settable property or None)
        return [(name, setters.get(prop.fget))
                for name, prop in properties.items()
                if prop.fget and not prop.fset]

    def dict(self, *args, **kwargs) -> 'DictStrAny':
        self.__dict__.update(
            {getter: getattr(self, getter) for getter, setter in self.get_properties()})
        return super().dict(*args, **kwargs)