Using pydantic to check types without building Models or dataclasses.
Four modes of usage:

1. A `typecheck` decorator:

```python
@typecheck
def foobar(x: int, y: float, z: List[Dict[str, Tuple[int, int]]]):
    ...
```

The typecheck decorator would check types when foobar is called; it could also take optional arguments to configure its behaviour. (Usage: at library boundaries, to guarantee correct usage. See the sketch after this list.)

2. A context manager which uses sys.settrace to check types on all function calls inside your code base. This won't be at all easy and will slow down code execution a lot, but would be a big win (I think?). It would need to work with existing trace functions (e.g. coverage) and allow them to keep working properly. The arguments passed would also need to be checked but not modified, so behaviour isn't changed. (Usage: see below.)

3. A pytest plugin which uses the context manager above to run type checks during testing.

4. Using runpy.run_path to run a Python program inside the context manager above, and therefore with type checks. (Probably most useful as a sanity check or for ad-hoc checks.)
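A minimal sketch of what mode 1 might look like, built on pydantic's existing `create_model`. The `typecheck` name and all details here are illustrative, not a settled design:

```python
import inspect
from functools import wraps
from typing import Any, Callable, Dict, List, Tuple

from pydantic import create_model


def typecheck(func: Callable) -> Callable:
    """Validate call arguments against the function's type hints using a
    dynamically created pydantic model. Illustrative only: no support for
    *args/**kwargs or positional-only parameters."""
    sig = inspect.signature(func)
    fields: Dict[str, Any] = {}
    for name, param in sig.parameters.items():
        annotation = Any if param.annotation is inspect.Parameter.empty else param.annotation
        default = ... if param.default is inspect.Parameter.empty else param.default
        fields[name] = (annotation, default)
    ArgsModel = create_model(f'{func.__name__.title()}Args', **fields)

    @wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()
        # Invalid arguments raise pydantic.ValidationError before func runs;
        # valid ones are coerced to the annotated types.
        validated = ArgsModel(**bound.arguments)
        return func(**{name: getattr(validated, name) for name in bound.arguments})

    return wrapper


@typecheck
def foobar(x: int, y: float, z: List[Dict[str, Tuple[int, int]]]):
    return x, y, z


foobar('1', 2, [{'a': (1, 2)}])    # coerced to (1, 2.0, [{'a': (1, 2)}])
foobar(x='not an int', y=1, z=[])  # raises ValidationError
```

The point is that most of the heavy lifting (coercion, nested generics, error reporting) already exists in pydantic; the decorator only has to map a signature onto a throwaway model.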
Is this something anyone would find useful?
The settrace/pytest extension approach might end up being infeasible, which would just leave the decorator.
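For completeness, a very rough sketch of what mode 2's context manager could look like, under heavy simplifying assumptions (module-level functions only, a fresh model built on every call, and no chaining with a pre-existing trace function such as coverage's, which is exactly the hard part):

```python
import sys
from contextlib import contextmanager
from typing import Any, Dict

from pydantic import create_model


def _check_call(frame):
    """Check a frame's arguments against the called function's annotations,
    without modifying them. Only finds functions by name in module globals."""
    func = frame.f_globals.get(frame.f_code.co_name)
    annotations = getattr(func, '__annotations__', None)
    if not annotations:
        return
    fields: Dict[str, Any] = {
        name: (tp, ...) for name, tp in annotations.items() if name != 'return'
    }
    # Building a model on every call is very slow; a real implementation would cache.
    Model = create_model('Args', **fields)
    Model(**{name: frame.f_locals[name] for name in fields if name in frame.f_locals})


@contextmanager
def typecheck_calls():
    """Run type checks on every (module-level) function call in the block."""
    previous = sys.gettrace()

    def tracer(frame, event, arg):
        if event == 'call':
            _check_call(frame)
        return None  # don't trace line events inside the call

    sys.settrace(tracer)
    try:
        yield
    finally:
        sys.settrace(previous)


def add(x: int, y: int) -> int:
    return x + y


with typecheck_calls():
    add(1, 2)      # passes
    add(1, 'two')  # raises pydantic.ValidationError before add's body runs
```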
At first thought, this seems outside the scope of pydantic. As an "outsider", I view pydantic more as a way of serializing/deserializing Python models than as generic type-checking code. A big upside of something like pydantic for us is that it is clean, lean code with no dependencies.
Thanks @dgasmith, I understand where you're coming from.
I doubt this would need any more dependencies, but it would mean more code. Maybe a lot more.
I originally conceived pydantic exactly as you think about it. The reason this occurred to me was the large amount of interest in parsing and checking Python objects; talking to people at meetups about pydantic, they often seem to want to check existing Python objects, not "external" data.
Perhaps the best thing would be to implement the decorator and leave any sys.settrace/pytest magic to another (perhaps dependent) package.
Thinking specifically about option 1, decorators:
Although I don't need this feature directly (at least right now), I can see that it's not that difficult to achieve; a lot of the needed code is already in place (mainly the Field class). The additional code required is actually not that much.
In fact, that's more or less what I'm doing in https://github.com/tiangolo/fastapi: I'm using pydantic's Field to annotate, serialize and validate parameters from functions (API route parameters), in addition to models. And it looks quite similar to that example. That's why I think (at least for that specific case) the required extra code is not that much.
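For illustration (a minimal sketch, not code taken from FastAPI's own documentation), a route like this relies on pydantic to coerce and validate the path and query parameters before the function body runs:

```python
from typing import Optional

from fastapi import FastAPI

app = FastAPI()


@app.get('/items/{item_id}')
def read_item(item_id: int, q: Optional[str] = None):
    # item_id arrives as a string in the URL; it is coerced to int, and a
    # 422 error response is returned automatically if that fails.
    return {'item_id': item_id, 'q': q}
```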
Here are some projects that can already be used to check types at runtime: https://github.com/agronholm/typeguard and https://github.com/RussBaz/enforce. pydantic, in my opinion, is all about data validation for cases where we need to validate, for example, an HTTP request body.
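For example, typeguard exposes a decorator along these lines (a brief usage sketch based on its documented `@typechecked` API):

```python
from typing import List

from typeguard import typechecked


@typechecked
def total(values: List[int]) -> int:
    return sum(values)


total([1, 2, 3])        # passes
total(['a', 'b', 'c'])  # rejected with a type-check error at call time
```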
I think that's fair.
Decorator usage would be useful and I think falls into the same broad category as pydantic's other applications.
So let's do the decorator case but not the others.
It would be great to contribute to one or more of the projects that already do run-time type checking rather than to create a new one inside pydantic.
One of the arguments seems to be that there is this super nice code that deals with types, which is currently only used for (de-)serialization, and that it would be easy to make it do validation as well.
If that is the case, it would be of great benefit to extract that code into a separate package, use it as a dependency in pydantic, and make the case to existing run-time type validator packages (such as the ones mentioned by @Gr1N) to adopt your package as a dependency.
We've been doing this with models, so the decorator would be really helpful.
@Victor-Savu I understand where you're coming from, but I'm afraid that doesn't quite work.
> One of the arguments seems to be that there is this super nice code that deals with types, which is currently only used for (de-)serialization, and that it would be easy to make it do validation as well.
Type checking, data parsing and validation are all one thing within pydantic, it's not possible to extract one without the others.
There wouldn't really be any point in extracting the logic from pydantic into another library:
- There's a blurred line between runtime type checking and data validation; in that sense pydantic is already a runtime type-checking library as well as a data-parsing library.
- Validators are a logical extension of what pydantic already does, and would make some interfaces much easier to write than they currently are.
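As a small illustration of why these can't be separated: parsing, coercion and validation all happen in the same step when a model is instantiated:

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):
    id: int
    name: str


user = User(id='123', name='Sam')  # '123' is parsed and coerced to the int 123

try:
    User(id='not a number', name='Sam')
except ValidationError as exc:
    print(exc)  # reports that id is not a valid integer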
@samuelcolvin I have tried something similar in an experimental repo. It should cover most use cases, including using pydantic types as function arguments. If you find it useful, I can submit a PR.
https://github.com/hankehly/pydantic-playground/blob/master/arguments.py
Hi @hankehly, thank you very much for the offer. I think on this occasion it would probably be best for me to work on this - it may take some significant changes to the pydantic code base and I'd be happier doing that myself. It will also avoid the situation where we waste both of our time with me commenting on every line of your PR asking you to change things around.
I'll try to work on this fairly soon; in doing so I might take some inspiration from your code above and, if so, will of course credit you.
I had a similar idea recently and have some work here (never finished, though): https://github.com/MrMRrobat/got_it
It has a slightly different concept: values may be changed, not just type-checked.
I'm not asking to make a PR, as it's meant as a separate package anyway, but maybe you'll find something useful in there.
I am at last working on this on #1179, feedback welcome.