Mypy: Ability to use Callable type alias when annotating functions

Created on 4 Jun 2016 · 18 comments · Source: python/mypy

If I have a type alias that is a Callable and a function that matches that type alias, it would be nice to annotate the function using just the type alias.

Example:

```python
Validator = Callable[[Any], Optional[str]]

def check_string(val):
    # type: Validator
    ...  # check if val is a string; return an error message if it isn't
```
Labels: needs discussion, topic-depends-on-pep-change

Most helpful comment

As long as we don't mess with the standard PEP 484 way to define functions, I think that this is a reasonable proposal. Perhaps someone can come up with a draft implementation as a PR? Then we could put it in mypy_extensions and write a PEP about it.

All 18 comments

But I ask myself, how would we do that in Python 3 using inline function annotations (PEP 3107)? I'm not sure this is a desirable feature.

In a decorator you may want to give an inner function (that is going to be returned by the decorator) a type that is a type variable (if mypy supported that, which it doesn't yet). That has the same problem with regards to inline function annotation syntax. (Or if we had variadic type variables for abstracting over arbitrary function argument lists (https://github.com/python/typing/issues/193) the same would apply to some extent.)

Would a decorator be suitable for this? Something along the lines of

```python
from typing import Any, Callable, Optional, coerce

Validator = Callable[[Any], Optional[str]]

@coerce(Validator)
def check_string(val):
    ...  # raise if ``val`` is invalid
    return val
```

`coerce(type)(function)` returns `function`. The decorator `@coerce(type)` indicates to the type checker that it should attempt to unify the type of the decorated function with `type`. Another example use case:

```python
from typing import Callable, NamedTuple, coerce
import math

Constraint = Callable[[float], float]

class Clip(Constraint, NamedTuple):
    lower: float
    upper: float

    def __call__(self, x: float) -> float:
        return max(self.lower, min(self.upper, x))

@coerce(Constraint)
def softplus(x: float) -> float:
    return math.log(1 + math.exp(x))

@coerce(Constraint)
def null(x: float) -> float:
    return x
```

Without `coerce`, there's no easy way for me to verify that `softplus` and `null` are `Constraint`s.
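Since `coerce` does not exist in `typing`, here is a minimal runtime sketch of the proposed decorator (the name and signature are hypothetical, taken from the proposal above): at runtime it is a no-op, and all the real work would be done by the type checker.

```python
from typing import Any, Callable, TypeVar

F = TypeVar("F", bound=Callable[..., Any])

def coerce(tp: Any) -> Callable[[F], F]:
    """Hypothetical decorator: a runtime no-op.

    A type checker implementing this proposal would unify the
    decorated function's inferred type with ``tp``.
    """
    def decorator(func: F) -> F:
        return func  # the function is returned unchanged at runtime
    return decorator
```

At runtime `@coerce(Constraint)` simply returns the function unchanged; only a checker aware of the extension would act on it.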

Again, it looks like this is a case where `@declared_type()` (PR #3291) would work as is, for example:

```python
Validator = Callable[[Any], Optional[str]]

@declared_type(Validator)
def check_string(val):
    # some complex logic here
    ...
```

see the discussion in https://github.com/python/mypy/issues/2087

I had this issue as well.

I wanted to use the type as an interface:
```python
from typing import Optional, Callable, Tuple, TypeVar

T = TypeVar("T")
parser = Callable[[str], Optional[Tuple[str, T]]]

# can't use parser[str] as the type for take_one
def take_one(s):
    return s[1:], s[1]

def match(to_match: str) -> parser[str]:
    # can't use parser[str] as the type for p
    def p(s: str) -> Optional[Tuple[str, str]]:
        if s.startswith(to_match):
            return s[len(to_match):], to_match
        else:
            return None
    return p
```

and have a function with the signature `def or(first: parser[T], second: parser[T]) -> parser[T]: ...`, even using `NewType` for `parser` (I haven't figured out if that's possible yet).
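For what it's worth, generic type aliases like `parser[T]` are already usable in function signatures under PEP 484; the remaining gap is only annotating a `def` itself with the alias. A sketch of such a combinator (renamed `alt` here because `or` is a reserved keyword; the name is illustrative):

```python
from typing import Callable, Optional, Tuple, TypeVar

T = TypeVar("T")
parser = Callable[[str], Optional[Tuple[str, T]]]

def alt(first: "parser[T]", second: "parser[T]") -> "parser[T]":
    # Try the first parser; fall back to the second when it fails (None).
    def p(s: str) -> Optional[Tuple[str, T]]:
        result = first(s)
        return result if result is not None else second(s)
    return p
```

Note the quoted annotations: the alias subscripted with a type variable is a valid annotation even though the inner `def p` still cannot be annotated with `parser[T]` directly.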

I also had this issue. I have a bunch of very short handlers that take a relatively large number of arguments; ideally the arguments would have been typed as a single Tuple, so I could alias that type to a variable. However, since tuple unpacking in function definitions is not supported in Python 3, we had to move away from that approach. This meant defining each parameter on its own line because the type comment got too long, leaving some handlers with function definitions longer than their bodies. Being able to alias the whole function type would solve that problem.

An alternative would be the ability to declare an argument-type alias, similar to how I was defining a single Tuple type before:

```python
HandlerArgType = ArgAlias[Arg1Type, Arg2Type, Arg3Type, Arg4Type, Arg5Type, Arg6Type]
HandlerReturnType = SomeTypeHere  # this already works

def handler_a(Arg1, Arg2, Arg3, Arg4, Arg5, Arg6):
    # type: (HandlerArgType) -> HandlerReturnType
    ...
```
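A workaround that type-checks today (a sketch only; the field names and the `NamedTuple` bundle are invented for illustration) is to pass a single named tuple and unpack it inside the body, which restores the "one alias for all arguments" property:

```python
from typing import NamedTuple

class HandlerArgs(NamedTuple):
    # stand-ins for the six argument types mentioned above
    host: str
    port: int
    retries: int
    timeout: float
    verbose: bool
    tag: str

def handler_a(args: HandlerArgs) -> str:
    # unpack inside the body instead of in the signature
    host, port, retries, timeout, verbose, tag = args
    return f"{host}:{port} ({tag})"
```

The trade-off is that every call site must construct a `HandlerArgs` instead of passing positional arguments directly.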

I did a cursory skim through the PEP but could not find this issue being addressed there. Could the resolution be noted here, since Google drops people on this page anyway?

(Sorry, what I posted was for a different issue, so I deleted it here. What PEP did you skim?)

@gvanrossum I went through PEP 3107.

What were you looking for? PEP 3107 doesn't mention Callable -- that was introduced in PEP 484 (which chronologically came after PEP 3107).

Thank you for pointing me to PEP 484 :) and sorry for the delayed response. I wanted to make sure that I read the PEP in full and did not reply in haste.

From what I understand of PEP 484, the suggestion in this issue is not supported. To be a little more explicit, here is an example of the use case:

```python
# in ./activations.py
import math
from typing import Callable

ActivationFunction = Callable[[float], float]

def sigmoid(x):
    # type: ActivationFunction
    return 1 / (1 + math.exp(-x))

def relu(x):
    # type: ActivationFunction
    return x if x > 0. else 0.
```

```python
# in ./loader.py
import activations
from activations import ActivationFunction

def load_activation(name: str) -> ActivationFunction:
    return getattr(activations, name)
```

Currently this can be achieved by:

```python
import abc
import math


class ActivationFunction(metaclass=abc.ABCMeta):
    """Base class for all activation functions.

    .. note::

        Please do not add any other methods or store member fields unless absolutely
        necessary. Keep this class extremely slim, moving logic into pure functions to
        aid testing.
    """

    @abc.abstractmethod
    def transform(self, x: float) -> float:
        raise NotImplementedError


class Sigmoid(ActivationFunction):

    def transform(self, x):
        return 1 / (1 + math.exp(-x))


class Relu(ActivationFunction):

    def transform(self, x):
        return x if x > 0. else 0.
```

Preferably, I would be able to use plain functions and have the typing system check the function signature and return type, instead of creating classes. From what I gathered from PEP 484, this is not a supported use case. Please do correct me if I am wrong.

Can you do this with variable annotations?

```python
ActivationFunction = Callable[[float], float]

sigmoid: ActivationFunction
def sigmoid(x):
    return 1 / (1 + math.exp(-x))
```

> the suggestion in this issue is not supported

That is correct. There is not even a finished design -- your proposal (use `# type: ActivationFunction` in the function body) does not mesh well with the preferred Python 3 syntax.

@gvanrossum I see I see :)
@PaperclipBadger 👍
I do really like that syntax 💯 It does not work currently, though.

```python
# ./activation.py
from typing import Callable

import math


ActivationFunction = Callable[[float], float]


sigmoid: ActivationFunction
def sigmoid(x):
    return 1 / (1 + math.exp(-x))


relu: ActivationFunction
def relu(x):
    return x if x > 0. else 0.
```

```
$ mypy activation.py
activation.py:10: error: Name 'sigmoid' already defined on line 9
activation.py:15: error: Name 'relu' already defined on line 14
```
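A pattern that does type-check with current mypy (a workaround sketch, not the syntax proposed in this issue) is to define a private implementation and bind it to an annotated variable, which mypy verifies against the alias:

```python
import math
from typing import Callable

ActivationFunction = Callable[[float], float]

def _sigmoid(x: float) -> float:
    return 1 / (1 + math.exp(-x))

# mypy checks that _sigmoid is compatible with ActivationFunction here,
# and no "already defined" error is produced for the public name.
sigmoid: ActivationFunction = _sigmoid
```

The cost is a throwaway `_sigmoid` name for every public function, which is exactly the boilerplate this issue asks to remove.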

For a language like Python, where everything is an object, I would argue that the syntax proposed by @PaperclipBadger is the expected behavior, since

```python
i: int
i = 1  # works
```

so why should this not work for functions (since they are also objects)?


Separating the type information from the actual implementation is also a thing in functional languages like Haskell, where one would define a simple function that adds two integers like so:

```haskell
add :: Integer -> Integer -> Integer
add x y = x + y
```

From that point of view something like the following looks quite natural.

```python
add: Callable[[int, int], int]
def add(a, b):
    return a + b
```

In my opinion this is also a much nicer way of adding gradual typing to existing code bases compared to using stub files.

As long as we don't mess with the standard PEP 484 way to define functions, I think that this is a reasonable proposal. Perhaps someone can come up with a draft implementation as a PR? Then we could put it in mypy_extensions and write a PEP about it.

I found myself looking at a few related issues while attempting to limit the scope of input for the built-in Decimal type.

Not sure if this is idiomatic, but it seems to work alright.

```python
import decimal
import typing
from decimal import Decimal as _Decimal, Context

StrictDecimalValue = typing.TypeVar(
    'StrictDecimalValue', str, _Decimal, "_StrictDecimal"
)

class _StrictDecimal(_Decimal):
    def __new__(
            cls, value: StrictDecimalValue,
            context: typing.Optional[Context] = None
    ) -> _Decimal:
        if not isinstance(value, (str, _Decimal)):
            raise ValueError("StrictDecimal must receive a str or Decimal type for value")
        if context is None:
            context = Context(prec=16, rounding=decimal.ROUND_UP)
        return _Decimal(value, context).normalize()

StrictDecimal = typing.TypeVar('StrictDecimal', _Decimal, _StrictDecimal, "Decimal")
Decimal = _StrictDecimal

ZERO = Decimal('0.0')
ONE = Decimal(1.0)  # error
```

If this proposal could consider this type of use-case, or if you know of a better way I can do this today, I'd appreciate it.
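A lighter-weight way to get the same restriction today (a sketch under the assumptions above; `strict_decimal` is an invented name) is a plain factory function whose parameter annotation is the union a checker should enforce, avoiding the subclass and the type variables entirely:

```python
import decimal
from decimal import Context, Decimal
from typing import Optional, Union

def strict_decimal(value: Union[str, Decimal],
                   context: Optional[Context] = None) -> Decimal:
    """Build a normalized Decimal, rejecting non-str/Decimal input.

    Static checkers flag bad arguments via the annotation; the
    isinstance check catches them at runtime as well.
    """
    if not isinstance(value, (str, Decimal)):
        raise ValueError("strict_decimal requires a str or Decimal value")
    if context is None:
        context = Context(prec=16, rounding=decimal.ROUND_UP)
    return Decimal(value, context).normalize()

ZERO = strict_decimal('0.0')
# strict_decimal(1.0) is flagged by mypy and raises ValueError at runtime
```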

I don't think that this is important enough for a non-standard mypy extension. If somebody wants to move this issue forward, discussion can continue on typing-sig@ or the typing issue tracker.
