Keras: Lambda Layer function with arguments?

Created on 3 Mar 2016 · 17 Comments · Source: keras-team/keras

Is there a way to pass additional arguments to the function in Lambda Layer?

def get_Y(X):
    return X[:, :options.xmaxlen, :]  # get first xmaxlen elem from time dim

...
model.add_node(Lambda(get_Y, output_shape=(L, k)), name='Y', input='dropout')

Please make sure that the boxes below are checked before you submit your issue. Thank you!

  • [x] Check that you are up-to-date with the master branch of Keras. You can update with:
    pip install git+git://github.com/fchollet/keras.git --upgrade --no-deps
  • [x] If running on Theano, check that you are up-to-date with the master branch of Theano. You can update with:
    pip install git+git://github.com/Theano/Theano.git --upgrade --no-deps
  • [x] Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short).

All 17 comments

Try something like this:

class XMaxLen:
    def __init__(self, maxlen):
        self.maxlen = maxlen
    def __call__(self, X):
        return X[:, :self.maxlen, :]
model.add_node(Lambda(XMaxLen(10), ...), ...)

I get the error:

Traceback (most recent call last):
  File "/Users/Shyam/code/java_code/algebra-classifier/nn_attention/amodel.py", line 324, in <module>
    model = build_model(options)
  File "/Users/Shyam/code/java_code/algebra-classifier/nn_attention/amodel.py", line 154, in build_model
    model.add_node(Lambda(XMaxLen(10), output_shape=(L, k)), name='Y', input='dropout')
  File "/Users/Shyam/anaconda/lib/python2.7/site-packages/keras/layers/core.py", line 1509, in __init__
    assert hasattr(function, 'func_code'), ('The Lambda layer "function"'
AssertionError: The Lambda layer "function" argument must be a Python function.

Ok, this does seem more complicated than I thought. I don't know why that function uses marshal to extract just the code rather than grab the function as a whole (or just pickle instead of marshal), and if it could be made to handle callable classes too. Sorry for the noise!

@fchollet @farizrahman4u please offer a solution/hack to get around this. I have to manually change this parameter everytime so far!

Well, as a hack/workaround, you should be able to use a global variable, right?

That's just problematic in more complex environments (say, if you have multiple models in the same script, etc.).

I cannot use a global variable because I have 3 models (of increasing complexity) that I interact with from the same script. Adding a global variable would mean lots of code duplication, etc.

Interestingly, I don't think a global variable will do the job. I tried:

options=get_params()

...
def get_Y(X):
    print(options.xmaxlen)
    return X[:, :110, :]  # get first xmaxlen elem from time dim

# for sanity check
def f():
    print(options.xmaxlen)

if __name__=='__main__':
    f() # works, which means options is global
    build_model(options) # give the error below!


  File "/Users/Shyam/anaconda/lib/python2.7/site-packages/keras/layers/core.py", line 575, in get_output
    X = self.layers[i].get_output(train)
  File "/Users/Shyam/anaconda/lib/python2.7/site-packages/keras/layers/core.py", line 1548, in get_output
    return func(X)
  File "/Users/Shyam/code/java_code/algebra-classifier/nn_attention/amodel.py", line 90, in get_Y
    print(options.xmaxlen)
NameError: global name 'options' is not defined


My understanding is that the function is compiled elsewhere (maybe keras' core.py) and that is where options should have been global. Is that right @fchollet?


OK, suppose you need a Lambda that does f(x) = a * x + b and you often change the values for a and b (or the user inputs values for a and b or whatever).

import keras.backend as K
from keras.layers import *
from keras.models import Sequential

# This is the only place you will have to change if you want change the values for a and b
a = 10
b = 20

params = {'a':a, 'b':b}

setattr(K, 'params', params)

def f(x):
    a = K.params['a']
    b = K.params['b']
    y = a * x + b
    return y

model = Sequential()
model.add(Dense(input_dim=10, output_dim=20))
model.add(Lambda(f))

model.compile(loss='mse', optimizer='sgd')

From this point I think you can implement it in a Graph too.

Ok, this does seem more complicated than I thought. I don't know why that function uses marshal to extract just the code rather than grab the function as a whole

I implemented marshalling so that the layer could be serialized. The idea of callable objects as lambda functions seems superfluous to me. Why create an object of class XMaxLen and then wrap it inside a Lambda layer? You could just make XMaxLen inherit from Layer and add it to the model.

This works!

Btw, what was the rationale behind making the API like this? I think it would be better to allow Lambda to accept params, but maybe I am missing something.

Follow up on the reply from @farizrahman4u,

I agree that a 'lambda' is a function in itself, but that should not preclude applying, say, a partial function (which would let me do the above in a lambda layer without K.params). I still feel Lambda should allow parametric behavior. But this works for now!
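For illustration, here is a minimal sketch of the partial-function idea with `functools.partial`, using numpy in place of backend tensors (the function name and shapes are hypothetical). Note that a partial object has no `func_code` attribute, so the Lambda layer discussed in this thread would reject it with the same AssertionError shown earlier:

```python
from functools import partial

import numpy as np

def get_Y(X, maxlen):
    # slice the first `maxlen` steps from the time dimension
    return X[:, :maxlen, :]

# bind maxlen up front; the result is a callable of one remaining argument
get_Y_fixed = partial(get_Y, maxlen=2)

X = np.arange(24).reshape(2, 3, 4)
Y = get_Y_fixed(X)  # shape (2, 2, 4)

# a partial is not a plain function: it has no func_code attribute,
# which is exactly what the marshal-based check asserts on
```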

Just a nitpick: in the mathematical lambda-calculus concept, a crucial part is that you can produce a lambda as the output of another lambda (which would be parametrized by the a, b), i.e. currying.
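That curried form can be sketched as a plain closure factory (the name `make_affine` is hypothetical, and numpy stands in for backend tensors). The caveat from earlier in the thread still applies: the marshal-based serialization keeps only the code object, so the captured closure values would be lost on save:

```python
import numpy as np

def make_affine(a, b):
    # the outer "lambda" is parametrized by a and b;
    # the inner function is what a Lambda layer would wrap
    def f(x):
        return a * x + b
    return f

f = make_affine(10, 20)
y = f(np.array([1.0, 2.0]))  # a and b are captured in the closure
```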

I deleted that comment because the arguments a and b could be simply considered as inputs to the lambda function. I will be adding a new feature to Lambda layer which will let you pass additional arguments to the lambda function.

@pasky @shyamupa Check this out #1911
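With the feature from #1911, extra arguments are supplied as a dict via the layer's `arguments` parameter and forwarded to the function as keyword arguments. A minimal sketch, with the Keras line shown as a comment so the snippet stands alone (check your Keras version for the exact signature):

```python
import numpy as np

def f(x, a=1, b=0):
    # x is the layer input; a and b arrive as extra keyword arguments
    return a * x + b

# with the post-#1911 Lambda layer this would be:
#   model.add(Lambda(f, arguments={'a': 10, 'b': 20}))
y = f(np.array([1.0, 2.0]), a=10, b=20)
```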

:+1: for the quick response on desired feature. Lambda layers just became more powerful.

@farizrahman4u will this also fix #1863?

Is there an elegant way to change the parameters that are being passed to the lambda function after the model was already compiled (e.g. change the a and b in the example above)?

@eyal-str, to update the parameters in your Lambda function, define the Lambda layer first and then set the arguments of that layer before calling it.

update_seq_layer = Lambda(update_seq)  # example Lambda layer wrapping an update_seq function
update_seq_layer.arguments = {'a': a, 'b': b}  # set the values of 'a' and 'b' used by the function
new_seq = update_seq_layer(x)  # call the layer to evaluate the Lambda function on input 'x'

You can set the arguments at any point before calling the layer to evaluate the Lambda function.

In my model, I call the Lambda function within a for loop and pass in the index of the loop to update my input sequence at that timepoint.
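A rough sketch of that loop pattern, with numpy standing in for the Keras layer (names and shapes are hypothetical); in the Keras version you would set `update_seq_layer.arguments = {'t': t}` before each call instead of passing the keyword directly:

```python
import numpy as np

def update_seq(x, t=0):
    # pick out timestep t from a (batch, time, features) array
    return x[:, t, :]

x = np.arange(24).reshape(2, 3, 4)
# one slice per timestep, driven by the loop index
steps = [update_seq(x, t=t) for t in range(x.shape[1])]
```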
