Keras: Custom Constraints in Keras layers

Created on 20 Oct 2017 · 6 Comments · Source: keras-team/keras

Hello Everyone,

I want to add custom constraints on the weights of a layer. For example, I want to impose sparsity constraints on the weights of a layer. Is there a way to write our own custom constraints that are applied while learning the weight parameters of a layer? I could not find much documentation on constraints in the Keras docs.

All 6 comments

I'm also looking to implement the same thing. Is there any way to do this?

Hi!

Look at how the built-in constraints are implemented:
https://github.com/keras-team/keras/blob/master/keras/constraints.py

I think you can also create a class inheriting from Constraint:

from keras.constraints import Constraint

class CustomConstraint(Constraint):
    def __init__(self, your, parameters, go, here):
        self.your = your
        self.parameters = parameters
        # ...

    def __call__(self, w):
        # Return the transformed weight tensor.
        new_w = write_your_custom_transformations_here(w)
        return new_w

And then use CustomConstraint in the same way as the built-in ones:

model.add(Dense(64, kernel_constraint=CustomConstraint(your, parameters, go, here)))

Hope this helps :)
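
For the sparsity constraint the original question asks about, a concrete subclass might look like the sketch below. This is a minimal illustration, not from the thread: the class name, the thresholding rule, and the default threshold value are all assumptions; it simply zeroes out any weight whose magnitude falls below a cutoff after each update.

from keras import backend as K
from keras.constraints import Constraint

class SparsityConstraint(Constraint):
    # Hypothetical example: force small weights to exactly zero.
    def __init__(self, threshold=0.01):
        self.threshold = threshold

    def __call__(self, w):
        # Build a 0/1 mask keeping only weights with |w| > threshold,
        # then multiply so the surviving entries pass through unchanged.
        mask = K.cast(K.greater(K.abs(w), self.threshold), K.floatx())
        return w * mask

    def get_config(self):
        # Lets Keras serialize the constraint along with the model.
        return {'threshold': self.threshold}

model.add(Dense(64, kernel_constraint=SparsityConstraint(threshold=0.01)))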

@kochkinaelena In a Bidirectional wrapper, can we use two separate custom kernel constraints: one for the forward layer and the other for the backward layer?

Hi,

I don't think you can easily plug two different kernel constraints into the Bidirectional wrapper, but you could use two RNN layers of your choice (one of which goes backwards) and then merge them in the way you want, e.g.:

  layer1 = LSTM(64, kernel_constraint=CustomConstraint1(), go_backwards=False)(layer0)
  layer2 = LSTM(64, kernel_constraint=CustomConstraint2(), go_backwards=True)(layer0)
  layer3 = keras.layers.add([layer1, layer2])
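
As an aside: newer releases of Keras (tf.keras) added a backward_layer argument to the Bidirectional wrapper, which makes two separate constraints possible without merging by hand. A minimal sketch, assuming a version where backward_layer is available and that CustomConstraint1/CustomConstraint2 are defined as above:

from tensorflow.keras.layers import LSTM, Bidirectional

forward = LSTM(64, kernel_constraint=CustomConstraint1(), return_sequences=True)
# The backward layer must set go_backwards=True and match the forward
# layer's signature (units, return_sequences, etc.).
backward = LSTM(64, kernel_constraint=CustomConstraint2(), return_sequences=True,
                go_backwards=True)
bidi = Bidirectional(forward, backward_layer=backward)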

This method doesn't work for the LSTM layer. The same CustomConstraint works when used for a Dense layer, but for LSTM it gives the error 'Unknown Constraint'. Is anybody facing similar problems?

Hey @prakharg24, did you figure this out? I'm facing this issue in TF2-Keras with a Dense Layer!
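
If the 'Unknown Constraint' error is raised while loading a saved model rather than while building it, the usual cause is deserialization: Keras cannot map the class name stored in the saved config back to your class unless you pass it in explicitly. A minimal sketch of that fix, assuming the error does come from load_model (the file name here is hypothetical):

from keras.models import load_model

# Register the custom class so Keras can deserialize the constraint.
model = load_model('my_model.h5',
                   custom_objects={'CustomConstraint': CustomConstraint})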
