Keras: Embedding set_weights

Created on 17 Dec 2016 · 17 Comments · Source: keras-team/keras

Hello,
thanks for creating the great Keras!!
I have one issue:
I create weights with np.zeros
embedding_matrix = np.zeros((max_features , embedding_dims))
Then I use
e = Embedding( max_features, embedding_dims, input_length=maxlen, weights=[embedding_matrix], trainable=False, )
And I get the error:
ValueError: You called set_weights(weights) on layer "embedding_1" with a weight list of length 1, but the layer was expecting 0 weights. Provided weights: [array([[-0.01543641, 0.00745765, 0.0926055 , .....

If I create the Embedding without setting the weights and query the weights with w = e.get_weights(), w is an empty list. What am I doing wrong?
Thanks a lot,
Ernst


Most helpful comment

I have faced a similar problem and found the solution is to add the layer to an existing model first, and then invoke set_weights. So for your example, I propose the following:

model = Sequential()
# create other layers
...
e = Embedding(
                  max_features,
                  embedding_dims,
                  input_length=maxlen,
                  weights=[embedding_matrix],
                  trainable=False,
                  )
model.add(e)
e.set_weights([embedding_matrix])
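Note that set_weights expects a list of arrays matching what get_weights() would return, one array per weight variable. A quick numpy sanity check (sizes here are illustrative) before calling it can catch shape mismatches early:

```python
import numpy as np

max_features, embedding_dims = 20000, 100  # illustrative sizes

# The matrix passed to the layer must have shape (vocab_size, dims).
embedding_matrix = np.zeros((max_features, embedding_dims))

# set_weights takes a *list* of arrays, one per weight variable.
weights = [embedding_matrix]
assert len(weights) == 1
assert weights[0].shape == (max_features, embedding_dims)
```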

All 17 comments

I was having the same issue a few hours ago. A reinstall of Keras fixed the issue. See if that helps.

Nilabhra,
thanks a lot for your mail! I uninstalled Keras and installed a fresh git clone, but the problem still persists.

Kind regards
Ernst

Removed Keras and installed Keras with pip, and now it works. It looks like there might be a bug in the git master branch.

Yes, I did the same. I am sorry, I should have mentioned that I installed from pip the second time. This is a serious problem. @fchollet has it been patched in the master branch?

I had the same issue; I installed and reinstalled Keras several times, but the issue is still there! Can anybody who solved the issue give some suggestions? Thanks

Same issue. It seems to exist only in version 1.2, as I resolved it by installing the previous 1.1 version.

It seems that when trainable=False, both weights and non_trainable_weights are [], hence set_weights() raises an error. But I am not sure why weights or non_trainable_weights are [] when trainable=False; probably @fchollet can take a quick look?
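For intuition: Keras layers create their weight variables lazily in build(), so until a layer is built, get_weights() returns an empty list and set_weights() with one array fails with exactly the length-mismatch message above. A stripped-down toy sketch of that pattern (not the real Keras code):

```python
import numpy as np

class LazyLayer:
    """Toy layer mimicking Keras's lazy weight creation."""
    def __init__(self):
        self._weights = []  # no weight variables exist yet

    def build(self, shape):
        # Weight variables only come into existence here.
        self._weights = [np.zeros(shape)]

    def get_weights(self):
        return list(self._weights)

    def set_weights(self, weights):
        if len(weights) != len(self._weights):
            raise ValueError(
                "weight list of length %d, but the layer was expecting "
                "%d weights" % (len(weights), len(self._weights)))
        self._weights = list(weights)

layer = LazyLayer()
assert layer.get_weights() == []       # unbuilt: nothing to set
layer.build((10, 10))
layer.set_weights([np.ones((10, 10))])  # works once built
```

Calling set_weights on the unbuilt layer raises the same "length 1, but the layer was expecting 0 weights" style of error reported in this issue.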

Update: seems solved. Attached below is test code which should run fine.

import numpy as np
from keras.layers import Embedding, Input

idx = Input(shape=(1,), dtype='int32')
Emb = Embedding(10, 10, weights=[np.zeros((10, 10))], trainable=False)
emb = Emb(idx)

@chentingpc I gave it a quick try; the issue seems to still be there. I use version 1.2 and simply ran pip install --upgrade keras. By the way, which backend did you use? My setup is: TensorFlow version 0.12.0-rc1

@stone8oy You may need to install the most recent commit. You can do so by cloning the repo with git clone and running python setup.py install. It works for me on the TensorFlow backend.

Solved, thanks a lot @chentingpc

@chentingpc I had the same issue. I used Theano as the backend. My Keras version is 1.2.0 and my Theano version is 0.8.2.

@xiaoleihuang

Have you tried installing the __latest commit__?

You can install the latest commit with pip install git+https://github.com/fchollet/keras.git or the way provided by @chentingpc.

In my situation, it fixed this issue.

I think I have this same problem. As an example, if I do

import numpy as np
from keras.layers import Convolution2D, Input

x = Input(shape=(10,10,3))
y = Convolution2D(16, 5, 5, weights=[np.zeros((3,5,5,16))], border_mode='same', trainable=False, bias=False)(x)

with keras 1.2 I get the error

ValueError: You called set_weights(weights) on layer "convolution2d_1" with a weight list of length 1, but the layer was expecting 0 weights. Provided weights: [array([[[[ 0., 0., 0., ..., 0., 0., 0.],

Sounds like it is fixed in the master branch.

I have the same issue
I upgrade to master branch using
pip install git+https://github.com/fchollet/keras.git --upgrade
But I still get the error:

ValueError: You called set_weights(weights) on layer "predictions" with a weight list of length 2, but the layer was expecting 0 weights. Provided weights: [array([[-0.06838475, -0.01250052, -0.04821445, .....

old_config = old_predictions.get_config()
new_config = dict(old_config)
new_config['output_dim'] = new_config['output_dim'] + 1
new_predictions = layer_from_config({'class_name': type(old_predictions).__name__, 'config': new_config})

Note that for this layer, trainable is True.
The full config itself is:

{'W_constraint': None, 'b_constraint': None, 'name': 'predictions', 'activity_regularizer': None, 'trainable': True, 'init': 'glorot_uniform', 'bias': True, 'activation': 'softmax', 'input_dim': 4096, 'b_regularizer': None, 'W_regularizer': None, 'output_dim': 103}
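If the goal is to grow the predictions layer by one output unit, one hedged workaround (pure numpy; variable names are illustrative, not from the thread) is to pad the old kernel and bias to the new output_dim and pass the padded arrays to set_weights once the new layer has been built inside a model:

```python
import numpy as np

input_dim, old_output_dim = 4096, 103  # from the config above

# Pretend these came from old_predictions.get_weights().
old_W = np.random.randn(input_dim, old_output_dim)
old_b = np.zeros(old_output_dim)

# Pad with one zero-initialized column/entry for the new class.
new_W = np.concatenate([old_W, np.zeros((input_dim, 1))], axis=1)
new_b = np.concatenate([old_b, np.zeros(1)])

assert new_W.shape == (input_dim, old_output_dim + 1)
assert new_b.shape == (old_output_dim + 1,)
```

The existing class weights are preserved and only the appended column starts from zero.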

I solved this problem by updating Keras to version 1.2.2.

I have faced a similar problem and found the solution is to add the layer to an existing model first, and then invoke set_weights. So for your example, I propose the following:

model = Sequential()
# create other layers
...
e = Embedding(
                  max_features,
                  embedding_dims,
                  input_length=maxlen,
                  weights=[embedding_matrix],
                  trainable=False,
                  )
model.add(e)
e.set_weights([embedding_matrix])

Although this thread is old, I still feel I should point out how to fix this error.
The problem can be solved by first building the embedding layer and then assigning the weights using set_weights.

e.g.

import tensorflow

embedding_layer = tensorflow.keras.layers.Embedding(vocabulary_size, embedding_dims)
embedding_layer.build((None,))
embedding_layer.set_weights([your_embedding_matrix])
embedding_layer.trainable = False
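As a sanity check, the frozen embedding then behaves as a plain row lookup into the matrix you set. A numpy sketch of the output you should get back from the layer (the matrix and ids here are illustrative):

```python
import numpy as np

vocabulary_size, embedding_dims = 10, 4  # illustrative sizes
your_embedding_matrix = np.arange(
    vocabulary_size * embedding_dims, dtype=float
).reshape(vocabulary_size, embedding_dims)

token_ids = np.array([3, 0, 7])
# An Embedding layer with these weights computes a row lookup:
vectors = your_embedding_matrix[token_ids]

assert vectors.shape == (3, embedding_dims)
assert np.array_equal(vectors[0], your_embedding_matrix[3])
```

If the layer's output for some ids differs from this lookup, the weights were not actually set.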

