Keras: Conv1D after MaxPooling1D Negative Dimension Error

Created on 5 May 2018 · 1 comment · Source: keras-team/keras

Hello,

I am creating a sequential model with alternating Conv1D and MaxPooling1D layers.

The last line of this code raises an error:

from keras.models import Sequential
from keras.layers import Embedding, SpatialDropout1D, Conv1D, MaxPooling1D

model = Sequential()
model.add(Embedding(len(word_index) + 1, embedding_size,
                    weights=[embedding_matrix],
                    input_length=MAX_SEQUENCE_LENGTH,
                    trainable=True))
model.add(SpatialDropout1D(0.2))
model.add(Conv1D(128, 3, activation='relu'))
model.add(MaxPooling1D(3))
model.add(Conv1D(128, 3, activation='relu'))
model.add(MaxPooling1D(3))
model.add(Conv1D(128, 3, activation='relu'))

The error is:

File "/Users/david/PycharmProjects/deep-learning-with-keras/lib/python2.7/site-packages/tensorflow/python/framework/common_shapes.py", line 627, in call_cpp_shape_fn
require_shape_fn)
File "/Users/david/PycharmProjects/deep-learning-with-keras/lib/python2.7/site-packages/tensorflow/python/framework/common_shapes.py", line 691, in _call_cpp_shape_fn_impl
raise ValueError(err.message)
ValueError: Negative dimension size caused by subtracting 3 from 2 for 'conv1d_3/convolution/Conv2D' (op: 'Conv2D') with input shapes: [?,1,2,128], [1,3,128,128].

What is going wrong here? To my knowledge, this example should be consistent with this Keras blog post.
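
For context, here is a quick sketch of how the temporal dimension shrinks through these layers with the default padding='valid'. The value 30 is purely illustrative, standing in for MAX_SEQUENCE_LENGTH; any input length that leaves fewer than 3 timesteps before the last Conv1D produces this error.

length = 30  # illustrative stand-in for MAX_SEQUENCE_LENGTH

def conv1d_valid(n, kernel_size=3):
    # Conv1D with padding='valid' (the default) shortens the sequence
    return n - kernel_size + 1

def maxpool1d_valid(n, pool_size=3):
    # MaxPooling1D with padding='valid'; strides default to pool_size
    return (n - pool_size) // pool_size + 1

length = conv1d_valid(length)     # 30 -> 28
length = maxpool1d_valid(length)  # 28 -> 9
length = conv1d_valid(length)     # 9  -> 7
length = maxpool1d_valid(length)  # 7  -> 2
# The next Conv1D needs at least 3 timesteps, but only 2 remain,
# hence "Negative dimension size caused by subtracting 3 from 2".

The resulting length of 2 matches the [?,1,2,128] input shape reported in the traceback.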

Most helpful comment

How long is the sequence you're trying to input? The problem is probably that you are using the padding='valid' option (the default) in the convolutions while the sequence reaching that layer is shorter than 3 timesteps. If that is the case, this is not a bug in Keras: you simply cannot apply a convolution kernel of width 3 to data that is fewer than 3 timesteps long.

If this is indeed the case, you can fix it either by using longer input sequences or by using padding='same' in the convolutions, which zero-pads the input so the output keeps the same length after the convolution.
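
As a minimal sketch of the second option (same layer sizes and the same variables as the model in the question, only the Conv1D padding changed), assuming the Keras 2 API:

from keras.models import Sequential
from keras.layers import Embedding, SpatialDropout1D, Conv1D, MaxPooling1D

model = Sequential()
model.add(Embedding(len(word_index) + 1, embedding_size,
                    weights=[embedding_matrix],
                    input_length=MAX_SEQUENCE_LENGTH,
                    trainable=True))
model.add(SpatialDropout1D(0.2))
# padding='same' zero-pads each Conv1D input so its output keeps the same length
model.add(Conv1D(128, 3, activation='relu', padding='same'))
model.add(MaxPooling1D(3))
model.add(Conv1D(128, 3, activation='relu', padding='same'))
model.add(MaxPooling1D(3))
model.add(Conv1D(128, 3, activation='relu', padding='same'))

With padding='same' the convolutions no longer shrink the sequence, so only the pooling layers reduce its length; if the input is still very short, the MaxPooling1D layers can themselves run out of timesteps, in which case longer sequences (or smaller pool sizes) remain the way out.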

