Keras: Now that TimeDistributedMerge is removed, what is the replacement for it?

Created on 12 Apr 2016 · 8 comments · Source: keras-team/keras

I have tried the latest version of Keras on GitHub and noticed that there is a big difference from previous versions.
TimeDistributed has now been separated out as a wrapper.
TimeDistributedDense can be replaced by TimeDistributed(Dense()),
but what about TimeDistributedMerge?
Should we write a Lambda layer ourselves?

stale

Most helpful comment

@around1991 I think TimeDistributedMerge was removed because it can be implemented with a Lambda layer, like:

time_distributed_merge_layer = Lambda(function=lambda x: K.mean(x, axis=1),
                                      output_shape=lambda shape: (shape[0],) + shape[2:])

All 8 comments

Doesn't TimeDistributed(Merge()) work?

@carlthome
I was just wondering whether the Merge layer can take no arguments in 1.0.0.
In older versions, Merge could not be used as Merge() with nothing in the brackets.

@carlthome: the Merge layer merges several tensors into one, whereas TimeDistributedMerge just collapsed one axis of a single tensor.

The replacement is to use the functional API to build your models:
http://keras.io/getting-started/functional-api-guide/


Sorry, I don't get it: how would one do sum-pooling (say) on the output of an LSTM with return_sequences=True in the new API?

@around1991 I think TimeDistributedMerge was removed because it can be implemented with a Lambda layer, like:

from keras.layers import Lambda
from keras import backend as K

time_distributed_merge_layer = Lambda(function=lambda x: K.mean(x, axis=1),
                                      output_shape=lambda shape: (shape[0],) + shape[2:])
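As a minimal sketch of how this plugs into the functional API (assuming Keras 1.x and a made-up input shape of (10, 32)); swapping K.mean for K.sum gives the sum-pooling asked about above:

from keras.layers import Input, LSTM, Lambda
from keras.models import Model
from keras import backend as K

inputs = Input(shape=(10, 32))                    # (timesteps, features)
seq = LSTM(128, return_sequences=True)(inputs)    # shape: (batch, 10, 128)
# Sum over the time axis; output_shape drops the collapsed axis.
pooled = Lambda(lambda x: K.sum(x, axis=1),
                output_shape=lambda s: (s[0],) + s[2:])(seq)
model = Model(input=inputs, output=pooled)        # output shape: (batch, 128)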

@DingKe Hi

lstm_2 = LSTM(128, return_sequences=True)(dropout_1)
lstm_goal = LSTM(128, return_sequences=True)(masking_goal)
merge_direction = TimeDistributed(Merge([lstm_2, lstm_goal], mode='concat'))

If I want to merge two tensors (like in the code above), how do I write a Lambda layer? Since the Lambda layer takes only one argument, how can I concatenate two tensors?
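One possible answer, sketched under the assumption of Keras 1.x and the lstm_2 / lstm_goal tensors from the snippet above: for a plain concatenation you don't need a Lambda at all, because the functional API's lowercase merge helper joins a list of tensors along a chosen axis.

from keras.layers import merge

# Concatenate the two sequence tensors along the feature axis;
# the time dimension is left untouched, so no TimeDistributed wrapper is needed.
merge_direction = merge([lstm_2, lstm_goal], mode='concat', concat_axis=-1)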

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.

