Keras: How to set a layer name in the functional API and then find the layer by name?

Created on 20 May 2016 · 7 comments · Source: keras-team/keras

In the old Graph API, I could give each layer a name and then look that layer up by its name.

How can I do this with the functional API?

Most helpful comment

Look at this example from the functional API page in the docs, which I edited to add names to the dense layers. Just pass name="name" to your layer constructor. To retrieve the layer by name, use:
model.get_layer("name")

from keras.layers import Input, Embedding, LSTM, Dense, merge
from keras.models import Model

# headline input: meant to receive sequences of 100 integers, between 1 and 10000.
# note that we can name any layer by passing it a "name" argument.
main_input = Input(shape=(100,), dtype='int32', name='main_input')

# this embedding layer will encode the input sequence
# into a sequence of dense 512-dimensional vectors.
x = Embedding(output_dim=512, input_dim=10000, input_length=100)(main_input)

# an LSTM will transform the vector sequence into a single vector,
# containing information about the entire sequence
lstm_out = LSTM(32)(x)

auxiliary_loss = Dense(1, activation='sigmoid', name='aux_output')(lstm_out)

auxiliary_input = Input(shape=(5,), name='aux_input')
x = merge([lstm_out, auxiliary_input], mode='concat')

# we stack a deep fully-connected network on top
x = Dense(64, activation='relu', name="dense_one")(x) # names are added here
x = Dense(64, activation='relu', name="dense_two")(x)
x = Dense(64, activation='relu', name="dense_three")(x)

# and finally we add the main logistic regression layer
main_loss = Dense(1, activation='sigmoid', name='main_output')(x)
model = Model(input=[main_input, auxiliary_input], output=[main_loss, auxiliary_loss])
model.compile(optimizer='rmsprop', loss='binary_crossentropy',
              loss_weights=[1., 0.2])
model.get_layer("dense_one")
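The example above uses the Keras 1 API (merge, and the input/output arguments of Model). A minimal sketch of the same naming-and-lookup pattern with the modern tf.keras API, where merge is replaced by concatenate and Model takes inputs/outputs keyword arguments (layer names and shapes here are illustrative, not from the thread):

```python
from tensorflow.keras.layers import Input, Dense, concatenate
from tensorflow.keras.models import Model

# Two named inputs, merged with concatenate (the Keras 2+ replacement
# for the old merge(..., mode='concat')).
main_input = Input(shape=(16,), name='main_input')
aux_input = Input(shape=(4,), name='aux_input')

x = concatenate([main_input, aux_input])
x = Dense(8, activation='relu', name='dense_one')(x)
output = Dense(1, activation='sigmoid', name='main_output')(x)

# Keras 2+ uses the keyword arguments inputs/outputs, not input/output.
model = Model(inputs=[main_input, aux_input], outputs=output)

# Retrieve a layer by the name given at construction time.
layer = model.get_layer('dense_one')
print(layer.name)  # dense_one
```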

All 7 comments


Thank you.

Has the option to give a name to the layers in the Sequential model been removed?
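For what it's worth, layers in a Sequential model still accept a name argument in tf.keras. A minimal sketch (layer names and shapes are illustrative):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense

# Each layer in a Sequential model can be named via the `name` argument,
# and looked up afterwards with model.get_layer(), just as in the
# functional API.
model = Sequential([
    Input(shape=(4,)),
    Dense(8, activation='relu', name='hidden'),
    Dense(1, activation='sigmoid', name='out'),
])

print(model.get_layer('hidden').name)  # hidden
```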

I want to set the name of a Keras model built with the functional API. Please tell me how to do it.

Same problem here

I forgot to post the solution here for the functional API. Please follow the link for the solution to naming a model.
https://stackoverflow.com/questions/51810185/how-to-name-keras-model-based-on-functional-api-solved
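One way to name the model itself is the name argument of the Model constructor, a minimal sketch (the model and layer names here are hypothetical):

```python
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inp = Input(shape=(4,), name='inp')
out = Dense(1, name='out')(inp)

# The Model constructor accepts a `name` argument, exposed afterwards
# as model.name.
model = Model(inputs=inp, outputs=out, name='my_model')
print(model.name)  # my_model
```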

And if you want to name a layer, do something similar to this:

conv_1 = Convolution1D(filters=num_filters, kernel_size=filter_width, activation='tanh', name='Conv1D_{}_{}'.format(num_filters, filter_width))(x)

global_maxpool_1 = GlobalMaxPooling1D(name='GBMaxpooling_{}_{}'.format(num_filters, filter_width))(conv_1)

How do I set a layer name like "Mul", i.e. the name of the underlying tf op ...

