Keras: How to transfer learn a two-stream Net?

Created on 12 Jan 2018 · 7 comments · Source: keras-team/keras

I want to use a pretrained VGG to train a two-stream net:

```
image_a --- vgg19_a --- fc2_a ↘
                                fc3 --- softmax
image_b --- vgg19_b --- fc2_b ↗
```

But the pretrained-model functions do not seem to support this net?


All 7 comments

Have you tried using the Keras functional API with the pre-trained VGG19 model instances? If yes, can you share a snippet of code so we can understand what you are trying to do better?

These links may be useful; please have a look.

Also, please try asking your question on StackOverflow or join the Keras Slack channel.

@brainnoise Thanks. I have written the two-stream VGG net in TF, but using the functional model with pretrained models gives:

```
RuntimeError: ('The name "conv1" is used 2 times in the model. All layer names should be unique. Layer names: ', ['input_1', 'input_2', 'conv1', 'conv1', 'bn_conv1', 'bn_conv1', 'activation_1', 'activation_50', 'max_pooling2d_1', 'max_pooling2d_2', 'res2a_branch2a', 'res2a_branch2a', 'bn2a_branch2a', 'bn2a_branch2a', 'activation_2', 'activation_51', 'res2a_branch2b', 'res2a_branch2b', 'bn2a_branch2b', 'bn2a_branch2b', 'activation_3', 'activation_52', 'res2a_branch2c', 'res2a_branch1', 'res2a_branch2c', 'res2a_branch1', 'bn2a_branch2c', 'bn2a_branch1', 'bn2a_branch2c', 'bn2a_branch1', 'add_1', 'add_17', 'activation_4', 'activation_53', 'res2b_branch2a', 'res2b_branch2a', 'bn2b_branch2a', 'bn2b_branch2a', 'activation_5', 'activation_54', 'res2b_branch2b', 'res2b_branch2b', 'bn2b_branch2b', 'bn2b_branch2b', 'activation_6', 'activation_55', 'res2b_branch2c', 'res2b_branch2c', 'bn2b_branch2c', 'bn2b_branch2c', 'add_2', 'add_18', 'activation_7', 'activation_56', 'res2c_branch2a', 'res2c_branch2a', 'bn2c_branch2a', 'bn2c_branch2a', 'activation_8', 'activation_57', 'res2c_branch2b', 'res2c_branch2b', 'bn2c_branch2b', 'bn2c_branch2b', 'activation_9', 'activation_58', 'res2c_branch2c', 'res2c_branch2c', 'bn2c_branch2c', 'bn2c_branch2c', 'add_3', 'add_19', 'activation_10', 'activation_59', 'res3a_branch2a', 'res3a_branch2a', 'bn3a_branch2a', 'bn3a_branch2a', 'activation_11', 'activation_60', 'res3a_branch2b', 'res3a_branch2b', 'bn3a_branch2b', 'bn3a_branch2b', 'activation_12', 'activation_61', 'res3a_branch2c', 'res3a_branch1', 'res3a_branch2c', 'res3a_branch1', 'bn3a_branch2c', 'bn3a_branch1', 'bn3a_branch2c', 'bn3a_branch1', 'add_4', 'add_20', 'activation_13', 'activation_62', 'res3b_branch2a', 'res3b_branch2a', 'bn3b_branch2a', 'bn3b_branch2a', 'activation_14', 'activation_63', 'res3b_branch2b', 'res3b_branch2b', 'bn3b_branch2b', 'bn3b_branch2b', 'activation_15', 'activation_64', 'res3b_branch2c', 'res3b_branch2c', 'bn3b_branch2c', 'bn3b_branch2c', 'add_5', 'add_21', 'activation_16', 'activation_65', 'res3c_branch2a', 'res3c_branch2a', 'bn3c_branch2a', 'bn3c_branch2a', 'activation_17', 'activation_66', 'res3c_branch2b', 'res3c_branch2b', 'bn3c_branch2b', 'bn3c_branch2b', 'activation_18', 'activation_67', 'res3c_branch2c', 'res3c_branch2c', 'bn3c_branch2c', 'bn3c_branch2c', 'add_6', 'add_22', 'activation_19', 'activation_68', 'res3d_branch2a', 'res3d_branch2a', 'bn3d_branch2a', 'bn3d_branch2a', 'activation_20', 'activation_69', 'res3d_branch2b', 'res3d_branch2b', 'bn3d_branch2b', 'bn3d_branch2b', 'activation_21', 'activation_70', 'res3d_branch2c', 'res3d_branch2c', 'bn3d_branch2c', 'bn3d_branch2c', 'add_7', 'add_23', 'activation_22', 'activation_71', 'res4a_branch2a', 'res4a_branch2a', 'bn4a_branch2a', 'bn4a_branch2a', 'activation_23', 'activation_72', 'res4a_branch2b', 'res4a_branch2b', 'bn4a_branch2b', 'bn4a_branch2b', 'activation_24', 'activation_73', 'res4a_branch2c', 'res4a_branch1', 'res4a_branch2c', 'res4a_branch1', 'bn4a_branch2c', 'bn4a_branch1', 'bn4a_branch2c', 'bn4a_branch1', 'add_8', 'add_24', 'activation_25', 'activation_74', 'res4b_branch2a', 'res4b_branch2a', 'bn4b_branch2a', 'bn4b_branch2a', 'activation_26', 'activation_75', 'res4b_branch2b', 'res4b_branch2b', 'bn4b_branch2b', 'bn4b_branch2b', 'activation_27', 'activation_76', 'res4b_branch2c', 'res4b_branch2c', 'bn4b_branch2c', 'bn4b_branch2c', 'add_9', 'add_25', 'activation_28', 'activation_77', 'res4c_branch2a', 'res4c_branch2a', 'concatenate_1', 'dense_1'])
```

```
from keras.models import Model
from keras.layers import Dense, Concatenate
from keras.applications import ResNet50

base_model = ResNet50(weights='imagenet')
base_model2 = ResNet50(weights='imagenet')
x = Concatenate(axis=-1)([base_model.get_layer('res4c_branch2a').output,
                          base_model2.get_layer('res4c_branch2a').output])
x = Dense(8, activation='relu')(x)
model = Model(inputs=[base_model.input, base_model2.input], outputs=x)
```

It seems that pretrained models cannot be used directly to build a two-stream net. Maybe I should build a two-stream net, initialize it, and then load the variables from h5/npy.
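That workaround can be sketched in Keras itself rather than via h5/npy files — a minimal in-memory variant of the same idea, assuming a pretrained "donor" model whose weights are copied layer by layer into freshly initialised branches whose layers are then renamed to dodge the duplicate-name error:

```python
from keras.models import Model
from keras.layers import Dense, concatenate
from keras.applications import vgg16

# Pretrained donor whose weights will be copied into both streams.
donor = vgg16.VGG16(include_top=False, weights='imagenet',
                    input_shape=(64, 64, 3), pooling='avg')

def make_branch(suffix):
    """Fresh VGG16 branch: weights copied from the donor, layer names suffixed."""
    branch = vgg16.VGG16(include_top=False, weights=None,
                         input_shape=(64, 64, 3), pooling='avg')
    for src, dst in zip(donor.layers, branch.layers):
        dst.set_weights(src.get_weights())  # layer-by-layer weight transfer
        dst.name = dst.name + suffix        # avoid the duplicate-name RuntimeError
        # (on newer tf.keras, layer.name is read-only; use dst._name instead)
    return branch

branch_a = make_branch('_a')
branch_b = make_branch('_b')

merged = concatenate([branch_a.output, branch_b.output])
merged = Dense(8, activation='relu')(merged)
two_stream = Model([branch_a.input, branch_b.input], merged)
```

Note the two branches here have independent (not shared) weights; they merely start from the same pretrained values.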

@Windaway Did you solve the problem of how to build a two-stream net using a pretrained model? I met the same problem. Thank you for your reply.

@DianaLi96
I used tf.slim together with TensorLayer to solve the problem:
Build a graph with tf.slim, restore the params, and save the stacked params to npy with TensorLayer. Then build a two-stream graph and restore the params with the TensorLayer API.

You can change the layers' names in Keras (don't use `tensorflow.python.keras`).
Here is my sample code:

```
from keras.models import Model
from keras.layers import Dense, concatenate
from keras.applications import vgg16

num_classes = 10

# First pretrained VGG16 branch
model = vgg16.VGG16(include_top=False, weights='imagenet', input_tensor=None,
                    input_shape=(64, 64, 3), pooling='avg')
inp = model.input
out = model.output

# Second pretrained VGG16 branch
model2 = vgg16.VGG16(include_top=False, weights='imagenet', input_tensor=None,
                     input_shape=(64, 64, 3), pooling='avg')

# Rename the second branch's layers to avoid the duplicate-name RuntimeError
for layer in model2.layers:
    layer.name = layer.name + '_2'

inp2 = model2.input
out2 = model2.output

# Fuse the two streams
merged = concatenate([out, out2])
merged = Dense(1024, activation='relu')(merged)
merged = Dense(num_classes, activation='softmax')(merged)

model_fusion = Model([inp, inp2], merged)
model_fusion.summary()
```

@pvhduc97 I believe that in your solution the VGG models do not share weights. What if we want them to share weights?


If you want to share weights, how do you build the two-stream net?
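One way to get weight sharing with the functional API — a minimal sketch, not from the thread itself — is to instantiate a single VGG16 and call it on two inputs, so both streams reuse the very same layers and weights:

```python
from keras.models import Model
from keras.layers import Input, Dense, concatenate
from keras.applications import vgg16

num_classes = 10

# A single VGG16 instance; calling it on two inputs reuses the same layers,
# so the two streams share one set of pretrained weights.
shared_vgg = vgg16.VGG16(include_top=False, weights='imagenet',
                         input_shape=(64, 64, 3), pooling='avg')

inp_a = Input(shape=(64, 64, 3))
inp_b = Input(shape=(64, 64, 3))
feat_a = shared_vgg(inp_a)
feat_b = shared_vgg(inp_b)

merged = concatenate([feat_a, feat_b])
merged = Dense(1024, activation='relu')(merged)
out = Dense(num_classes, activation='softmax')(merged)

model_shared = Model([inp_a, inp_b], out)
model_shared.summary()
```

Because only one VGG16 instance exists, the duplicate-layer-name error never arises, and any gradient update affects both streams at once.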

