Keras: UserWarning: Discrepancy between trainable weights and collected trainable weights, did you set `model.trainable` without calling `model.compile` after ?

Created on 4 Mar 2018 · 1 comment · Source: keras-team/keras

Hi, I'm training a VGG16 network on my own dataset. The code I used is given below.

import os

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import cv2
import tensorflow as tf

import keras
from keras.models import Sequential, Model
from keras.layers import (Input, Dense, Convolution2D, MaxPooling2D, AveragePooling2D,
                          ZeroPadding2D, Dropout, Flatten, Reshape, Activation)
from keras.optimizers import SGD
from keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions
from keras.preprocessing import image
from keras.utils.np_utils import to_categorical
from keras.backend.tensorflow_backend import set_session

from scipy.misc import imread
from sklearn.preprocessing import LabelEncoder
from sklearn.metrics import log_loss
from sklearn import cross_validation

os.environ['CUDA_VISIBLE_DEVICES'] = '-1'  # hide all GPUs and run on the CPU

def train_test_separation():
    df = pd.read_csv('../resnet_50/dbs2017/data/stage1_labels.csv')
    print(df.head())

    # Binary labels (Not_Cancer / Cancer) -> one-hot vectors
    labeling = df['cancer'].as_matrix()
    names = ['Not_Cancer', 'Cancer']
    Y_category = keras.utils.to_categorical(labeling, num_classes=2)

    # One array per patient id: load the saved 224x224 data and average over the first axis
    x = np.array([np.mean(np.load('E:/224x224/%s.npy' % str(id)), axis=0)
                  for id in df['id'].tolist()])
    trn_dataset, test_dataset, trn_labels, test_labels = cross_validation.train_test_split(
        x, Y_category, random_state=2, test_size=0.20)

    # Input geometry and training settings (assumed: VGG16-sized input, batch size 50
    # and a single epoch, matching the training log below)
    img_rows, img_cols, channel, num_classes = 224, 224, 3, 2
    batch_size, nb_epoch = 50, 1

    # Load our model
    model = vgg16_model(img_rows, img_cols, channel, num_classes)
    model.summary()

    # Start fine-tuning, validating on the held-out split
    model.fit(trn_dataset, trn_labels, batch_size=batch_size, epochs=nb_epoch,
              shuffle=True, verbose=1, validation_data=(test_dataset, test_labels))

    # Make predictions
    predictions_valid = model.predict(test_dataset, batch_size=batch_size, verbose=1)

    # Cross-entropy loss score
    score = log_loss(test_labels, predictions_valid)
    print('Validation log loss:', score)

    return model

if __name__ == '__main__':
    #calc_features()
    train_test_separation()
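
The vgg16_model helper is not shown in the snippet above. For reference, here is a minimal sketch of how such a fine-tuning constructor is often written. This is a hedged reconstruction, not the original code: it assumes keras.applications.VGG16 with the ImageNet top, a new 2-way softmax layer named output, and arbitrary SGD hyperparameters. The important detail is that the trainable flags are changed before compile() is called.

def vgg16_model(img_rows, img_cols, channel, num_classes):
    # Sketch only: load VGG16 with its ImageNet weights, including the fc1/fc2 top.
    base_model = VGG16(weights='imagenet', include_top=True,
                       input_shape=(img_rows, img_cols, channel))

    # Replace the 1000-way ImageNet classifier with a num_classes-way softmax.
    x = base_model.layers[-2].output  # output of fc2
    predictions = Dense(num_classes, activation='softmax', name='output')(x)
    model = Model(inputs=base_model.input, outputs=predictions)

    # Freeze every layer except the new classifier *before* compiling.
    for layer in model.layers[:-1]:
        layer.trainable = False

    # compile() must be called after the trainable flags are changed; otherwise Keras
    # emits the "Discrepancy between trainable weights and collected trainable weights"
    # warning and the optimizer keeps updating the weight collection it gathered at the
    # earlier compile.
    model.compile(optimizer=SGD(lr=1e-3, momentum=0.9),
                  loss='categorical_crossentropy', metrics=['accuracy'])
    return model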

But during training I see something unusual: the model does not seem to be learning anything from my dataset.
I only replaced the softmax classifier layer and froze all the other layers. When I display the model summary after that, I receive the user warning shown in the title.

_________________________________________________________________
block5_conv1 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 14, 14, 512)       2359808   
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 7, 7, 512)         0         
_________________________________________________________________
flatten (Flatten)            (None, 25088)             0         
_________________________________________________________________
fc1 (Dense)                  (None, 4096)              102764544 
_________________________________________________________________
fc2 (Dense)                  (None, 4096)              16781312  
_________________________________________________________________
output (Dense)               (None, 2)                 8194      
=================================================================

Warning (from warnings module):
  File "C:\Research\Python_installation\lib\site-packages\keras\engine\training.py", line 973
    'Discrepancy between trainable weights and collected trainable'
UserWarning: Discrepancy between trainable weights and collected trainable weights, did you set `model.trainable` without calling `model.compile` after ?
Total params: 268,529,282
Trainable params: 134,268,738
Non-trainable params: 134,260,544
_________________________________________________________________
Train on 1107 samples, validate on 277 samples
Epoch 1/1

  50/1107 [>.............................] - ETA: 31:43 - loss: 0.7271 - acc: 0.6800
 100/1107 [=>............................] - ETA: 28:14 - loss: 5.3602 - acc: 0.5300
 150/1107 [===>..........................] - ETA: 26:16 - loss: 7.7642 - acc: 0.4267
 200/1107 [====>.........................] - ETA: 24:44 - loss: 8.8050 - acc: 0.3850
 250/1107 [=====>........................] - ETA: 23:21 - loss: 9.1716 - acc: 0.3760
 300/1107 [=======>......................] - ETA: 22:06 - loss: 9.8458 - acc: 0.3433
 350/1107 [========>.....................] - ETA: 20:37 - loss: 10.2353 - acc: 0.3257
 400/1107 [=========>....................] - ETA: 19:04 - loss: 10.4468 - acc: 0.3175
 450/1107 [===========>..................] - ETA: 17:38 - loss: 10.6829 - acc: 0.3067
 500/1107 [============>.................] - ETA: 16:12 - loss: 10.7429 - acc: 0.3060

If the freezing had worked correctly, only the new output layer should be trainable, so the summary should report

Trainable params: 8,194

with everything else non-trainable. Instead it reports 134,268,738 trainable params, together with the user warning.
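
For what it's worth, one way to double-check which weights Keras will actually update is to print the trainable flags and recount the parameters directly. This is a small sketch, assuming model is the model returned by vgg16_model. Note that model.trainable_weights reflects the current trainable flags, while the optimizer only picks up changes to those flags when compile() is called again, which is exactly what the warning is about:

import keras.backend as K

# Which layers are marked trainable?
for layer in model.layers:
    print(layer.name, layer.trainable)

# Parameter counts according to the current trainable flags
trainable_count = int(np.sum([K.count_params(w) for w in model.trainable_weights]))
non_trainable_count = int(np.sum([K.count_params(w) for w in model.non_trainable_weights]))
print('Trainable params:', trainable_count)          # expected 8,194 if only `output` is trainable
print('Non-trainable params:', non_trainable_count)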

When I display the predicted output for the validation dataset, it looks like this:

[[0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]
 [0. 1.]

This means the validation data is not being classified correctly at all; every sample is assigned to the second class.
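
One quick sanity check is to compare the distribution of predicted classes against the true class balance of the validation labels. A small sketch, using predictions_valid and test_labels from the script above:

pred_classes = np.argmax(predictions_valid, axis=1)
true_classes = np.argmax(test_labels, axis=1)

print('Predicted class counts:', np.bincount(pred_classes, minlength=2))
print('True class counts:', np.bincount(true_classes, minlength=2))

If every sample is predicted as class 1 regardless of the true balance, the classifier has collapsed onto a single class rather than learned anything useful.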

Can someone please help me identify the mistake I have made?

Most helpful comment

This happens if you are printing the progress bar in a small sized terminal. Make the terminal full screen or increase the size of the terminal window and the progress bar works fine.
