Keras: what does this mean? "model.add(Dense(32, input_dim=16)): now the model will take as input arrays of shape (*, 16), and output arrays of shape (*, 32)". Why is it (*, 32)? I thought the output arrays would have shape (*, 16)?

Created on 6 May 2016 · 10 comments · Source: keras-team/keras

Please make sure that the boxes below are checked before you submit your issue. Thank you!

  • [ ] Check that you are up-to-date with the master branch of Keras. You can update with:
    pip install git+git://github.com/fchollet/keras.git --upgrade --no-deps
  • [ ] If running on Theano, check that you are up-to-date with the master branch of Theano. You can update with:
    pip install git+git://github.com/Theano/Theano.git --upgrade --no-deps
  • [ ] Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short).

All 10 comments

In Keras, is the first layer the first hidden layer?

I assume you have a data table of shape (row_numbers, column_numbers),
so 16 is the number of columns — the layer must take that many features as input (and remember, Python counts from 0).
The 32 that comes right after "Dense(" is the number of units (outputs) of this layer; in a final layer, this would be the number of classes you want to categorize your data into.
Frankly speaking, I do not like the way Keras implements it either. It is confusing. Why don't they do something like:
model = Sequential()
model.add(Dense(input_layer_neurons=16, hidden_layer_neurons=32, kernel_initializer='normal', activation='relu'))
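To see where (*, 32) comes from, here is a minimal NumPy sketch (not Keras itself, just the math a Dense layer performs) of what Dense(32, input_dim=16) computes: a weight matrix of shape (16, 32) maps each 16-feature input row to 32 outputs, so a batch of shape (*, 16) comes out as (*, 32).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for Dense(32, input_dim=16):
# weights W have shape (input_dim, units) = (16, 32), bias b has shape (32,)
W = rng.normal(size=(16, 32))
b = np.zeros(32)

x = rng.normal(size=(5, 16))    # a batch of 5 samples, 16 features each -> shape (*, 16)
y = np.maximum(x @ W + b, 0)    # affine map followed by ReLU activation

print(x.shape)  # (5, 16)
print(y.shape)  # (5, 32) -- the layer emits 32 values per sample
```

The first Dense argument sets the number of columns of the weight matrix, which is why it determines the output shape rather than the input shape.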

It will be easier to understand with a NN image.

Just found an example in this website:

http://keras.dhpit.com/

model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))

It means 8 input parameters, with 12 neurons in the FIRST hidden layer.

[image: neural-network diagram — 8 inputs feeding 12 neurons in the first hidden layer]
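As a quick cross-check (a sketch assuming a standard fully connected layer, not a call into the Keras API), the parameter count of Dense(12, input_dim=8) follows directly from those two numbers: an (8, 12) weight matrix plus one bias per neuron.

```python
# Hypothetical parameter count for Dense(12, input_dim=8):
input_dim, units = 8, 12
weights = input_dim * units   # one weight per (input, neuron) pair: 8 * 12 = 96
biases = units                # one bias per neuron: 12
params = weights + biases

print(params)  # 108
```

This matches what model.summary() would report for such a layer, and is a handy way to confirm which argument is the input size and which is the layer width.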

thanks franfran

Thanks to @franfran

Thanks!!

thanks man!!!!

@franfran: thanks a lot! The link cleared a lot up!!

That's good

Thanks franfran
