Keras: Batch Normalization Is Along Wrong Axis

Created on 8 Mar 2016 · 1 Comment · Source: keras-team/keras

I've taken a look at Keras' batch normalization code, and it looks like it's normalizing along the wrong axis:

https://github.com/fchollet/keras/blob/master/keras/layers/normalization.py#L64

This means that the `axis` parameter specifies the dimension to leave alone, rather than the dimensions to compute the mean and std over.
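To see what "the dimension to leave alone" means in practice, here is a small NumPy sketch (not the Keras implementation itself) of the semantics the layer actually follows: statistics are computed over every axis *except* the one named by `axis`.

```python
import numpy as np

# A batch of 4 samples with 3 features, as a Dense layer would produce.
x = np.random.randn(4, 3).astype("float32")

# axis=-1 means "keep the last (feature) axis": reduce over all the
# other axes, so each of the 3 features gets its own mean/std computed
# across the 4 samples.
axis = -1
reduction_axes = tuple(i for i in range(x.ndim) if i != axis % x.ndim)
mean = x.mean(axis=reduction_axes)   # shape (3,), one value per feature
std = x.std(axis=reduction_axes)     # shape (3,)
normed = (x - mean) / (std + 1e-6)   # each feature column standardized
```

After this, `normed.mean(axis=0)` is approximately zero and `normed.std(axis=0)` approximately one for every feature, which is exactly the per-feature normalization batch norm is meant to perform.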

Most helpful comment

It's very simple:

For Dense layers, all RNN layers, and most other layer types, the default of axis=-1 is what you should use.
For Convolution2D layers with dim_ordering="th" (the default), use axis=1.
For Convolution2D layers with dim_ordering="tf", use axis=-1 (i.e. the default).
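The convolutional case can be illustrated the same way. The following NumPy sketch (an illustration of the semantics, not Keras code) assumes a channels-first ("th") feature map of shape (batch, channels, height, width); axis=1 keeps the channel axis, so each channel gets one mean/std computed jointly over batch, height, and width:

```python
import numpy as np

# Channels-first ("th") conv feature map: (batch, channels, height, width).
x = np.random.randn(2, 3, 4, 4).astype("float32")

# axis=1 keeps the channel axis: reduce over batch, height and width,
# yielding one mean/std per channel.
reduction_axes = (0, 2, 3)
mean = x.mean(axis=reduction_axes)   # shape (3,), one value per channel
std = x.std(axis=reduction_axes)     # shape (3,)

# Reshape so the per-channel statistics broadcast over (N, C, H, W).
normed = (x - mean.reshape(1, 3, 1, 1)) / (std.reshape(1, 3, 1, 1) + 1e-6)
```

With dim_ordering="tf" the channels sit last, which is why the default axis=-1 already does the right thing there.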


