Keras: Is 0.0 acceptable for the l1 and l2 parameters of the ActivityRegularization layer?

Created on 9 Feb 2018 · 4 comments · Source: keras-team/keras

The documentation says the l1 and l2 parameters of the ActivityRegularization layer should be a “positive float”, but the default value for both is 0.0. Shouldn’t the documentation say “nonnegative float”?
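A minimal sketch of why 0.0 is a sensible default: the documented penalty is l1 * sum(|x|) + l2 * sum(x^2) over the layer's activations, so zero coefficients make the layer a no-op. The helper name `activity_penalty` below is my own, not a Keras API:

```python
import numpy as np

def activity_penalty(x, l1=0.0, l2=0.0):
    # Replicates the documented penalty of ActivityRegularization:
    # l1 * sum(|x|) + l2 * sum(x^2), applied to activations x.
    x = np.asarray(x, dtype=float)
    return l1 * np.abs(x).sum() + l2 * np.square(x).sum()

acts = np.array([1.0, -2.0, 3.0])
print(activity_penalty(acts))            # defaults l1=l2=0.0 -> penalty is 0.0
print(activity_penalty(acts, l1=0.01))   # 0.01 * (1 + 2 + 3) ~= 0.06
print(activity_penalty(acts, l2=0.01))   # 0.01 * (1 + 4 + 9) ~= 0.14
```

With both coefficients at 0.0 the penalty vanishes, which is exactly the "regularization off" case the defaults are meant to express, hence "nonnegative" rather than "positive".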

All 4 comments

It sure should. But why is there a constraint on the sign at all?

I checked, and ActivityRegularization does not complain about negative values, which I think is a good thing, but I'm leaving it undocumented for now.

total loss = loss(w) + r·||w||, where loss is some base loss (MSE, cross-entropy, etc.), r is the activity regularizer constant, w is the weights, and ||·|| is some norm.

If r is negative, you can minimize the total loss by setting all of the weights to infinity.

Regularizers are a penalty meant to shrink the weights, which is why r >= 0.

> If r is negative you can minimize the total loss by setting all of the weights to infinity.

Not if loss(w) -> +inf faster. MSE and cross-entropy aren't the only possible losses.
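A quick counterexample sketch for this reply, with an assumed quadratically growing base loss: if loss(w) grows faster than the linear l1 penalty, even a negative r leaves the total loss bounded below, because the base term eventually dominates:

```python
import numpy as np

def total_loss(w, r):
    # A base loss that grows quadratically in the weights.
    quad = np.square(w).sum()
    # Negative-r l1 term only subtracts linearly in ||w||.
    return quad + r * np.abs(w).sum()

r = -0.1
for scale in (1.0, 10.0, 100.0):
    w = scale * np.ones(2)
    # The quadratic term dominates, so the total loss -> +inf, not -inf.
    print(total_loss(w, r))
```

So whether a negative r is catastrophic depends on how fast the base loss grows relative to the penalty's norm, which is the commenter's point.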

