I set up one layer, for example:

fc1 = mx.symbol.FullyConnected(data=data, name='fc1', num_hidden=64)

This layer has two parameters: weight and bias. I want to set a different 'lr_mult' for each of these two parameters, for example 'lr_mult' = 0.1 for the weight and 'lr_mult' = 1 for the bias.
What code should I write for this purpose? Thank you.
fc1 = mx.symbol.FullyConnected(data=data, name='fc1', num_hidden=64, attr={'weight_lr_mult': '0.1', 'bias_lr_mult': '1.0'})
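For context, here is a minimal sketch (plain Python, not MXNet's actual optimizer internals) of what these per-parameter 'lr_mult' values are meant to do: each one scales the optimizer's base learning rate for that parameter. The parameter names 'fc1_weight' and 'fc1_bias' follow MXNet's naming convention for the layer above.

```python
# Sketch only: how a base learning rate combines with per-parameter
# lr_mult values. This mirrors the intent of the attr settings above,
# not MXNet's real optimizer code.
base_lr = 0.01
lr_mult = {'fc1_weight': 0.1, 'fc1_bias': 1.0}

def effective_lr(param_name):
    # Parameters without an explicit multiplier default to 1.0.
    return base_lr * lr_mult.get(param_name, 1.0)

print(effective_lr('fc1_weight'))  # ten times smaller than base_lr
print(effective_lr('fc1_bias'))    # unchanged from base_lr
```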
@piiswrong I used this code. However, I seem to have gotten the wrong attributes, as follows.

Why did I get 'fc1_bias_bias_lr_mult', 'fc1_weight_weight_lr_mult', and so on?
Those are harmless.
@piiswrong Thank you very much. I see.
This issue is closed due to lack of activity in the last 90 days. Feel free to reopen if this is still an active issue. Thanks!
It looks like some things have changed since then. You can now use a simple variable:

w = mx.sym.var('fc1_weight', lr_mult=0.1)
b = mx.sym.var('fc1_bias', lr_mult=1.0)
fc1 = mx.symbol.FullyConnected(data=data, name='fc1', num_hidden=64, weight=w, bias=b)
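To make the effect concrete, here is a minimal sketch (plain Python lists, not MXNet's optimizer) of the SGD update these lr_mult settings imply: the weight takes steps ten times smaller than the bias for the same gradient.

```python
# Sketch only: plain SGD with a per-parameter learning-rate multiplier.
base_lr = 0.1

def sgd_step(params, grads, lr_mult):
    # Each parameter moves by base_lr * lr_mult * gradient.
    return [p - base_lr * lr_mult * g for p, g in zip(params, grads)]

# With a gradient of 1.0 everywhere, the weight moves by 0.01
# (lr_mult=0.1) while the bias moves by 0.1 (lr_mult=1.0).
weight = sgd_step([1.0, 1.0], [1.0, 1.0], 0.1)
bias = sgd_step([1.0], [1.0], 1.0)
print(weight, bias)
```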