Incubator-mxnet: Composing a loss with symbols. Possible? My attempt

Created on 25 May 2016 · 5 comments · Source: apache/incubator-mxnet

Hello,
After composing layers (which worked) I'd like to compose a loss with symbols.
I'm especially after the autograd feature..

Now I set out to compose a MSE loss.
This is what I came up with.
Example is regression on 4 values
Sum(axis=1) is emulated with sum_mid_internal + reshape.
```python
# Assumes `mx` is `import mxnet as mx`, `output` is the network's output
# symbol, and BATCH_SIZE is defined; the example is regression on 4 values.
label_var = mx.sym.Variable("input_label")

diff_squared = mx.sym.square(output - label_var)
# Emulate sum(axis=1): reshape to (BATCH_SIZE, 4, 1), then sum over the middle axis.
diff_squared_shaped = mx.sym.Reshape(diff_squared, shape=(BATCH_SIZE, 4, 1))
diff_squared_sum = mx.sym.sum_mid_internal(diff_squared_shaped)
mean_squared = diff_squared_sum / 4
loss = mx.sym.MakeLoss(mean_squared)
```

The principle seems to work; only during metric evaluation do I get 4 label values but 1 output value. That is logical given my network, but I'm interested in whether it's possible to compose a loss function with symbols and make it work correctly.
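For reference, the arithmetic the symbol graph above is meant to compute can be checked in plain Python. This is just a sketch with hypothetical values; `pred` and `label` stand in for the network output and `input_label`:

```python
# Plain-Python check of the per-sample loss the symbol graph computes:
# the mean over the 4 regression targets of the squared difference.
BATCH_SIZE = 2

pred = [[1.0, 2.0, 3.0, 4.0],
        [0.0, 0.0, 0.0, 0.0]]   # hypothetical network outputs, shape (BATCH_SIZE, 4)
label = [[1.0, 2.0, 3.0, 4.0],
         [1.0, 1.0, 1.0, 1.0]]  # hypothetical labels

loss_per_sample = [
    sum((p - l) ** 2 for p, l in zip(row_p, row_l)) / 4
    for row_p, row_l in zip(pred, label)
]
print(loss_per_sample)  # [0.0, 1.0]: first sample matches exactly, second is off by 1 everywhere
```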

All 5 comments

If you compose the loss this way, your output already is the metric, so there is no need to compute a metric again.
Use a dummy metric, like the Torch metric, just for printing.

Thanks, I'll check whether it works as expected.
If so, this opens up a lot of possibilities for me.

Is this the prescribed way to compose a loss with symbols?

You are doing it the right way. We just need to document it better. @precedenceguo @antinucleon

Ok, nice.
With some more low-level primitives this is very potent stuff.
Personally I like it better than Theano etc., especially with all the deployment targets.
I wouldn't mind contributing to the documentation here and there, but I'm still unsure whether I'm doing things the way they ought to be done.

Would it be possible for you to share a working example of MakeLoss being used to implement a simple loss function (like sum-squared-error)?

I am trying to use it to train for a more interesting loss function, but I think an example of a standard loss function using MakeLoss would be very helpful for me and others to understand the API.
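While waiting for an official example, the math that MakeLoss would forward and backpropagate for sum-squared-error is easy to sketch outside MXNet. This is a standalone NumPy sketch, not MXNet code; the function names are illustrative:

```python
import numpy as np

# Standalone NumPy sketch of sum-squared-error: the per-sample forward loss,
# and the gradient with respect to the prediction that a loss layer like
# MakeLoss would backpropagate into the network.

def sse_forward(pred, label):
    """Per-sample sum of squared errors, shape (batch,)."""
    return np.sum((pred - label) ** 2, axis=1)

def sse_backward(pred, label):
    """d(sse)/d(pred) = 2 * (pred - label), same shape as pred."""
    return 2.0 * (pred - label)

pred = np.array([[1.0, 2.0], [0.0, 0.0]])
label = np.array([[1.0, 0.0], [1.0, 1.0]])
print(sse_forward(pred, label))   # [4. 2.]
print(sse_backward(pred, label))  # [[ 0.  4.] [-2. -2.]]
```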
