Is this a bug? torch.__version__ is '0.1.11+b13b701'.
Works fine for me with (almost) the latest version ('0.1.11+8aa1cef'):
import torch
import torch.nn as nn
from torch.autograd import Variable
y = Variable(torch.rand(5, 3), requires_grad=True)  # scores for 5 samples, 3 classes
t = Variable(torch.LongTensor(5).random_(0, 2))     # target class index per sample (0 or 1)
m = nn.MultiMarginLoss()
loss = m(y, t)
loss.backward()  # gradient computed by autograd
print(y.grad)
which outputs:
Variable containing:
-0.1333 0.0667 0.0667
0.0667 -0.1333 0.0667
0.0667 -0.1333 0.0667
0.0667 -0.1333 0.0667
0.0667 -0.1333 0.0667
[torch.FloatTensor of size 5x3]
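These values are what the hinge formula predicts: with the default margin of 1 and inputs drawn from [0, 1) by torch.rand, every non-target term max(0, 1 - x[y] + x[i]) is active, so each non-target entry gets a gradient of 1/(5*3) ≈ 0.0667 and each target entry gets -2/(5*3) ≈ -0.1333. A quick sketch of that check (assuming the default margin=1 and size_average=True behaviour):
expected = torch.Tensor(5, 3).fill_(1.0 / (5 * 3))        # non-target entries: +1/15
expected.scatter_(1, t.data.view(-1, 1), -2.0 / (5 * 3))  # target entries: -2/15
print((y.grad.data - expected).abs().max())                # ~0, so not a bug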
Hi,
nn.Module subclasses do not have a backward (none of them do); their forward is implemented with autograd-compliant methods and is therefore differentiated automatically.
If you want to find the implementation of MultiMarginLoss, it is implemented here in C.
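To see that in action, here is a minimal sketch of a margin loss written only with autograd-compliant operations; the module defines no backward of its own, yet loss.backward() still produces gradients. The class name HandMultiMargin is made up for illustration and is not how the library implements it (the real code sits in the C backend):
import torch
import torch.nn as nn
from torch.autograd import Variable

class HandMultiMargin(nn.Module):
    # Illustration only: forward is plain tensor math, so autograd
    # differentiates it; there is no hand-written backward anywhere.
    def __init__(self, margin=1.0):
        super(HandMultiMargin, self).__init__()
        self.margin = margin

    def forward(self, input, target):
        n, c = input.size(0), input.size(1)
        correct = input.gather(1, target.view(-1, 1))  # score of the true class, shape (n, 1)
        # max(0, x[i] - x[y] + margin) for every class i
        hinge = (input - correct.expand_as(input) + self.margin).clamp(min=0)
        total = hinge.sum() - self.margin * n          # drop the i == y terms (each equals margin)
        return total / (n * c)                         # average over classes and over the batch
Calling HandMultiMargin()(y, t).backward() on the example above should give the same y.grad as nn.MultiMarginLoss, which is the point: the gradient comes from autograd, not from a backward method on the module.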
Thanks. I'm just getting started with PyTorch. I understand it now.