PyTorch: `parameters()` should be a list, not an iterator

Created on 3 Oct 2017 · 1 comment · Source: pytorch/pytorch

Making `parameters()` return an iterator gives ample opportunity to shoot yourself in the foot, e.g.:

```python
# broken gradient clipping: the second loop never runs, because the
# iterator returned by parameters() is exhausted after the first pass
a = model.parameters()
norm = 0
for p in a:
    norm += p.norm()
for p in a:      # a is already exhausted here, so nothing happens
    p /= norm
```
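
For comparison, here is a sketch of the same clipping logic that works, simply by materializing the iterator into a list first. The `nn.Linear` model and backward pass below are hypothetical stand-ins, used only to produce some gradients:

```python
import torch
import torch.nn as nn

# hypothetical stand-in: any model with populated gradients would do
model = nn.Linear(4, 2)
model(torch.randn(8, 4)).sum().backward()

params = list(model.parameters())               # materialize once
total_norm = sum(p.grad.norm() for p in params)
for p in params:                                # second pass still sees every parameter
    p.grad.div_(total_norm)
```

Current PyTorch releases also provide `torch.nn.utils.clip_grad_norm_`, which accepts the `parameters()` iterator directly and materializes it internally.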

Also, printing gradients by writing `list(m.parameters())[0]` is annoying.

All comments

I don't think I agree. A lot of things like this are generators/iterators in Python 3 (think `map`, `filter`, `zip`, `range`, ...) and you can shoot yourself in the foot in the same way. If you want to iterate over the thing multiple times, just do `list(model.parameters())`, and if you just want the first one, do `next(model.parameters())` (which is also going to be much faster, because it won't iterate over the whole model at all).
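
To make the comment's two suggestions concrete, here is a minimal sketch with a hypothetical model (any `nn.Module` behaves the same way):

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))  # hypothetical model

# Materialize the generator once if you need to iterate more than once.
params = list(model.parameters())
print(len(params))                 # 4: two weight matrices and two bias vectors

# Pull just the first parameter without walking the whole module tree.
first = next(model.parameters())
print(first.shape)                 # torch.Size([8, 4]), the first Linear's weight
```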
