PyTorch Lightning: Validation loop should pass the hidden state

Created on 27 May 2020 · 11 comments · Source: PyTorchLightning/pytorch-lightning

🚀 Feature

The validation_step function only receives batch and batch_idx, while training_step also receives the hiddens when working with sequences. But when we are evaluating our recurrent LMs, we want to pass the hidden state through the validation steps, and PyTorch Lightning does not permit this.
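For context, a minimal sketch of the asymmetry being described, assuming truncated backpropagation through time is enabled (e.g. via Trainer(truncated_bptt_steps=...) in the Lightning versions of that time). The model, names, and shapes below are illustrative, not taken from the issue:

from torch import nn
import pytorch_lightning as pl

class WordLM(pl.LightningModule):
    # Illustrative recurrent language model; vocab/hidden sizes are placeholders.
    def __init__(self, vocab_size=10000, hidden_size=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)
        self.loss = nn.CrossEntropyLoss()

    def training_step(self, batch, batch_idx, hiddens):
        # With truncated BPTT enabled, the trainer passes the hidden state
        # returned by the previous split back into this step.
        x, y = batch
        out, hiddens = self.rnn(self.embed(x), hiddens)
        loss = self.loss(self.head(out).transpose(1, 2), y)
        return {"loss": loss, "hiddens": hiddens}

    def validation_step(self, batch, batch_idx):
        # No hiddens argument here: the validation loop only passes
        # batch and batch_idx, which is what this issue asks to change.
        x, y = batch
        out, _ = self.rnn(self.embed(x))  # hidden state starts fresh every batch
        return {"val_loss": self.loss(self.head(out).transpose(1, 2), y)}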

Labels: enhancement, help wanted, won't fix

All 11 comments

Hi! Thanks for your contribution, great first issue!

training_step also only gets batch and batch_idx. Do you mean the number of values that your dataset gives back?
Similar to #1888?

Hi @HansBambel, this is not actually true when you are working with sequences, as you can see in pytorch_lightning/trainer/__init__.py#L953

Oh, you're right. I haven't used LSTMs in PyTorch Lightning yet. Sorry for the confusion!

Don't worry! I am having some issues when using recurrent networks. Actually, I think it would be better if we were able to pass any object we want between the lifecycle hooks, haha.

You can just assign the objects you want to self:

self.thing_i_need = ...
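A minimal sketch of that workaround, building on the WordLM sketch above: the hidden state is kept on self and detached between batches so old computation graphs are freed. The hook name and the detach policy are assumptions, not something prescribed in this thread; adjust the hook to whatever your Lightning version provides.

class WordLMWithState(WordLM):
    # Workaround sketch: carry the validation hidden state on self.
    def on_validation_epoch_start(self):
        # Start every validation run with a fresh hidden state.
        self.val_hiddens = None

    def validation_step(self, batch, batch_idx):
        x, y = batch
        out, self.val_hiddens = self.rnn(self.embed(x), self.val_hiddens)
        # Detach so the graphs of earlier batches are not kept alive.
        self.val_hiddens = tuple(h.detach() for h in self.val_hiddens)
        return {"val_loss": self.loss(self.head(out).transpose(1, 2), y)}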

Yes, of course I can, but it is brittle, don't you think?

Both training_step and validation_step should have at least the same API regarding the hidden state.
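Purely as an illustration of what such a symmetric signature could look like (this is hypothetical, not an existing Lightning API, and not necessarily what the author has in mind):

# Hypothetical signature sketch, mirroring training_step; not an existing API.
def validation_step(self, batch, batch_idx, hiddens=None):
    x, y = batch
    out, hiddens = self.rnn(self.embed(x), hiddens)
    return {"val_loss": self.loss(self.head(out).transpose(1, 2), y),
            "hiddens": hiddens}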

@igormq what API do you propose?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

+1

+1
