Hi, all!
Does Lightning call the update_tng_log_metrics function right before it logs metrics for the batch, as described in the docs?
I wanted the loss from training_step to be displayed as loss/train in TensorBoard.
To do so, I planned to use update_tng_log_metrics to copy logs['loss'] to logs['loss/train'], and that is when I noticed that the function may not be called at all.
Here is my code.
class myLightningModule(pl.LightningModule):
    # REQUIRED functions
    def training_step(self, batch, batch_nb):
        inputs, targets = batch
        outputs = self.forward(inputs)
        return {'loss': self.loss_func(outputs, targets)}

    def update_tng_log_metrics(self, logs):
        logs['loss/train'] = logs['loss']
        return logs
To look into logs, I tried dumping them inside update_tng_log_metrics with a print statement. I didn't get any output, not even None.
Old docs... these have been cleaned up.
To modify metrics, use:
self.experiment.whatever_you_want_to_do
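A minimal, import-free sketch of that suggestion: log the batch loss under the tag you want directly through the experiment object from training_step, instead of relying on the removed hook. In the TensorBoard logger, self.experiment exposes an add_scalar method; FakeWriter below is a hypothetical stand-in for it so the sketch runs on its own, and MyModule only mimics the relevant bits of a LightningModule.

```python
class FakeWriter:
    """Hypothetical stand-in for a TensorBoard SummaryWriter (self.experiment)."""
    def __init__(self):
        self.scalars = {}

    def add_scalar(self, tag, value, step):
        # record each (step, value) pair under its tag, like TensorBoard would
        self.scalars.setdefault(tag, []).append((step, value))


class MyModule:
    """Sketch of the relevant parts of a LightningModule."""
    def __init__(self):
        self.experiment = FakeWriter()
        self.global_step = 0

    def training_step(self, loss):
        # log the batch loss under the desired TensorBoard tag yourself
        self.experiment.add_scalar('loss/train', loss, self.global_step)
        self.global_step += 1
        return {'loss': loss}


m = MyModule()
m.training_step(0.5)
```

With the real logger, the same add_scalar call writes a 'loss/train' curve that TensorBoard groups under the "loss" section.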