I could not find a way to do it; maybe there is something I've been missing.
Hi! Thanks for your contribution, great first issue!
This is a TensorBoard limitation, unfortunately.
Not possible in TensorBoard as far as I know. With some of the other loggers you can do that in the UI (I know wandb can).
As far as I know, it is possible in Catalyst, and I found a tutorial on Medium where the author plots cos and sin in the same PyTorch + TensorBoard figure:
https://medium.com/@rktkek456/pytorch-tensorboard-tutorial-for-a-beginner-b037ee66574a
I see now, this is different. There they call `add_scalars` (instead of `add_scalar`), which creates multiple experiments (see screenshots) that you can then select to overlap the plots.
So in PL you could do

```python
# in training_step
self.logger.experiment.add_scalars("losses", {"train_loss": loss})

# in validation/test step
self.logger.experiment.add_scalars("losses", {"val_loss": loss})
```
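For context on why multiple experiments appear: torch's `SummaryWriter.add_scalars` opens one extra event-file writer per scalar key, in a subdirectory named after the main tag and the key, so each key shows up as its own "run" in the TensorBoard sidebar. A minimal pure-Python sketch of that naming scheme (no torch required; the exact directory pattern is an implementation detail I'm inferring from the writer source, so treat it as an assumption):

```python
def add_scalars_run_dirs(logdir, main_tag, tag_scalar_dict):
    # torch's SummaryWriter.add_scalars opens one sub-writer per key,
    # roughly at "<logdir>/<main_tag>_<key>" -- each of these
    # directories is picked up by TensorBoard as a separate run.
    return [f"{logdir}/{main_tag.replace('/', '_')}_{tag}"
            for tag in tag_scalar_dict]

print(add_scalars_run_dirs("lightning_logs/version_0", "losses",
                           {"train_loss": 0.5, "val_loss": 0.7}))
# -> ['lightning_logs/version_0/losses_train_loss',
#     'lightning_logs/version_0/losses_val_loss']
```

This is also why the "losses" chart only shows one line per run until you tick the corresponding runs in the sidebar.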
Maybe we can support this as part of the output dict of `training_step` etc., but I'm not sure what will happen, since other loggers won't support this nested dict structure.
EDIT
In fact, there is an old issue here: #665
Thank you! I see now. I will try just calling the logger directly, and I will close this issue since there is already one about the same thing.
@awaelchli hi, how do I do the same but with the epoch on the x-axis instead of steps?
I don't know what it will look like, but try:

```python
self.logger.experiment.add_scalars("losses", {"val_loss": loss}, global_step=self.current_epoch)
```
I am not very familiar with TB.