Pytorch-lightning: TensorBoardLogger should be able to add metric names in hparams

Created on 10 Mar 2020  ·  9 comments  ·  Source: PyTorchLightning/pytorch-lightning

🚀 Feature

TensorBoard allows investigating the effect of hyperparameters in the hparams tab. Unfortunately, the log_hyperparams function of TensorBoardLogger provides no way to indicate which of the logged values is actually a "metric" that can be used for such a comparison.

Motivation

I would like to use the built-in hparams module of TensorBoard to evaluate my trainings.

Pitch

PyTorch-Lightning should let me define the metrics of my model in some way such that any logger can derive which metrics may be used for hyperparameter evaluation, along with any other characteristics that may be defined for them.

Additional context

The hparams method of a summary takes the following parameters:

def hparams(hparam_dict=None, metric_dict=None):

metric_dict is basically a dictionary mapping metric names to values; only the names are used by the function itself, while the values are omitted.
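To make the relationship between the two dictionaries concrete, here is a minimal pure-Python sketch (the real function lives in torch.utils.tensorboard.summary; the parameter names "lr", "val_loss", etc. are illustrative, not part of any API):

```python
# hparam_dict: hyperparameter names -> the values used for this run.
hparam_dict = {"lr": 1e-3, "batch_size": 32}

# metric_dict: metric names -> values. At registration time only the *keys*
# matter: they tell TensorBoard which logged scalars should appear as
# metrics in the hparams tab, so placeholder values are fine.
metric_dict = {"val_loss": 0.0, "val_acc": 0.0}

metric_names = sorted(metric_dict)  # the only part hparams() really consumes
```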

Labels: enhancement, help wanted, won't fix


All 9 comments

Hi! Thanks for your contribution, great first issue!

Since this is specific to tensorboard and other loggers handle hparams and metrics differently, it is better to use the SummaryWriter object directly. You can always do that with
self.logger.experiment.add_hparams(hparam_dict, metric_dict) within your LightningModule.
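The suggested workaround can be sketched as follows. To keep the snippet self-contained, a stub stands in for the real torch.utils.tensorboard.SummaryWriter; inside a LightningModule, self.logger.experiment is the real writer and the call pattern is the same:

```python
class StubSummaryWriter:
    """Stand-in for torch.utils.tensorboard.SummaryWriter (illustration only)."""

    def __init__(self):
        self.calls = []

    def add_hparams(self, hparam_dict, metric_dict):
        # The real writer logs both dicts to the TensorBoard hparams plugin.
        self.calls.append((hparam_dict, metric_dict))


# In a LightningModule this would be: self.logger.experiment
writer = StubSummaryWriter()
writer.add_hparams({"lr": 1e-3}, {"val_loss": 0.42})
```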

I think if Lightning offers such a logger mechanism, it should offer an abstraction to enable this functionality. I'd be fine with having a register_metric function in TensorBoardLogger, but I don't want to rely on implementation details of the underlying logging mechanism.
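A register_metric abstraction along those lines might look like the sketch below. This is hypothetical, not part of the actual TensorBoardLogger API: metric names are collected up front and turned into a metric_dict when hyperparameters are logged, so the user never touches the underlying SummaryWriter:

```python
class MetricRegisteringLogger:
    """Hypothetical logger exposing register_metric (not a real Lightning API)."""

    def __init__(self):
        self._metric_names = set()
        self.logged = None

    def register_metric(self, name):
        # Record a metric name to be announced to the hparams plugin later.
        self._metric_names.add(name)

    def log_hyperparams(self, params):
        # Placeholder values; TensorBoard only needs the metric *names* here.
        metric_dict = {name: 0.0 for name in sorted(self._metric_names)}
        self.logged = (dict(params), metric_dict)


logger = MetricRegisteringLogger()
logger.register_metric("val_loss")
logger.log_hyperparams({"lr": 1e-3})
```

The point of the abstraction is that a backend other than TensorBoard could implement register_metric differently, or ignore it, without the user code changing.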

@tstumm that sounds good to me, would you mind sending a PR?
cc: @PyTorchLightning/core-contributors

@tstumm through the logger you can directly access the underlying TensorBoard SummaryWriter, so whatever is allowed there should also be possible here... Could you point to an example of this TensorBoard functionality/use case?

It was introduced here recently: #1630
Feel free to reopen if issues remain.

@awaelchli Is there a plan to automatically log all metrics for the hparams tab in TensorBoard? I mean all metrics returned under the log key inside methods like validation_step, using the newly merged TensorBoardLogger().log_hyperparams()?

I'm not up to date with the logger features atm. Will reopen to keep track of your suggestion and also because I just saw that there is still a bugfix in the works here: #1647

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
