Pytorch-lightning: How to log hparams to Tensorboard?

Created on 24 Mar 2020 · 11 Comments · Source: PyTorchLightning/pytorch-lightning

Hello! I'm trying to view my hparams in TensorBoard, but can't actually see them there. As I understood from the documentation, to log hparams one should assign self.hparams in the __init__ of the LightningModule. Here's what I'm doing:

from argparse import Namespace

import pytorch_lightning as pl


class MyModule(pl.LightningModule):
    def __init__(self, hparams):
        super().__init__()
        # keep only the model hyperparameters, not the Trainer ones
        self.hparams = Namespace(**{'learning_rate': hparams.learning_rate,
                                    'batch_size': hparams.batch_size,
                                    'normalize_embeddings': hparams.normalize_embeddings,
                                    'normalize': hparams.normalize,
                                    'k_neighbors': hparams.k_neighbors,
                                    'melspec_dir': hparams.melspec_dir})

My hparams also contain Trainer hparams, so to log only the right ones I wrote this workaround. Even so, I could not see any of my hparams in TensorBoard. What could be the problem?
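For reference, here is roughly what that filtering workaround looks like in a more generic form (just a sketch; MODEL_KEYS is only an illustrative name, not anything from Lightning):

from argparse import Namespace

# illustrative: keep only the model hyperparameters out of the full Namespace
MODEL_KEYS = ['learning_rate', 'batch_size', 'normalize_embeddings',
              'normalize', 'k_neighbors', 'melspec_dir']
model_hparams = Namespace(**{k: v for k, v in vars(hparams).items() if k in MODEL_KEYS})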

question

Most helpful comment

OK, I did it. It looks like TensorBoard won't display hparams if some previous runs were logged without them. So I had to start a new TensorBoard process with a clean logs directory.
Everything works normally now.

All 11 comments

Hi,

I have been struggling with hparams for two days now too.
I do this:

import pytorch_lightning as pl


class MyModule(pl.LightningModule):

    def __init__(self, hparams: dict):
        super().__init__()
        self.hparams = hparams

There might be some magic going on when the module is loaded.
Did you try defining your Namespace outside, then passing it as hparams?
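Something like this is what I mean (just a sketch with made-up values):

from argparse import Namespace

hparams = Namespace(learning_rate=1e-3, batch_size=32)  # example values only
model = MyModule(hparams)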

Thanks for the reply!

Did you try defining your Namespace outside, then passing it as hparams?

The hparams that I pass to the __init__ of my module is already a Namespace that contains many hyperparameters, including those that I want to log in TensorBoard.

I never tried it with a Namespace; I always use dicts.
Did you try it with a dict?

No, I didn't.
According to #651, hparams should be a Namespace object, so I assume a dict will not work, but I'll check.

#1228 is not related. That is about reporting metrics such as validation loss together with the set of hyperparameters.

Do you know when self.logger.log_hyperparams(hparams) is called? Does it happen at the end of training or every epoch?

When you override it with a print, you can see that it gets called at the very beginning of training (not per training step or epoch, but once for the whole training run).
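One way to see this for yourself (just a sketch, not tied to a particular Lightning version) is to subclass the logger and print when the hook fires:

from pytorch_lightning.loggers import TensorBoardLogger

class VerboseTBLogger(TensorBoardLogger):
    def log_hyperparams(self, params):
        # fires once, right when fit() starts, not per step or per epoch
        print('log_hyperparams called with:', params)
        super().log_hyperparams(params)

Then pass logger=VerboseTBLogger(save_dir='logs') to the Trainer.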

I'm still not able to log any hparams to TensorBoard, although I follow the documentation exactly and do this:

class MyModule(pl.LightningModule):
    def __init__(self, hparams):
        super().__init__()
        self.hparams = hparams

where the hparams that I'm passing to __init__ are the result of parser.parse_args() (roughly the setup sketched below).
Does anybody have similar problems?
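Concretely, the setup is roughly this (a sketch; the two arguments just mirror hparams mentioned above):

from argparse import ArgumentParser

parser = ArgumentParser()
parser.add_argument('--learning_rate', type=float, default=1e-3)
parser.add_argument('--batch_size', type=int, default=32)
hparams = parser.parse_args()  # a Namespace, passed straight to MyModule(hparams)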

I tried to explicitly add hparams to TensorBoard like this:

self.logger.experiment.add_hparams(
    hparam_dict={'learning_rate': self.hparams.learning_rate},
    metric_dict={})

But that didn't work.

OK, I did it. It looks like TensorBoard won't display hparams if some previous runs were logged without them. So I had to start a new TensorBoard process with a clean logs directory.
Everything works normally now.
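In code, that amounts to pointing the logger at a fresh directory (a sketch; the directory and experiment names here are arbitrary):

import pytorch_lightning as pl
from pytorch_lightning.loggers import TensorBoardLogger

logger = TensorBoardLogger(save_dir='clean_logs', name='my_model')  # brand-new, empty log dir
trainer = pl.Trainer(logger=logger)

and then point tensorboard --logdir at that new directory.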

Oh Jesus, I've been wasting so much time debugging Lightning. Thanks @RafailFridman.

