Pytorch-lightning: Support DictConfig

Created on 31 May 2020  ·  9 comments  ·  Source: PyTorchLightning/pytorch-lightning

We need to add DictConfig (OmegaConf) support to the auto hparam save @Borda.

Labels: priority P0, bug / fix, enhancement, help wanted


All 9 comments

Yes, I agree. Users should have other options besides argument parser to set up their configuration. Personally speaking, I don't like typing too much on the command line b/c that is error-prone. A dictionary-like configuration system would be great. One example would be Ross's yacs which works pretty well.

OmegaConf is along the same lines as Yacs, but with more features (might as well support YACS too, but just saying).

@DKandrew @Darktex so you would keep passing a single argument, e.g. an OmegaConf object, which is used internally? Some pseudocode...

  1. conf = OmegaConf(...)
     model = MyModel(conf)
  2. conf = OmegaConf(...)
     model = MyModel(**vars(conf))

assuming that the conf can also be loaded from a file... (see the sketch below)
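For concreteness, here is a minimal sketch of what "loaded from a file" could look like with the actual OmegaConf API (OmegaConf.create / OmegaConf.load rather than a direct constructor call; the config keys and the conf.yaml path are placeholders, not part of the proposal):

from omegaconf import OmegaConf

# build the config in code...
conf = OmegaConf.create({"lr": 1e-3, "in_features": 128, "out_features": 10})

# ...or load it from a YAML file (path is a placeholder)
conf = OmegaConf.load("conf.yaml")

model = MyModel(conf)  # option 1: hand the whole DictConfig to the model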

Hi @Borda
I am not sure if I understand your question correctly, are you asking which approach you listed above is better?

I am not sure if I understand your question correctly, are you asking which approach you listed above is better?

I am trying to understand your use-case, would you mind sketching it out?

Sure. My use-case is the first one:

conf = OmegaConf(...)
model = MyModel(conf)

I don't want to use the second case because conf contains too many entries: entries for network layers, dataloader, training, optimizer/scheduler, etc. I don't want to expand all of them into my __init__() because that would be too long. After all, MyModel is just an nn.Module with additional features, so internally I have a self.conf to store the entire config and use it whenever I need it.

class MyModel(LightningModule):
    def __init__(self, conf):
        super().__init__()
        self.conf = conf
        # normal network parameters like in_features, out_features

let’s do this:

Case 1:

User explicitly says what they want to save.

class LitModel(...):

    def __init__(self, conf):
        self.save_hyperparameters(conf)

Case 2:

User wants to save all the init stuff.
They can do it all manually or ask us to do it automatically

class LitModel(...):

    def __init__(self, arg1, arg2, arg3):
        # manually
        self.save_hyperparameters(arg1=arg1, arg2=arg2, arg3=arg3)

        # equivalent automatic
        self.save_hyperparameters()

Case 3:

They want to save ONLY some of the init stuff

class LitModel(...):

    def __init__(self, arg1, arg2, arg3):
        # manually
        self.save_hyperparameters(arg2=arg2)

Special cases:

  • namespace
    def __init__(self, hparams):
        # manually
        self.save_hyperparameters(hparams)
  • dict
    def __init__(self, some_dict):
        # manually
        self.save_hyperparameters(some_dict)
  • omegaconf
    def __init__(self, conf):
        # manually
        self.save_hyperparameters(conf)
  • anything they want
    def __init__(self, some_random_alternative_to_config):
        # manually
        self.save_hyperparameters(some_random_alternative_to_config)

@PyTorchLightning/core-contributors

let’s do this:

Sounds good to me. Looking forward to it!

Case 1:

User explicitly says what they want to save.

class LitModel(...):

    def __init__(self, conf):
        self.save_hyperparameters(conf)

is very tricky, as we would need to do some pairing from the init frame and hope that none of conf1, conf2, ... hold the same value, so I would skip this case...
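To illustrate why that pairing is fragile, here is a rough sketch (hypothetical helper, not the actual Lightning code) of matching an __init__ value back to its argument name via the caller frame:

import inspect

def _pair_value_to_init_arg(value):
    """Guess which argument of the calling __init__ holds `value` (sketch only)."""
    frame = inspect.currentframe().f_back          # the __init__ frame
    return [name for name, val in frame.f_locals.items()
            if name != "self" and val == value]    # >1 match -> ambiguous pairing

If conf1 and conf2 compare equal, the list has two entries and the argument name cannot be recovered reliably, which is the ambiguity being pointed out here.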

Special cases:

  • namespace
  • dict
    def __init__(self, conf):
        # manually
        self.save_hyperparameters(conf)

here you want to unroll all the elements?
assuming you have some_dict = dict(a=1, b=3), you would in fact do something like self.save_hyperparameters(**some_dict)

in fact

from argparse import Namespace
from omegaconf import DictConfig

def save_hyperparameters(self, **kwargs):
    ...
    for conf in kwargs.values():
        # OmegaConf configs are DictConfig instances
        if isinstance(conf, (dict, DictConfig)):
            self.save_hyperparameters(**conf)
        elif isinstance(conf, Namespace):
            self.save_hyperparameters(**vars(conf))
        ...
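As a quick sanity check of the dispatch above (names are illustrative, not from the issue), a DictConfig can be **-unpacked like a plain dict, which is exactly what the recursive self.save_hyperparameters(**conf) call relies on:

from omegaconf import OmegaConf

conf = OmegaConf.create({"lr": 1e-3, "hidden_dim": 256})

def show(**kwargs):
    print(kwargs)

show(**conf)  # {'lr': 0.001, 'hidden_dim': 256} -- DictConfig behaves as a mapping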