A simple and flexible way to store hyperparameters in a dict/Namespace-like object.
An object that behaves like this:
# just like Namespace
hparams = Hyperparameters(x=1, y=2)
# or from a dict
hparams = Hyperparameters({"x": 1, "y": 2})
# it could support nesting
hparams = Hyperparameters({"x": 1, "y": {"a": 3}})
# Namespace-like lookup
x = hparams.x
a = hparams.y.a
# or like a dict
x = hparams["x"]
a = hparams["y"]["a"]
# we will have to check for invalid keys
hparams["batch_size"] # ok
hparams["batch-size"] # error
hparams["batch size"] # error
Optional features:
# could support reading from files
# useful for checkpoint loading
hparams = Hyperparameters.from_yaml("file.yml")
# could support flattening/sanitizing nested structure
# useful for logging to TensorBoard
# or to make it pickleable
clean = hparams.flatten().sanitize()
# note that internal method names like these would be blocked as keys:
hparams.flatten = True # problem, because "flatten" is a method, not a hyperparameter!
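As an illustration, flattening and sanitizing could work roughly like this (a sketch; the sep argument and the set of allowed scalar types are assumptions):

def flatten(params, parent_key="", sep="."):
    """Flatten a nested dict into a single level, joining keys with `sep`."""
    flat = {}
    for key, value in params.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, full_key, sep))
        else:
            flat[full_key] = value
    return flat

def sanitize(params):
    """Keep only values TensorBoard's hparams plugin accepts; stringify everything else."""
    allowed = (int, float, str, bool)
    return {k: (v if isinstance(v, allowed) else str(v)) for k, v in params.items()}

# e.g. sanitize(flatten({"x": 1, "y": {"a": 3}})) -> {"x": 1, "y.a": 3}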
Pro:
Contra:
Considerations
Discussed on Slack, and the idea has popped up in other issues as well.
Related to #1841, #1735 and would solve #1737
Also post more ideas here.
About the names: we could automatically replace any separator (e.g. - or a space) with _ internally.
Should we just use OmegaConf for this? It might make it easier to integrate Hydra down the line too.
Completely agree with @yukw777. OmegaConf does all of the above and better; PL shouldn't reinvent the wheel.
I guess the idea was to provide a basic data structure that works without a dependency on another library. A simple wheel that rolls and does nothing more, created with tools we already have :)
Maybe we could use ConfigParser? It's part of the standard library, so no external dependency. Its interface is not as simple as what @awaelchli has envisioned though.
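For comparison, ConfigParser requires sections and stores everything as strings, so values would still need explicit casting (a sketch; the section and option names are made up):

from configparser import ConfigParser

config = ConfigParser()
config.read_string("""
[hparams]
x = 1
batch_size = 32
""")

# values come back as strings unless cast explicitly
x = config["hparams"].getint("x")
batch_size = config.getint("hparams", "batch_size")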
We need to do our arg casting so it would be nice to wrap it in one object
https://github.com/PyTorchLightning/pytorch-lightning/blob/981169cacc5da17af8d796d94b747c17566ef0bc/pytorch_lightning/trainer/trainer.py#L723
What do you get from the Namespace object that you are not getting from the OmegaConf DictConfig?
In other words, DictConfig is most likely a drop-in replacement for Namespace (with the exception of explicit type checks).
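For reference, the examples from the top of the issue would look roughly like this with OmegaConf (a sketch; file.yml is assumed to exist):

from omegaconf import OmegaConf

# create from a nested dict; attribute- and dict-style access both work
hparams = OmegaConf.create({"x": 1, "y": {"a": 3}})
x = hparams.x
a = hparams["y"]["a"]

# or load from a YAML file
hparams = OmegaConf.load("file.yml")

# convert back to a plain, pickleable container, e.g. for logging
plain = OmegaConf.to_container(hparams, resolve=True)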
I didn't know OmegaConf when I opened the issue.
Anyway, the PL team decided to simplify the way hyperparameters are passed into the model so there is no need for an object like this anymore. Closing this. Thanks for the discussion.
Maybe pydantic? (just my 2c).
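For completeness, a pydantic-based variant would add explicit type declarations plus casting and validation (a sketch; the field names are made up):

from pydantic import BaseModel

class HParams(BaseModel):
    x: int = 1
    batch_size: int = 32
    learning_rate: float = 1e-3

hparams = HParams(batch_size="64")  # numeric strings are cast to the declared types
assert hparams.batch_size == 64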