Pytorch-lightning: Using multiple loggers

Created on 6 Oct 2019 · 2 comments · Source: PyTorchLightning/pytorch-lightning

Describe the solution you'd like
Trainer could receive a list of loggers instead of only one; all loggers would be called at the appropriate times.

Describe alternatives you've considered
Alternatively, one could create a custom logger that logs the information to all relevant destinations (e.g. TensorBoard AND MLflow).

Labels: logger, enhancement, help wanted

All 2 comments

What about using the existing base logger to create a composite logger that takes a list of loggers and calls them all?
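A composite logger along these lines could be a thin class that holds a list of loggers and forwards every call to each of them. The sketch below illustrates the fan-out pattern in plain Python; in Pytorch-Lightning the composite would subclass the library's base logger (e.g. `LightningLoggerBase`), and the method names used here (`log_hyperparams`, `log_metrics`, `save`, `finalize`) mirror that interface but are assumptions for illustration, not a confirmed API.

```python
class CompositeLogger:
    """Forwards every logging call to each logger in a list.

    Sketch only: in Pytorch-Lightning this would subclass the base
    logger class; the method names below are assumed to match it.
    """

    def __init__(self, loggers):
        self.loggers = list(loggers)

    def log_hyperparams(self, params):
        for logger in self.loggers:
            logger.log_hyperparams(params)

    def log_metrics(self, metrics, step=None):
        for logger in self.loggers:
            logger.log_metrics(metrics, step)

    def save(self):
        for logger in self.loggers:
            logger.save()

    def finalize(self, status):
        for logger in self.loggers:
            logger.finalize(status)


# Minimal in-memory logger (hypothetical, for demonstration only)
# standing in for e.g. a TensorBoard or MLflow logger.
class ListLogger:
    def __init__(self):
        self.metrics = []

    def log_hyperparams(self, params):
        self.params = params

    def log_metrics(self, metrics, step=None):
        self.metrics.append((metrics, step))

    def save(self):
        pass

    def finalize(self, status):
        self.status = status


a, b = ListLogger(), ListLogger()
combined = CompositeLogger([a, b])
combined.log_metrics({"loss": 0.5}, step=1)
print(a.metrics == b.metrics == [({"loss": 0.5}, 1)])
```

Because the composite exposes the same interface as a single logger, it could be passed to `Trainer` wherever one logger is accepted today, which is what makes this alternative attractive: no Trainer changes are needed.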

Would it be helpful to add an example of this to the docs?
