Since **kwargs was removed from Trainer's __init__ in #1820, initializing Trainer objects fails if your parser contains any non-Trainer-specific arguments.
If this is the expected behavior, the docs should be updated to reflect the workaround I mention below, as several of the documented examples would currently fail.
from argparse import ArgumentParser
from pytorch_lightning import Trainer

# Works: the parser only contains Trainer-specific arguments
parser = ArgumentParser()
parser = Trainer.add_argparse_args(parser)
args = parser.parse_args([])
trainer = Trainer.from_argparse_args(args)
# Fails: the extra argument is forwarded to Trainer.__init__, which no longer accepts **kwargs
parser = ArgumentParser()
parser = Trainer.add_argparse_args(parser)
parser.add_argument('--script_specific_arg', type=str, default='hope this works')
args = parser.parse_args([])
trainer = Trainer.from_argparse_args(args)
from pytorch_lightning import LightningModule


class SomeModel(LightningModule):
    def __init__(self, hparams):
        super().__init__()
        self.hparams = hparams

    @staticmethod
    def add_model_specific_args(parent_parser):
        parser = ArgumentParser(parents=[parent_parser], add_help=False)
        parser.add_argument('--some_argument', type=int, default=128)
        return parser


# Fails for the same reason: the model-specific argument reaches Trainer.__init__
parser = ArgumentParser()
parser = Trainer.add_argparse_args(parser)
parser = SomeModel.add_model_specific_args(parser)
args = parser.parse_args([])
trainer = Trainer.from_argparse_args(args)
# Workaround: parse the Trainer args with parse_known_args before adding script-specific args
parser = ArgumentParser()
parser = Trainer.add_argparse_args(parser)

# Grab only the trainer args and init it right away
temp_args, _ = parser.parse_known_args([])
trainer = Trainer.from_argparse_args(temp_args)

parser.add_argument('--script_specific_arg', type=str, default='hope this works')
args = parser.parse_args([])
Trainer.from_argparse_args should ignore unknown kwargs.
@awaelchli let's add back kwargs? why did we remove them?
We removed it because
Trainer(checkpont_calbuck=False)
would be accepted even though it's misspelled. This is bad, so we had to remove **kwargs from Trainer.
The issue described here is not with Trainer, it is with from_argparse_args: it should pick only the valid args and ignore the others.
For example, in from_argparse_args we could inspect the Trainer signature and get the list of accepted args. If that's possible, it would solve the problem.
Here's how we could do it:
from pytorch_lightning import Trainer
variables = Trainer.__init__.__code__.co_varnames
print(variables)
This gives us the names of the arguments Trainer.__init__ accepts.
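A minimal sketch of the filtering idea, assuming inspect.signature is used instead of co_varnames (the helper name trainer_kwargs_from_namespace is hypothetical, not part of Lightning):

import inspect
from argparse import ArgumentParser, Namespace

from pytorch_lightning import Trainer


def trainer_kwargs_from_namespace(args: Namespace) -> dict:
    # Hypothetical helper: keep only the Namespace entries that Trainer.__init__ accepts
    accepted = set(inspect.signature(Trainer.__init__).parameters) - {'self'}
    return {name: value for name, value in vars(args).items() if name in accepted}


parser = ArgumentParser()
parser = Trainer.add_argparse_args(parser)
parser.add_argument('--script_specific_arg', type=str, default='hope this works')
args = parser.parse_args([])

# script_specific_arg is dropped before anything reaches Trainer.__init__
trainer = Trainer(**trainer_kwargs_from_namespace(args))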
@awaelchli Yeah, I figured there would be a way to just filter down to the valid args. It'll mess up the error you had when passing in bad args... but perhaps that's for the best?
Thank you!
It'll mess up the error you had when passing in bad args... but perhaps that's for the best?
Yes, I'm fine with losing that error when the argparser contains a misspelled arg; most users will probably construct their parser with Trainer.add_argparse_args anyway.
In the PR I made it so that the manually passed-in override args are forced to be valid, since they don't come from the CLI, e.g.
# no error even if argparse_args contains unknown args
Trainer.from_argparse_args(argparse_args, checkpoint_callback=False)
# error
Trainer.from_argparse_args(argparse_args, chckpint_calback=False)
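A rough sketch of that behaviour, as my own illustration rather than the actual PR code (the function name from_argparse_args_sketch is made up):

import inspect
from argparse import Namespace

from pytorch_lightning import Trainer


def from_argparse_args_sketch(args: Namespace, **kwargs):
    # Illustrative only: drop unknown Namespace entries, but validate explicit overrides
    accepted = set(inspect.signature(Trainer.__init__).parameters) - {'self'}

    unknown = set(kwargs) - accepted
    if unknown:
        # explicit overrides are typed by hand, so a typo here should fail loudly
        raise TypeError(f"Unexpected keyword arguments: {sorted(unknown)}")

    trainer_kwargs = {k: v for k, v in vars(args).items() if k in accepted}
    trainer_kwargs.update(kwargs)
    return Trainer(**trainer_kwargs)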
What do you think? Is it reasonable?
Ah, I see what you were getting at. I'll try to leave a review today if I have time. On mobile now, but at a quick glance the code looks good :).