The progress bar is very slick, but a big problem with it is that it overwrites itself. For example, if you are at epoch 10, you cannot see what the validation and training losses were for epoch 9. Could the progress bar perhaps be made to work more like in Keras, so that you can see the losses or accuracies of previous epochs?
Hi! Thanks for your contribution, great first issue!
Yes, I was thinking of adding a flag to the progressbar callback that could enable that, but holding back on it because we were talking about switching to fastprogress.
If you need it asap, simply try to inherit from the ProgressBar callback, set `leave=True` on the tqdm bar, and pass your new callback to the trainer, like so: `Trainer(callbacks=[myprogressbar])`
I tried this:
```python
class MyProgressBar(ProgressBar):
    def init_validation_tqdm(self):
        bar = super().init_validation_tqdm()
        bar.leave = True
        return bar

    def init_train_tqdm(self):
        bar = super().init_train_tqdm()
        bar.leave = True
        return bar

    def init_test_tqdm(self):
        bar = super().init_test_tqdm()
        bar.leave = True
        return bar
```
It seems to make a newline for the validation progress bar, but I can't see the values for training and validation losses and accuracies. I don't really need a progress bar, just those values for each epoch.
Okay, then you could try accessing `trainer.progress_bar_metrics` (a dict) and printing it yourself at the end of each epoch?
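A minimal, self-contained sketch of that idea. Note the names here (`PrintMetricsCallback`, `format_metrics`, `FakeTrainer`) are illustrative, not part of Lightning; `FakeTrainer` only stands in for the real `Trainer` so the snippet runs on its own. In real code you would subclass Lightning's `Callback` and read `trainer.progress_bar_metrics` in an epoch-end hook (the exact hook name varies across Lightning versions):

```python
# Sketch: print the progress-bar metrics once per epoch so the values
# survive after the tqdm bar overwrites itself.

def format_metrics(metrics):
    """Render a metrics dict as one log line, e.g. 'loss=0.1234 acc=0.9000'."""
    return " ".join(f"{name}={value:.4f}" for name, value in sorted(metrics.items()))

class PrintMetricsCallback:
    def on_epoch_end(self, trainer):
        # trainer.progress_bar_metrics holds the values shown in the bar
        print(format_metrics(trainer.progress_bar_metrics))

class FakeTrainer:
    """Stand-in for Lightning's Trainer, for illustration only."""
    progress_bar_metrics = {"train_loss": 0.25, "val_loss": 0.31}

PrintMetricsCallback().on_epoch_end(FakeTrainer())
```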
I'm a PyTorch Lightning newbie so I don't know how to do that, but I'll try.
I think the easiest way is to:
```python
def on_epoch_start(self):
    print('\n')
```
It will end this line and start a new tqdm line.
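To see why this works: tqdm redraws its bar in place by writing a carriage return (`\r`), so anything on that line gets overwritten; a newline at the start of each epoch "finishes" the previous line and the last bar state survives. A plain-stdlib sketch of the same mechanism (`run_epochs` is a made-up helper, not a Lightning or tqdm API):

```python
import io
import sys

def run_epochs(n_epochs, out=sys.stdout):
    """Simulate a tqdm-style bar that redraws itself with '\r'."""
    for epoch in range(n_epochs):
        if epoch > 0:
            out.write("\n")  # finish the previous epoch's line first
        for step in range(3):
            # '\r' returns to the start of the line, so the bar overwrites itself
            out.write(f"\repoch {epoch} step {step}/3")
    out.write("\n")

buf = io.StringIO()
run_epochs(2, out=buf)
# each epoch's final bar state now ends up on its own line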