Good day! I ran into a problem with a custom dataset containing only ~700 images.
I'm trying to train FCOS_x101 on 4 GPUs. At first glance, because of the low number of steps, the learning rate decay proceeds too fast, and adding the gamma parameter doesn't help much.
Is there any way to increase/decrease the number of steps for learning rate decay?
step=[16, 22])  # Increase it to [32, 44] --> [64, 88]?
total_epochs = 12  # Scale it by x2 in the same manner?
total_epochs is the total number of epochs for the whole training, and you can set arbitrary steps at which you want to decrease the lr.
Thanks a lot! No doubts about total_epochs, but regarding setting the arbitrary steps - where can we do that? At least in the config file I couldn't find this parameter.
You have already figured it out. step=[xx, yy] means decreasing the lr at epochs xx and yy respectively, and you can even add more steps.
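For reference, a minimal sketch of how this looks in an mmdetection-style config file. The warmup values below are illustrative defaults, not taken from this thread; adapt them to your own config:

```python
# lr_config fragment from an mmdetection-style config (warmup values illustrative)
lr_config = dict(
    policy='step',          # step decay schedule
    warmup='linear',        # linear warmup at the start of training
    warmup_iters=500,
    warmup_ratio=1.0 / 3,
    step=[16, 22])          # decrease lr after epochs 16 and 22; add more entries for more steps
total_epochs = 24           # for a doubled schedule, pair step=[32, 44] with total_epochs=48
```

Scaling step and total_epochs together (e.g. x2) keeps the relative position of the lr drops within training the same, which is usually what you want on a small dataset.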
Thanks a lot! Feel free to close it