Hi!
How can I find the batch size of a model in MMDetection? I did not see it in the config file.
Any answer would be appreciated!
The per-GPU batch size is set with `imgs_per_gpu=2` in the config. The total batch size is `imgs_per_gpu` multiplied by the number of GPUs.
@ZwwWayne Have you ever experimented with batch sizes of 1, 2, or 4 for object detection, and which usually works better? I don't understand why a batch size of 1 can also give very good results. Thanks a lot.
FYI:
`imgs_per_gpu` is now deprecated and has been renamed to `samples_per_gpu`; it is set in the dataset config files under `configs/_base_/datasets/`.
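To illustrate, here is a minimal sketch of how the per-GPU batch size typically appears in an MMDetection 2.x dataset config, and how the effective batch size follows from it. The file path and the `num_gpus` value are assumptions for the example, not taken from this thread.

```python
# Hypothetical excerpt from a dataset config such as
# configs/_base_/datasets/coco_detection.py (MMDetection 2.x style).
data = dict(
    samples_per_gpu=2,   # per-GPU batch size (formerly imgs_per_gpu)
    workers_per_gpu=2,   # dataloader worker processes per GPU
)

# The effective (total) batch size is samples_per_gpu * number of GPUs.
num_gpus = 8  # assumed for this example
effective_batch_size = data['samples_per_gpu'] * num_gpus
print(effective_batch_size)  # 16
```

So with the default `samples_per_gpu=2` on an 8-GPU machine, the model trains with a total batch size of 16.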