Sorry, I'm new to this.
I am not sure I understand this correctly. The PyTorch docs say: batch_size (int, optional) – how many samples per batch to load (default: 1).
I know that batch size = the number of training examples in one forward/backward pass.
What does it mean by "how many samples per batch to load"? Can you define "sample" and "batch" here for me, please?
Also, what would be the maximum value for batch_size?
Thanks
sample == example
If you have 10 samples or examples in a batch, then the batch size is 10. The maximum batch_size is limited by the memory your system has: main memory if you are training on the CPU, and GPU memory if you are using the GPU.
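To make this concrete, here is a minimal sketch using `TensorDataset` with toy data (the tensor shapes and `batch_size=4` below are just illustrative):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Toy dataset: 10 samples, each a vector of 3 features with a binary label.
features = torch.randn(10, 3)
labels = torch.randint(0, 2, (10,))
dataset = TensorDataset(features, labels)

# batch_size=4 groups the 10 samples into batches of 4, 4, and 2
# (the trailing partial batch is dropped if you pass drop_last=True).
loader = DataLoader(dataset, batch_size=4)

for x, y in loader:
    print(x.shape)  # torch.Size([4, 3]) for the full batches, torch.Size([2, 3]) for the last
```

Each iteration materializes one batch of samples in memory, which is why the practical upper bound on batch_size is the RAM (CPU) or VRAM (GPU) you have available.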
Also, please use the PyTorch discussion forum for questions like these, rather than opening an issue.