PyTorch: what exactly is batch_size in PyTorch?

Created on 30 Jul 2017 · 1 Comment · Source: pytorch/pytorch

Sorry, I'm new to this.
I am not sure if I understand this right. The PyTorch docs say: batch_size (int, optional) – how many samples per batch to load (default: 1).
I know that batch size = the number of training examples in one forward/backward pass.
What does it mean when it says "how many samples per batch to load"? Can you define "sample" and "batch" here for me, please?
Also, what would be the maximum number for batch_size?

Thanks

Most helpful comment

sample == example

If you have 10 samples or examples in a batch, then the batch size is 10. The maximum batch_size is limited by the memory your system has -- main memory in the case of the CPU, and GPU memory if you are using the GPU.
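
A minimal sketch of how batch_size behaves in a DataLoader (the toy dataset and variable names here are illustrative, not from this thread):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy dataset: 100 samples, each a 10-dimensional
# feature vector with a binary label.
features = torch.randn(100, 10)
labels = torch.randint(0, 2, (100,))
dataset = TensorDataset(features, labels)

# batch_size=10: each iteration of the loader yields one batch of 10 samples.
loader = DataLoader(dataset, batch_size=10)

for batch_features, batch_labels in loader:
    # 100 samples / 10 samples per batch = 10 batches in total.
    print(batch_features.shape)  # torch.Size([10, 10])
    print(batch_labels.shape)    # torch.Size([10])
    break
```

Here one "sample" is a single (feature, label) pair, and one "batch" is the stack of 10 samples the loader returns per iteration.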

Also, please use the PyTorch discussion forum for questions like these, rather than opening an issue.

