This will trigger the error:

```python
import torchvision.datasets as dset
import torchvision.transforms as transforms
from torch.utils.data import DataLoader

testset = dset.ImageFolder(root='data/test', transform=transform_test)
test_loader = DataLoader(testset, batch_size=8, shuffle=True, num_workers=2)
loader = iter(test_loader)
next(loader)
```
Traceback:

```
RuntimeError: Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 40, in _worker_loop
    samples = collate_fn([dataset[i] for i in batch_indices])
  File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 109, in default_collate
    return [default_collate(samples) for samples in transposed]
  File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 109, in <listcomp>
    return [default_collate(samples) for samples in transposed]
  File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 91, in default_collate
    return torch.stack(batch, 0, out=out)
  File "/home/ubuntu/anaconda3/lib/python3.5/site-packages/torch/functional.py", line 66, in stack
    return torch.cat(inputs, dim, out=out)
RuntimeError: inconsistent tensor sizes at /home/ubuntu/pytorch/torch/lib/TH/generic/THTensorMath.c:2625
```
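The failure mode itself can be sketched without torch at all. The `stack` below only mimics the size check that `torch.stack` performs inside `default_collate` (plain Python nested lists stand in for tensors; the helper names are made up for illustration):

```python
# Return the nested-list "shape", e.g. a 3x4 grid -> (3, 4).
def shape(x):
    s = []
    while isinstance(x, list):
        s.append(len(x))
        x = x[0]
    return tuple(s)

# Mimic the size check torch.stack does before concatenating a batch.
def stack(batch):
    shapes = [shape(t) for t in batch]
    if len(set(shapes)) != 1:
        raise RuntimeError("inconsistent tensor sizes")
    return batch  # the real torch.stack would also add a batch dimension

# Two "images" of identical shape (3, 4): stacking succeeds.
same = [[[0.0] * 4 for _ in range(3)] for _ in range(2)]
stack(same)

# One image came out wider, shape (3, 5) vs (3, 4): stacking fails,
# which is exactly the error default_collate surfaces.
mixed = [[[0.0] * 4 for _ in range(3)], [[0.0] * 5 for _ in range(3)]]
try:
    stack(mixed)
except RuntimeError as e:
    print(e)  # inconsistent tensor sizes
```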
Running today's PyTorch (32e666551a695cb0fee8c70474a13f3def042bb2).
Since the code itself seems trivial, the problem is probably in the surrounding context. Any ideas on where to look?
Update: I noticed that changing `transform_test` made the error go away; looking into that now.
Loading data with this transform fails:

```python
self.input_size = 224
self.transform_test = transforms.Compose([
    transforms.Scale(self.input_size),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
```
but the following transform does not cause an error:

```python
self.transform_test = transforms.Compose([
    transforms.Scale((self.input_w, self.input_h)),
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])
```
Ah, this comes down to `Scale`: given a single integer it does not scale to a square but maintains the aspect ratio, so differently sized input images produce differently sized tensors that cannot be concatenated into a batch.
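The difference between the two `Scale` calls can be sketched in plain Python (reimplemented here so it runs without torchvision; the behavior is assumed from the docs: an int resizes the *shorter* side and keeps the aspect ratio, while a `(w, h)` tuple forces an exact output size):

```python
# Hypothetical helper mimicking torchvision's Scale/Resize size logic.
def resize_size(w, h, size):
    if isinstance(size, tuple):
        return size  # exact (w, h): every image ends up the same size
    # int: shorter side becomes `size`, the other side keeps the ratio
    if w < h:
        return (size, int(size * h / w))
    return (int(size * w / h), size)

# Two test images with different aspect ratios.
imgs = [(640, 480), (480, 640)]

# Scale(224): output sizes differ, so torch.stack on the batch fails.
print([resize_size(w, h, 224) for w, h in imgs])          # [(298, 224), (224, 298)]

# Scale((224, 224)): every image becomes 224x224, batching works.
print([resize_size(w, h, (224, 224)) for w, h in imgs])   # [(224, 224), (224, 224)]
```

With a single int, the only way to get square tensors is to follow `Scale` with a crop (e.g. `CenterCrop(224)`), which is the usual ImageNet-style test pipeline.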