Hi all,
I trained a net with input size 416x416. Now I want to test with other input sizes, such as 832x832, 640x480, 500x500, or 208x208, but it does not work. The error is as follows:
layer filters size input output
0 conv 32 3 x 3 / 1 500 x 500 x 3 -> 500 x 500 x 32 0.432 BFLOPs
1 conv 64 3 x 3 / 2 500 x 500 x 32 -> 250 x 250 x 64 2.304 BFLOPs
2 conv 32 1 x 1 / 1 250 x 250 x 64 -> 250 x 250 x 32 0.256 BFLOPs
3 conv 64 3 x 3 / 1 250 x 250 x 32 -> 250 x 250 x 64 2.304 BFLOPs
4 res 1 250 x 250 x 64 -> 250 x 250 x 64
5 conv 128 3 x 3 / 2 250 x 250 x 64 -> 125 x 125 x 128 2.304 BFLOPs
6 conv 64 1 x 1 / 1 125 x 125 x 128 -> 125 x 125 x 64 0.256 BFLOPs
7 conv 128 3 x 3 / 1 125 x 125 x 64 -> 125 x 125 x 128 2.304 BFLOPs
8 res 5 125 x 125 x 128 -> 125 x 125 x 128
9 conv 64 1 x 1 / 1 125 x 125 x 128 -> 125 x 125 x 64 0.256 BFLOPs
10 conv 128 3 x 3 / 1 125 x 125 x 64 -> 125 x 125 x 128 2.304 BFLOPs
11 res 8 125 x 125 x 128 -> 125 x 125 x 128
12 conv 256 3 x 3 / 2 125 x 125 x 128 -> 63 x 63 x 256 2.341 BFLOPs
13 conv 128 1 x 1 / 1 63 x 63 x 256 -> 63 x 63 x 128 0.260 BFLOPs
14 conv 256 3 x 3 / 1 63 x 63 x 128 -> 63 x 63 x 256 2.341 BFLOPs
15 res 12 63 x 63 x 256 -> 63 x 63 x 256
16 conv 128 1 x 1 / 1 63 x 63 x 256 -> 63 x 63 x 128 0.260 BFLOPs
17 conv 256 3 x 3 / 1 63 x 63 x 128 -> 63 x 63 x 256 2.341 BFLOPs
18 res 15 63 x 63 x 256 -> 63 x 63 x 256
19 conv 128 1 x 1 / 1 63 x 63 x 256 -> 63 x 63 x 128 0.260 BFLOPs
20 conv 256 3 x 3 / 1 63 x 63 x 128 -> 63 x 63 x 256 2.341 BFLOPs
21 res 18 63 x 63 x 256 -> 63 x 63 x 256
22 conv 128 1 x 1 / 1 63 x 63 x 256 -> 63 x 63 x 128 0.260 BFLOPs
23 conv 256 3 x 3 / 1 63 x 63 x 128 -> 63 x 63 x 256 2.341 BFLOPs
24 res 21 63 x 63 x 256 -> 63 x 63 x 256
25 conv 128 1 x 1 / 1 63 x 63 x 256 -> 63 x 63 x 128 0.260 BFLOPs
26 conv 256 3 x 3 / 1 63 x 63 x 128 -> 63 x 63 x 256 2.341 BFLOPs
27 res 24 63 x 63 x 256 -> 63 x 63 x 256
28 conv 128 1 x 1 / 1 63 x 63 x 256 -> 63 x 63 x 128 0.260 BFLOPs
29 conv 256 3 x 3 / 1 63 x 63 x 128 -> 63 x 63 x 256 2.341 BFLOPs
30 res 27 63 x 63 x 256 -> 63 x 63 x 256
31 conv 128 1 x 1 / 1 63 x 63 x 256 -> 63 x 63 x 128 0.260 BFLOPs
32 conv 256 3 x 3 / 1 63 x 63 x 128 -> 63 x 63 x 256 2.341 BFLOPs
33 res 30 63 x 63 x 256 -> 63 x 63 x 256
34 conv 128 1 x 1 / 1 63 x 63 x 256 -> 63 x 63 x 128 0.260 BFLOPs
35 conv 256 3 x 3 / 1 63 x 63 x 128 -> 63 x 63 x 256 2.341 BFLOPs
36 res 33 63 x 63 x 256 -> 63 x 63 x 256
37 conv 512 3 x 3 / 2 63 x 63 x 256 -> 32 x 32 x 512 2.416 BFLOPs
38 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
39 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
40 res 37 32 x 32 x 512 -> 32 x 32 x 512
41 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
42 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
43 res 40 32 x 32 x 512 -> 32 x 32 x 512
44 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
45 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
46 res 43 32 x 32 x 512 -> 32 x 32 x 512
47 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
48 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
49 res 46 32 x 32 x 512 -> 32 x 32 x 512
50 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
51 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
52 res 49 32 x 32 x 512 -> 32 x 32 x 512
53 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
54 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
55 res 52 32 x 32 x 512 -> 32 x 32 x 512
56 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
57 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
58 res 55 32 x 32 x 512 -> 32 x 32 x 512
59 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
60 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
61 res 58 32 x 32 x 512 -> 32 x 32 x 512
62 conv 1024 3 x 3 / 2 32 x 32 x 512 -> 16 x 16 x1024 2.416 BFLOPs
63 conv 512 1 x 1 / 1 16 x 16 x1024 -> 16 x 16 x 512 0.268 BFLOPs
64 conv 1024 3 x 3 / 1 16 x 16 x 512 -> 16 x 16 x1024 2.416 BFLOPs
65 res 62 16 x 16 x1024 -> 16 x 16 x1024
66 conv 512 1 x 1 / 1 16 x 16 x1024 -> 16 x 16 x 512 0.268 BFLOPs
67 conv 1024 3 x 3 / 1 16 x 16 x 512 -> 16 x 16 x1024 2.416 BFLOPs
68 res 65 16 x 16 x1024 -> 16 x 16 x1024
69 conv 512 1 x 1 / 1 16 x 16 x1024 -> 16 x 16 x 512 0.268 BFLOPs
70 conv 1024 3 x 3 / 1 16 x 16 x 512 -> 16 x 16 x1024 2.416 BFLOPs
71 res 68 16 x 16 x1024 -> 16 x 16 x1024
72 conv 512 1 x 1 / 1 16 x 16 x1024 -> 16 x 16 x 512 0.268 BFLOPs
73 conv 1024 3 x 3 / 1 16 x 16 x 512 -> 16 x 16 x1024 2.416 BFLOPs
74 res 71 16 x 16 x1024 -> 16 x 16 x1024
75 conv 512 1 x 1 / 1 16 x 16 x1024 -> 16 x 16 x 512 0.268 BFLOPs
76 conv 1024 3 x 3 / 1 16 x 16 x 512 -> 16 x 16 x1024 2.416 BFLOPs
77 conv 512 1 x 1 / 1 16 x 16 x1024 -> 16 x 16 x 512 0.268 BFLOPs
78 conv 1024 3 x 3 / 1 16 x 16 x 512 -> 16 x 16 x1024 2.416 BFLOPs
79 conv 512 1 x 1 / 1 16 x 16 x1024 -> 16 x 16 x 512 0.268 BFLOPs
80 conv 1024 3 x 3 / 1 16 x 16 x 512 -> 16 x 16 x1024 2.416 BFLOPs
81 conv 18 1 x 1 / 1 16 x 16 x1024 -> 16 x 16 x 18 0.009 BFLOPs
82 yolo
83 route 79
84 conv 256 1 x 1 / 1 16 x 16 x 512 -> 16 x 16 x 256 0.067 BFLOPs
85 upsample 2x 16 x 16 x 256 -> 32 x 32 x 256
86 route 85 61
87 conv 256 1 x 1 / 1 32 x 32 x 768 -> 32 x 32 x 256 0.403 BFLOPs
88 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
89 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
90 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
91 conv 256 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 256 0.268 BFLOPs
92 conv 512 3 x 3 / 1 32 x 32 x 256 -> 32 x 32 x 512 2.416 BFLOPs
93 conv 18 1 x 1 / 1 32 x 32 x 512 -> 32 x 32 x 18 0.019 BFLOPs
94 yolo
95 route 91
96 conv 128 1 x 1 / 1 32 x 32 x 256 -> 32 x 32 x 128 0.067 BFLOPs
97 upsample 2x 32 x 32 x 128 -> 64 x 64 x 128
98 route 97 36
99 Layer before convolutional layer must output image.: File exists
darknet: ./src/utils.c:256: error: Assertion `0' failed.
Aborted (core dumped)
What is the problem? Can the input size be changed to another size at test time, as in Caffe?
Any help would be appreciated!
We have solved the problem. The input size must be compatible with the route layers: the network downsamples by a factor of 32, so the width and height should be multiples of 32. You can set the input size to 480, 512, and so on.
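The failure is visible in the log above: layer 97 upsamples to 64x64, but route 98 tries to concatenate it with layer 36, which is 63x63 for a 500x500 input. A minimal sketch (not darknet code; it just replays darknet's output-size arithmetic for the five stride-2 3x3 convolutions with pad 1) shows why 500 breaks and multiples of 32 do not:

```python
# Trace YOLOv3 spatial sizes to see why a 500x500 input fails at route 98.
# A stride-2 3x3 conv with pad 1 computes out = (w + 2*1 - 3) // 2 + 1,
# which rounds UP on odd sizes, so the upsample path no longer matches.

def conv_s2(w):
    # 3x3 conv, stride 2, pad 1
    return (w - 1) // 2 + 1

def route_sizes(w):
    """Return (stride-8 feature size, twice-upsampled stride-32 size).

    These are the two maps that route 98 concatenates; they must match.
    """
    sizes = []
    s = w
    for _ in range(5):          # five stride-2 convs: /32 overall
        s = conv_s2(s)
        sizes.append(s)
    upsampled = sizes[4] * 2 * 2   # layers 85 and 97 each upsample 2x
    return sizes[2], upsampled

print(route_sizes(500))  # (63, 64) -> mismatch, route fails
print(route_sizes(512))  # (64, 64) -> OK
print(route_sizes(480))  # (60, 60) -> OK
```

When the width and height are multiples of 32, every downsample divides evenly and both numbers agree, so the route layer can concatenate its inputs.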