Mask_rcnn: Training cause ERROR:root:Error processing image

Created on 4 Jun 2018 · 19 comments · Source: matterport/Mask_RCNN

I am training on my own dataset and used the VGG Image Annotator (VIA) to segment the images, but it raises an error when I run:

model.train(dataset_train, dataset_val, 
            learning_rate=config.LEARNING_RATE, 
            epochs=1, 
            layers='heads')

Here's the traceback:

Starting at epoch 0. LR=0.001

Checkpoint Path: C:\Users\Instructor\Desktop\Mask_RCNN\samples\landing\logs\landing20180604T1145\mask_rcnn_landing_{epoch:04d}.h5
Selecting layers to train
fpn_c5p5               (Conv2D)
fpn_c4p4               (Conv2D)
fpn_c3p3               (Conv2D)
fpn_c2p2               (Conv2D)
fpn_p5                 (Conv2D)
fpn_p2                 (Conv2D)
fpn_p3                 (Conv2D)
fpn_p4                 (Conv2D)
In model:  rpn_model
    rpn_conv_shared        (Conv2D)
    rpn_class_raw          (Conv2D)
    rpn_bbox_pred          (Conv2D)
mrcnn_mask_conv1       (TimeDistributed)
mrcnn_mask_bn1         (TimeDistributed)
mrcnn_mask_conv2       (TimeDistributed)
mrcnn_mask_bn2         (TimeDistributed)
mrcnn_class_conv1      (TimeDistributed)
mrcnn_class_bn1        (TimeDistributed)
mrcnn_mask_conv3       (TimeDistributed)
mrcnn_mask_bn3         (TimeDistributed)
mrcnn_class_conv2      (TimeDistributed)
mrcnn_class_bn2        (TimeDistributed)
mrcnn_mask_conv4       (TimeDistributed)
mrcnn_mask_bn4         (TimeDistributed)
mrcnn_bbox_fc          (TimeDistributed)
mrcnn_mask_deconv      (TimeDistributed)
mrcnn_class_logits     (TimeDistributed)
mrcnn_mask             (TimeDistributed)
c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\tensorflow\python\ops\gradients_impl.py:100: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.
  "Converting sparse IndexedSlices to a dense Tensor of unknown shape. "
Epoch 1/1
  7/100 [=>............................] - ETA: 9:47 - loss: 2.7469 - rpn_class_loss: 0.1161 - rpn_bbox_loss: 1.0056 - mrcnn_class_loss: 1.1608 - mrcnn_bbox_loss: 0.2483 - mrcnn_mask_loss: 0.2161     
ERROR:root:Error processing image {'path': 'C:\\Users\\Instructor\\Desktop\\Mask_RCNN\\samples\\landing\\dataset\\train\\flying 12.jpg', 'polygons': [{'name': 'polygon', 'all_points_x': [2, 102, 175, 214, 238, 275, 328, 391, 413, 435, 489, 519, 556, 551, 538, 557, 599, 629, 643, 652, 658, 685, 710, 711, 716, 721, 733, 759, 778, 797, 804, 808, 813, 840, 852, 864, 856, 856, 876, 883, 882, 870, 856, 832, 813, 788, 758, 746, 718, 705, 701, 710, 725, 715, 726, 734, 716, 713, 713, 686, 671, 655, 636, 630, 610, 607, 598, 614, 614, 610, 613, 1, 2], 'all_points_y': [660, 668, 677, 670, 644, 613, 610, 598, 587, 602, 588, 556, 524, 503, 473, 447, 456, 446, 428, 399, 368, 353, 342, 322, 294, 271, 258, 243, 241, 235, 208, 190, 173, 167, 181, 195, 207, 224, 230, 254, 296, 316, 347, 338, 337, 325, 325, 340, 368, 392, 415, 448, 456, 467, 485, 498, 511, 533, 548, 570, 569, 573, 588, 608, 615, 636, 646, 663, 691, 708, 718, 721, 660]}, {'name': 'polygon', 'all_points_x': [607, 612, 613, 597, 606, 609, 628, 634, 650, 667, 682, 714, 714, 713, 735, 714, 723, 711, 700, 705, 718, 755, 781, 854, 882, 882, 875, 853, 852, 863, 837, 810, 796, 752, 721, 707, 655, 642, 627, 595, 555, 536, 552, 486, 431, 411, 382, 325, 269, 211, 170, -1, -1, 1279, 1277, 607], 'all_points_y': [720, 693, 663, 644, 638, 617, 608, 586, 575, 566, 569, 548, 531, 511, 495, 464, 456, 447, 414, 388, 368, 327, 324, 346, 295, 254, 232, 222, 207, 196, 166, 171, 234, 244, 268, 341, 367, 428, 444, 458, 446, 471, 523, 588, 602, 587, 600, 609, 612, 669, 676, 659, 2, 2, 720, 720]}], 'width': 1280, 'id': 'flying 12.jpg', 'source': 'landing', 'height': 720}
Traceback (most recent call last):
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1695, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1210, in load_image_gt
    mask, class_ids = dataset.load_mask(image_id)
  File "<ipython-input-4-071b0f25d21e>", line 66, in load_mask
    mask[rr, cc, i] = 1
IndexError: index 720 is out of bounds for axis 0 with size 720
 15/100 [===>..........................] - ETA: 8:15 - loss: 2.1900 - rpn_class_loss: 0.1246 - rpn_bbox_loss: 1.0832 - mrcnn_class_loss: 0.6205 - mrcnn_bbox_loss: 0.1760 - mrcnn_mask_loss: 0.1857
ERROR:root:Error processing image {'path': 'C:\\Users\\Instructor\\Desktop\\Mask_RCNN\\samples\\landing\\dataset\\train\\flying 22.jpg', 'polygons': [{'name': 'polygon', 'all_points_x': [1086, 1279, 1278, 198, 279, 422, 419, 381, 372, 144, 153, 281, 196, 0, 1, 815, 805, 849, 900, 891, 932, 994, 1086], 'all_points_y': [716, 718, 2, 1, 112, 109, 128, 129, 149, 151, 111, 114, 1, 1, 719, 718, 665, 641, 593, 557, 576, 639, 716]}, {'name': 'polygon', 'all_points_x': [154, 143, 371, 379, 418, 419, 154], 'all_points_y': [113, 147, 149, 129, 128, 111, 113]}, {'name': 'polygon', 'all_points_x': [811, 808, 842, 874, 902, 893, 934, 987, 1086, 809, 811], 'all_points_y': [717, 663, 644, 616, 594, 555, 578, 635, 717, 722, 717]}], 'width': 1280, 'id': 'flying 22.jpg', 'source': 'landing', 'height': 720}
Traceback (most recent call last):
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1695, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1210, in load_image_gt
    mask, class_ids = dataset.load_mask(image_id)
  File "<ipython-input-4-071b0f25d21e>", line 66, in load_mask
    mask[rr, cc, i] = 1
IndexError: index 720 is out of bounds for axis 0 with size 720
 27/100 [=======>......................] - ETA: 6:36 - loss: 2.2996 - rpn_class_loss: 0.1214 - rpn_bbox_loss: 1.0273 - mrcnn_class_loss: 0.4554 - mrcnn_bbox_loss: 0.3613 - mrcnn_mask_loss: 0.3342
ERROR:root:Error processing image {'path': 'C:\\Users\\Instructor\\Desktop\\Mask_RCNN\\samples\\landing\\dataset\\train\\flying 06.jpg', 'polygons': [{'name': 'polygon', 'all_points_x': [381, 430, 423, 412, 355, 305, 283, 290, 320, 364, 408, 464, 482, 489, 520, 584, 623, 682, 734, 783, 818, 818, 1014, 1039, 1027, 1035, 1057, 1078, 1081, 1091, 1078, 1050, 1012, 820, 812, 828, 815, 842, 848, 860, 894, 889, 861, 852, 881, 902, 1003, 1281, 1275, 1, 3, 381], 'all_points_y': [718, 684, 670, 661, 692, 662, 635, 592, 555, 522, 505, 501, 467, 442, 433, 447, 428, 430, 426, 403, 417, 417, 282, 252, 223, 205, 208, 204, 240, 264, 311, 318, 283, 419, 441, 479, 512, 555, 553, 585, 606, 627, 634, 661, 678, 669, 721, 719, 1, 3, 719, 718]}, {'name': 'polygon', 'all_points_x': [383, 434, 423, 410, 355, 302, 281, 290, 319, 361, 406, 465, 480, 488, 517, 555, 582, 619, 680, 737, 772, 785, 819, 810, 822, 827, 810, 823, 836, 849, 857, 882, 891, 887, 858, 852, 883, 901, 1003, 619, 598, 578, 560, 533, 522, 507, 494, 490, 383], 'all_points_y': [716, 682, 667, 660, 689, 656, 630, 591, 553, 522, 507, 501, 471, 440, 435, 444, 445, 430, 431, 427, 409, 402, 417, 436, 461, 479, 513, 533, 553, 553, 585, 600, 603, 626, 632, 659, 677, 668, 719, 716, 688, 715, 693, 688, 702, 694, 708, 719, 716]}, {'name': 'polygon', 'all_points_x': [1021, 1014, 1035, 1050, 1074, 1082, 1086, 1078, 1079, 1077, 1056, 1035, 1026, 1037, 1030, 1021], 'all_points_y': [270, 284, 302, 314, 306, 283, 260, 240, 216, 203, 206, 203, 217, 245, 264, 270]}], 'width': 1280, 'id': 'flying 06.jpg', 'source': 'landing', 'height': 720}
Traceback (most recent call last):
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1695, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1210, in load_image_gt
    mask, class_ids = dataset.load_mask(image_id)
  File "<ipython-input-4-071b0f25d21e>", line 66, in load_mask
    mask[rr, cc, i] = 1
IndexError: index 720 is out of bounds for axis 0 with size 720
 45/100 [============>.................] - ETA: 4:57 - loss: 2.3798 - rpn_class_loss: 0.1041 - rpn_bbox_loss: 0.9614 - mrcnn_class_loss: 0.3517 - mrcnn_bbox_loss: 0.5000 - mrcnn_mask_loss: 0.4627
ERROR:root:Error processing image {'path': 'C:\\Users\\Instructor\\Desktop\\Mask_RCNN\\samples\\landing\\dataset\\train\\flying 05.jpg', 'polygons': [{'name': 'polygon', 'all_points_x': [307, 421, 423, 414, 314, 298, 291, 293, 287, 280, 288, 280, 270, 276, 281, 307, 340, 381, 418, 467, 476, 473, 549, 607, 690, 705, 731, 788, 795, 808, 802, 808, 824, 816, 806, 817, 846, 847, 853, 875, 889, 894, 888, 893, 889, 855, 929, 1023, 1061, 1016, 1282, 1281, 4, 3, 307], 'all_points_y': [718, 660, 645, 636, 677, 653, 624, 641, 623, 614, 607, 589, 525, 504, 468, 441, 424, 433, 469, 454, 426, 407, 393, 383, 402, 397, 412, 392, 390, 401, 418, 419, 437, 455, 479, 509, 522, 540, 559, 577, 582, 585, 597, 597, 602, 624, 653, 698, 717, 715, 717, -1, -1, 718, 718]}, {'name': 'polygon', 'all_points_x': [291, 314, 407, 411, 421, 421, 422, 365, 330, 306, 478, 472, 478, 485, 485, 484, 483, 484, 488, 492, 496, 506, 520, 526, 533, 549, 558, 568, 576, 588, 602, 603, 668, 737, 775, 856, 1063, 850, 886, 882, 892, 891, 853, 849, 846, 830, 817, 810, 805, 818, 821, 809, 809, 801, 809, 801, 795, 786, 774, 735, 708, 690, 606, 559, 475, 479, 474, 464, 416, 377, 341, 308, 279, 274, 270, 270, 276, 278, 278, 289, 277, 290, 291], 'all_points_y': [640, 672, 640, 634, 642, 650, 658, 690, 706, 720, 719, 706, 704, 704, 698, 694, 688, 680, 675, 668, 660, 660, 668, 656, 650, 654, 668, 676, 690, 669, 678, 692, 699, 686, 680, 719, 716, 623, 602, 609, 576, 591, 560, 541, 523, 518, 511, 493, 477, 453, 438, 417, 415, 418, 403, 395, 391, 394, 396, 412, 398, 401, 385, 394, 406, 423, 441, 458, 471, 432, 424, 439, 472, 504, 521, 534, 592, 595, 595, 606, 616, 626, 640]}, {'name': 'polygon', 'all_points_x': [586, 576, 567, 553, 546, 534, 526, 521, 507, 498, 486, 487, 477, 476, 480, 586], 'all_points_y': [717, 690, 679, 667, 656, 654, 656, 670, 662, 662, 678, 699, 705, 704, 719, 717]}], 'width': 1280, 'id': 'flying 05.jpg', 'source': 'landing', 'height': 720}
Traceback (most recent call last):
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1695, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1210, in load_image_gt
    mask, class_ids = dataset.load_mask(image_id)
  File "<ipython-input-4-071b0f25d21e>", line 66, in load_mask
    mask[rr, cc, i] = 1
IndexError: index 1280 is out of bounds for axis 1 with size 1280
 71/100 [====================>.........] - ETA: 2:37 - loss: 2.2839 - rpn_class_loss: 0.0836 - rpn_bbox_loss: 0.9078 - mrcnn_class_loss: 0.2698 - mrcnn_bbox_loss: 0.5224 - mrcnn_mask_loss: 0.5003
ERROR:root:Error processing image {'path': 'C:\\Users\\Instructor\\Desktop\\Mask_RCNN\\samples\\landing\\dataset\\train\\flying 07.jpg', 'polygons': [{'name': 'polygon', 'all_points_x': [380, 429, 422, 411, 354, 304, 282, 293, 324, 367, 405, 467, 482, 491, 516, 586, 626, 682, 741, 786, 817, 817, 1013, 1038, 1026, 1034, 1056, 1077, 1080, 1090, 1077, 1049, 1011, 819, 811, 827, 814, 841, 847, 859, 885, 888, 860, 851, 880, 901, 1002, 1280, 1276, 4, 2, 380], 'all_points_y': [718, 684, 670, 661, 692, 662, 635, 595, 565, 530, 513, 510, 478, 447, 441, 454, 436, 435, 435, 409, 417, 417, 282, 252, 223, 205, 208, 204, 240, 264, 311, 318, 283, 419, 441, 479, 512, 555, 553, 585, 607, 627, 634, 661, 678, 669, 721, 719, 3, 5, 719, 718]}, {'name': 'polygon', 'all_points_x': [386, 437, 426, 408, 357, 305, 284, 293, 322, 364, 409, 468, 483, 491, 520, 558, 585, 622, 683, 740, 775, 788, 822, 813, 825, 830, 813, 826, 839, 852, 860, 885, 884, 890, 861, 855, 886, 904, 1006, 622, 601, 581, 563, 536, 525, 510, 497, 493, 386], 'all_points_y': [724, 690, 675, 664, 693, 664, 638, 599, 561, 530, 515, 509, 479, 448, 443, 452, 453, 438, 439, 435, 417, 410, 425, 444, 469, 487, 521, 541, 561, 561, 593, 608, 613, 634, 640, 667, 685, 676, 727, 724, 696, 723, 701, 696, 710, 702, 716, 727, 724]}, {'name': 'polygon', 'all_points_x': [1021, 1014, 1035, 1050, 1074, 1082, 1086, 1078, 1079, 1077, 1056, 1035, 1026, 1037, 1030, 1021], 'all_points_y': [270, 284, 302, 314, 306, 283, 260, 240, 216, 203, 206, 203, 217, 245, 264, 270]}], 'width': 1280, 'id': 'flying 07.jpg', 'source': 'landing', 'height': 720}
Traceback (most recent call last):
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1695, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py", line 1210, in load_image_gt
    mask, class_ids = dataset.load_mask(image_id)
  File "<ipython-input-4-071b0f25d21e>", line 66, in load_mask
    mask[rr, cc, i] = 1
IndexError: index 720 is out of bounds for axis 0 with size 720
IndexError                                Traceback (most recent call last)
<ipython-input-11-83fb3ae74319> in <module>()
      6             learning_rate=config.LEARNING_RATE,
      7             epochs=1,
----> 8             layers='heads')

c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py in train(self, train_dataset, val_dataset, learning_rate, epochs, layers, augmentation)
   2326             max_queue_size=100,
   2327             workers=workers,
-> 2328             use_multiprocessing=True,
   2329         )
   2330         self.epoch = max(self.epoch, epochs)

c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\keras\legacy\interfaces.py in wrapper(*args, **kwargs)
     89                 warnings.warn('Update your `' + object_name +
     90                               '` call to the Keras 2 API: ' + signature, stacklevel=2)
---> 91             return func(*args, **kwargs)
     92         wrapper._original_function = func
     93         return wrapper

c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\keras\engine\training.py in fit_generator(self, generator, steps_per_epoch, epochs, verbose, callbacks, validation_data, validation_steps, class_weight, max_queue_size, workers, use_multiprocessing, shuffle, initial_epoch)
   2192                 batch_index = 0
   2193                 while steps_done < steps_per_epoch:
-> 2194                     generator_output = next(output_generator)
   2195 
   2196                     if not hasattr(generator_output, '__len__'):

c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py in data_generator(dataset, config, shuffle, augment, augmentation, random_rois, batch_size, detection_targets)
   1693                 load_image_gt(dataset, config, image_id, augment=augment,
   1694                               augmentation=augmentation,
-> 1695                               use_mini_mask=config.USE_MINI_MASK)
   1696 
   1697             # Skip images that have no instances. This can happen in cases

c:\users\instructor\appdata\local\programs\python\python35\lib\site-packages\mask_rcnn-2.1-py3.5.egg\mrcnn\model.py in load_image_gt(dataset, config, image_id, augment, augmentation, use_mini_mask)
   1208     # Load image and mask
   1209     image = dataset.load_image(image_id)
-> 1210     mask, class_ids = dataset.load_mask(image_id)
   1211     original_shape = image.shape
   1212     image, window, scale, padding, crop = utils.resize_image(

<ipython-input-4-071b0f25d21e> in load_mask(self, image_id)
     64             # Get indexes of pixels inside the polygon and set them to 1
     65             rr, cc = skimage.draw.polygon(p['all_points_y'], p['all_points_x'])
---> 66             mask[rr, cc, i] = 1
     67 
     68         # Return mask, and array of class IDs of each instance. Since we have

IndexError: index 720 is out of bounds for axis 0 with size 720

Most helpful comment

Just in case anyone who's getting an IndexError stumbles upon this issue:

As @zungam stated, it's caused by polygon points outside the image boundaries. Fixing the masks manually is not always practical. Ideally the polygons should not overflow the image dimensions, but the exception can also be handled in code with a bit of overhead, which is of little concern when training on a huge dataset. The following worked for me while training on a custom dataset where fixing the annotations manually was not an option:

  • print statements are added to figure out the problem (comment these out later)
  • the fix (it overwrites the index arrays in place): rr[rr > mask.shape[0]-1] = mask.shape[0]-1 and cc[cc > mask.shape[1]-1] = mask.shape[1]-1
  • here's what the load_mask function looks like:
    def load_mask(self, image_id):
        """Generate instance masks for an image.
       Returns:
        masks: A bool array of shape [height, width, instance count] with
            one mask per instance.
        class_ids: a 1D array of class IDs of the instance masks.
        """
        # If not a road dataset image, delegate to parent class.
        image_info = self.image_info[image_id]
        if image_info["source"] != "road":
            return super(self.__class__, self).load_mask(image_id)

        # Convert polygons to a bitmap mask of shape
        # [height, width, instance_count]
        info = self.image_info[image_id]
        mask = np.zeros([info["height"], info["width"], len(info["polygons"])],
                        dtype=np.uint8)
        for i, p in enumerate(info["polygons"]):
            # Get indexes of pixels inside the polygon and set them to 1
            rr, cc = skimage.draw.polygon(p['all_points_y'], p['all_points_x'])
            print("mask.shape, min(mask),max(mask): {}, {},{}".format(mask.shape, np.min(mask),np.max(mask)))
            print("rr.shape, min(rr),max(rr): {}, {},{}".format(rr.shape, np.min(rr),np.max(rr)))
            print("cc.shape, min(cc),max(cc): {}, {},{}".format(cc.shape, np.min(cc),np.max(cc)))

            ## Note that this modifies the existing array arr, instead of creating a result array
            ## Ref: https://stackoverflow.com/questions/19666626/replace-all-elements-of-python-numpy-array-that-are-greater-than-some-value
            rr[rr > mask.shape[0]-1] = mask.shape[0]-1
            cc[cc > mask.shape[1]-1] = mask.shape[1]-1

            print("After fixing the dirt mask, new values:")        
            print("rr.shape, min(rr),max(rr): {}, {},{}".format(rr.shape, np.min(rr),np.max(rr)))
            print("cc.shape, min(cc),max(cc): {}, {},{}".format(cc.shape, np.min(cc),np.max(cc)))

            mask[rr, cc, i] = 1

        # Return mask, and array of class IDs of each instance. Since we have
        # one class ID only, we return an array of 1s
        return mask.astype(np.bool), np.ones([mask.shape[-1]], dtype=np.int32)
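As an alternative to the two boolean-mask assignments above, np.clip clamps both ends of the range in one call, which also guards against the negative coordinates (-1) visible in the VIA export earlier in this issue. A minimal sketch with hypothetical out-of-range indexes:

```python
import numpy as np

height, width = 720, 1280

# Hypothetical indexes as skimage.draw.polygon might return them for an
# annotation that strays past the border (including a -1 coordinate).
rr = np.array([-1, 718, 719, 720])
cc = np.array([0, 1, 2, 1280])

# One call per axis clamps everything into [0, dim - 1].
rr = np.clip(rr, 0, height - 1)
cc = np.clip(cc, 0, width - 1)

mask = np.zeros((height, width), dtype=np.uint8)
mask[rr, cc] = 1
print(rr.min(), rr.max(), cc.max())  # 0 719 1279
```

This trades a sliver of mask accuracy at the border for never crashing the data generator mid-epoch.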

All 19 comments

This error is inside your load_mask function, which is difficult to implement correctly. See #608

Why? Because there is no standard way to store masks. Some datasets store them as PNG, some store them as .json, and many .json formats are different. What kind of json file did you put into this?

Anyway, for your particular instance:
You use index 720 on a numpy array which only accepts indexes 0-719. It probably means that some of your input coordinates go beyond the 720-pixel image height here.
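The failure mode can be reproduced with plain numpy, independent of Mask R-CNN (a minimal sketch; the 720x1280 shape mirrors the images in this issue):

```python
import numpy as np

# A mask the size of a 1280x720 frame: valid row indexes are 0..719.
mask = np.zeros((720, 1280), dtype=np.uint8)

rows = np.array([718, 719, 720])   # 720 is one past the last valid row
cols = np.array([0, 0, 0])
try:
    mask[rows, cols] = 1
except IndexError as err:
    print(err)  # index 720 is out of bounds for axis 0 with size 720

# Clipping the indexes back into range avoids the error:
mask[np.clip(rows, 0, mask.shape[0] - 1), cols] = 1
```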

All of my images in the dataset are 1280x720. Which one should I change to make this work?
I used the json format straight from the VIA tool export.

my load_mask() looks like this:

def load_mask(self, image_id):
        """Generate instance masks for an image.
       Returns:
        masks: A bool array of shape [height, width, instance count] with
            one mask per instance.
        class_ids: a 1D array of class IDs of the instance masks.
        """
        # If not a landing dataset image, delegate to parent class.
        image_info = self.image_info[image_id]
        if image_info["source"] != "landing":
            return super(self.__class__, self).load_mask(image_id)

        # Convert polygons to a bitmap mask of shape
        # [height, width, instance_count]
        info = self.image_info[image_id]
        mask = np.zeros([info["height"], info["width"], len(info["polygons"])],
                        dtype=np.uint8)
        for i, p in enumerate(info["polygons"]):
            # Get indexes of pixels inside the polygon and set them to 1
            rr, cc = skimage.draw.polygon(p['all_points_y'], p['all_points_x'])
            mask[rr, cc, i] = 1

        # Return mask, and array of class IDs of each instance. Since we have
        # one class ID only, we return an array of 1s
        return mask, np.array(info['class_ids'], dtype=np.int32) # np.ones([mask.shape[-1]], dtype=np.int32)

Print out rr and cc to see why they are producing points outside your image. It could be that the polygon points are outside the image, or that the mask variable isn't big enough. It's likely that you're doing something wrong with info["height"] and info["width"]. Perhaps you have switched the two somewhere.

I had the same issue. It turned out that while annotating my dataset with the VGG annotation tool, some polygon points were marked outside the image. In your .json file, check that no y value reaches 720 (in your particular case). The error message shows the name of the image that generated the error.
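A quick way to find such annotations before training is to scan the parsed polygons (the same list-of-dicts structure shown in the error logs above) for vertices outside the image. A sketch; the helper name is illustrative:

```python
def out_of_bounds_points(polygons, width, height):
    """Return (polygon_index, x, y) for every vertex outside the image.

    `polygons` follows the VIA-derived structure in this issue's logs:
    a list of dicts with 'all_points_x' and 'all_points_y' lists.
    """
    bad = []
    for i, p in enumerate(polygons):
        for x, y in zip(p['all_points_x'], p['all_points_y']):
            if not (0 <= x < width and 0 <= y < height):
                bad.append((i, x, y))
    return bad

# Example with one vertex at (1280, 720), just past a 1280x720 image:
polygons = [{'name': 'polygon',
             'all_points_x': [2, 102, 1280],
             'all_points_y': [660, 668, 720]}]
print(out_of_bounds_points(polygons, 1280, 720))  # [(0, 1280, 720)]
```

Running this over every image in the VIA export pinpoints exactly which annotations need adjusting.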

Thanks @zungam and @engrchrishenry. It turned out that some of my polygon points were outside the image size; adjusting them solved the problem.

Hello, I encountered a similar error in image segmentation. Can you help me? Thank you.
My errors are as follows:

fpn_c5p5 (Conv2D)
fpn_c4p4 (Conv2D)
fpn_c3p3 (Conv2D)
fpn_c2p2 (Conv2D)
fpn_p5 (Conv2D)
fpn_p2 (Conv2D)
fpn_p3 (Conv2D)
fpn_p4 (Conv2D)
('In model: ', 'rpn_model')
rpn_conv_shared (Conv2D)
rpn_class_raw (Conv2D)
rpn_bbox_pred (Conv2D)
mrcnn_mask_conv1 (TimeDistributed)
mrcnn_mask_bn1 (TimeDistributed)
mrcnn_mask_conv2 (TimeDistributed)
mrcnn_mask_bn2 (TimeDistributed)
mrcnn_class_conv1 (TimeDistributed)
mrcnn_class_bn1 (TimeDistributed)
mrcnn_mask_conv3 (TimeDistributed)
mrcnn_mask_bn3 (TimeDistributed)
mrcnn_class_conv2 (TimeDistributed)
mrcnn_class_bn2 (TimeDistributed)
mrcnn_mask_conv4 (TimeDistributed)
mrcnn_mask_bn4 (TimeDistributed)
mrcnn_bbox_fc (TimeDistributed)
mrcnn_mask_deconv (TimeDistributed)
mrcnn_class_logits (TimeDistributed)
mrcnn_mask (TimeDistributed)
Epoch 1/10
ERROR:root:Error processing image {'source': 'shapes', 'yaml_path': '/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/labelme_json/1_json/info.yaml', 'width': 500, 'path': '/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/pic/1.jpg', 'mask_path': '/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/cv2_mask/1.png', 'id': 5, 'height': 330}
Traceback (most recent call last):
  File "/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/mrcnn/model.py", line 1710, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/mrcnn/model.py", line 1220, in load_image_gt
    mode=config.IMAGE_RESIZE_MODE)
  File "/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/mrcnn/utils.py", line 445, in resize_image
    preserve_range=True)
  File "/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/mrcnn/utils.py", line 889, in resize
    preserve_range=preserve_range)
  File "/Users/madesheng/anaconda2/lib/python2.7/site-packages/skimage/transform/_warps.py", line 90, in resize
    row_scale = float(orig_rows) / rows
ZeroDivisionError: float division by zero
ERROR:root:Error processing image {'source': 'shapes', 'yaml_path': '/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/labelme_json/5_json/info.yaml', 'width': 500, 'path': '/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/pic/5.jpg', 'mask_path': '/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/cv2_mask/5.png', 'id': 1, 'height': 332}
Traceback (most recent call last):
  File "/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/mrcnn/model.py", line 1710, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/mrcnn/model.py", line 1220, in load_image_gt
    mode=config.IMAGE_RESIZE_MODE)
  File "/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/mrcnn/utils.py", line 445, in resize_image
    preserve_range=True)
  File "/Users/madesheng/Jobs/Codes/Python/TensorFlow/CNN/ImageInstanceSegmentation/OwnMaskRCNN/own_dataset/mrcnn/utils.py", line 889, in resize
    preserve_range=preserve_range)
  File "/Users/madesheng/anaconda2/lib/python2.7/site-packages/skimage/transform/_warps.py", line 90, in resize
    row_scale = float(orig_rows) / rows
ZeroDivisionError: float division by zero

Hello, can you tell me how you fixed your error? Thanks.

/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py:112: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. "
Epoch 1/30
Exception in thread Thread-3:
Traceback (most recent call last):
  File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/utils/data_utils.py", line 568, in data_generator_task
    generator_output = next(self._generator)
ValueError: generator already executing

[]
ERROR:root:Error processing image {'id': 'cat.28.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.28.jpg', 'class_id': [], 'width': 286, 'height': 270, 'polygons': [{'name': 'polyline', 'all_points_x': [42, 43, 57, 69, 73, 74, 88, 99, 115, 134, 174, 199, 221, 231, 264, 277, 261, 242, 233, 227, 219, 200, 210, 222, 197, 167, 134, 141, 122, 117, 111, 110, 77, 7, 14, 19, 27, 37, 44], 'all_points_y': [97, 119, 148, 141, 111, 80, 62, 41, 39, 57, 60, 70, 87, 95, 94, 99, 111, 120, 139, 165, 185, 197, 219, 241, 224, 251, 244, 213, 221, 243, 260, 265, 267, 132, 117, 97, 93, 93, 98]}]}
Traceback (most recent call last):
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
    class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'cat.41.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.41.jpg', 'class_id': [], 'width': 333, 'height': 499, 'polygons': [{'name': 'polyline', 'all_points_x': [10, 3, 15, 53, 85, 156, 191, 225, 247, 269, 279, 269, 317, 330, 329, 275, 219, 119, 64, 4, 7], 'all_points_y': [241, 162, 113, 133, 185, 173, 175, 140, 120, 117, 135, 207, 210, 223, 454, 454, 451, 444, 447, 431, 238]}]}
Traceback (most recent call last):
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
    class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'cat.44.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.44.jpg', 'class_id': [], 'width': 107, 'height': 102, 'polygons': [{'name': 'polyline', 'all_points_x': [44, 38, 38, 38, 40, 47, 53, 59, 63, 78, 83, 80, 80, 80, 74, 73, 67, 66, 58, 46, 48, 48, 47, 43], 'all_points_y': [48, 39, 32, 27, 24, 28, 26, 22, 28, 23, 29, 40, 51, 71, 68, 60, 68, 79, 80, 83, 74, 68, 59, 48]}]}
Traceback (most recent call last):
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
    class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'dog.12498.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/dog.12498.jpg', 'class_id': [], 'width': 499, 'height': 377, 'polygons': [{'name': 'polyline', 'all_points_x': [190, 241, 250, 288, 341, 348, 354, 391, 409, 466, 493, 497, 451, 429, 408, 423, 387, 355, 321, 291, 274, 263, 260, 234, 206, 183, 183, 215], 'all_points_y': [93, 43, 13, 2, 5, 30, 43, 27, 21, 39, 47, 180, 181, 306, 242, 199, 207, 199, 196, 193, 193, 194, 217, 214, 211, 177, 135, 70]}]}
Traceback (most recent call last):
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
    class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'dog.12458.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/dog.12458.jpg', 'class_id': [], 'width': 185, 'height': 339, 'polygons': [{'name': 'polyline', 'all_points_x': [137, 136, 134, 125, 119, 128, 112, 96, 94, 88, 100, 112, 101, 74, 55, 47, 41, 32, 34, 26, 36, 50, 37, 28, 131, 143, 153, 151, 173, 156, 138, 138], 'all_points_y': [150, 208, 234, 259, 287, 315, 330, 334, 318, 310, 270, 228, 195, 189, 161, 160, 176, 176, 153, 127, 100, 78, 48, 0, 2, 42, 68, 92, 152, 152, 147, 152]}]}
Traceback (most recent call last):
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
    class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'cat.7.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.7.jpg', 'class_id': [], 'width': 495, 'height': 499, 'polygons': [{'name': 'polyline', 'all_points_x': [271, 270, 276, 292, 299, 310, 322, 334, 341, 349, 378, 390, 400, 415, 427, 442, 454, 464, 470, 475, 475, 474, 474, 478, 475, 472, 469, 467, 466, 463, 462, 459, 454, 454, 451, 448, 431, 436, 439, 398, 176, 172, 176, 182, 186, 214, 233, 256, 276, 293, 290, 281, 268, 271], 'all_points_y': [109, 60, 38, 27, 31, 40, 57, 69, 80, 84, 85, 82, 82, 55, 43, 28, 20, 17, 24, 43, 56, 66, 86, 98, 108, 115, 121, 126, 130, 135, 144, 157, 171, 180, 189, 195, 208, 228, 245, 433, 434, 272, 260, 248, 238, 200, 185, 173, 169, 170, 157, 132, 114, 106]}]}
Traceback (most recent call last):
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
Exception in thread Thread-2:
Traceback (most recent call last):
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/utils/data_utils.py", line 568, in data_generator_task
generator_output = next(self._generator)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1

Traceback (most recent call last):
File "catvsdog.py", line 376, in <module>
train(model)
File "catvsdog.py", line 211, in train
layers='heads')
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 2375, in train
use_multiprocessing=False,
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 87, in wrapper
return func(*args, **kwargs)
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/engine/training.py", line 2011, in fit_generator
generator_output = next(output_generator)
StopIteration

Just in case anyone who's getting this IndexError stumbles upon this issue:

As @zungam stated, it's caused by polygon points lying outside the image boundaries. Fixing the masks manually isn't always practical, so the exception can be handled in code instead (ideally, the polygons should not overflow the image dimensions). The overhead is small and of little concern even when training on a huge dataset. The following worked for me while training on a custom dataset where fixing the annotations manually was not an option:

  • print statements are added to help diagnose the problem (comment these out later)
  • the fix (which overrides the actual arrays): rr[rr > mask.shape[0]-1] = mask.shape[0]-1 and cc[cc > mask.shape[1]-1] = mask.shape[1]-1
  • here's what the load_mask function looks like:
    def load_mask(self, image_id):
        """Generate instance masks for an image.
        Returns:
            masks: A bool array of shape [height, width, instance count] with
                one mask per instance.
            class_ids: a 1D array of class IDs of the instance masks.
        """
        # If not a road dataset image, delegate to parent class.
        image_info = self.image_info[image_id]
        if image_info["source"] != "road":
            return super(self.__class__, self).load_mask(image_id)

        # Convert polygons to a bitmap mask of shape
        # [height, width, instance_count]
        info = self.image_info[image_id]
        mask = np.zeros([info["height"], info["width"], len(info["polygons"])],
                        dtype=np.uint8)
        for i, p in enumerate(info["polygons"]):
            # Get indexes of pixels inside the polygon and set them to 1
            rr, cc = skimage.draw.polygon(p['all_points_y'], p['all_points_x'])
            print("mask.shape, min(mask),max(mask): {}, {},{}".format(mask.shape, np.min(mask),np.max(mask)))
            print("rr.shape, min(rr),max(rr): {}, {},{}".format(rr.shape, np.min(rr),np.max(rr)))
            print("cc.shape, min(cc),max(cc): {}, {},{}".format(cc.shape, np.min(cc),np.max(cc)))

            ## Note: this modifies the existing arrays rr and cc in place,
            ## instead of creating new result arrays.
            ## Ref: https://stackoverflow.com/questions/19666626/replace-all-elements-of-python-numpy-array-that-are-greater-than-some-value
            rr[rr > mask.shape[0]-1] = mask.shape[0]-1
            cc[cc > mask.shape[1]-1] = mask.shape[1]-1

            print("After fixing the dirt mask, new values:")        
            print("rr.shape, min(rr),max(rr): {}, {},{}".format(rr.shape, np.min(rr),np.max(rr)))
            print("cc.shape, min(cc),max(cc): {}, {},{}".format(cc.shape, np.min(cc),np.max(cc)))

            mask[rr, cc, i] = 1

        # Return mask, and array of class IDs of each instance. Since we have
        # one class ID only, we return an array of 1s
        return mask.astype(np.bool), np.ones([mask.shape[-1]], dtype=np.int32)
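The same clamping can also be written with np.clip (my variant, not from the thread), which additionally guards against negative indices from points past the top or left edge:

```python
import numpy as np

def clip_to_mask(rr, cc, height, width):
    """Clamp polygon pixel indices to valid mask coordinates.

    Equivalent to the in-place fix above, but also handles negative
    indices and returns new arrays instead of mutating rr and cc.
    """
    rr = np.clip(rr, 0, height - 1)
    cc = np.clip(cc, 0, width - 1)
    return rr, cc

# Example: a 100x80 mask with one overflowing row and one negative column
rr, cc = clip_to_mask(np.array([5, 99, 120]), np.array([-3, 10, 79]), 100, 80)
print(rr.tolist(), cc.tolist())  # [5, 99, 99] [0, 10, 79]
```

Because this returns new arrays rather than mutating in place, it avoids surprises if rr and cc are reused elsewhere.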

@abadcd hi, I have resolved this problem by adding these lines at the top of utils.py. Hope it helps you~

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

@mangalbhaskar Thanks! Works great.

I am trying to run the Mask R-CNN model on a custom dataset, where the input type is PNG for both images and masks, and NUM_CLASSES = 1 + 2.

261/520 [==============>...............] - ETA: 3:07 - loss: 0.8962 - rpn_class_loss: 0.1704 - rpn_bbox_loss: 0.4765 - mrcnn_class_loss: 0.0078 - mrcnn_bbox_loss: 0.1968 - mrcnn_mask_loss: 0.0447

IndexError Traceback (most recent call last)
in <module>
3 epochs=1,
4 augmentation=augmentation,
----> 5 layers='all')
6
7 # In[ ]:

/raid/Caries_Chicago/Mask_RCNN/samples/breast-cancer/../../mrcnn/model.py in train(self, train_dataset, val_dataset, learning_rate, epochs, layers, augmentation, custom_callbacks, no_augmentation_sources)
2382 max_queue_size=100,
2383 workers=workers,
-> 2384 use_multiprocessing=True,
2385 )
2386 self.epoch = max(self.epoch, epochs)

/usr/local/lib/python3.5/dist-packages/keras/legacy/interfaces.py in wrapper(*args, **kwargs)
89 warnings.warn('Update your ' + object_name +
90 ' call to the Keras 2 API: ' + signature, stacklevel=2)
---> 91 return func(*args, **kwargs)
92 wrapper._original_function = func
93 return wrapper

/usr/local/lib/python3.5/dist-packages/keras/engine/training.py in fit_generator(self, generator, steps_per_epoch, epochs, verbose, callbacks, validation_data, validation_steps, class_weight, max_queue_size, workers, use_multiprocessing, shuffle, initial_epoch)
2143 batch_index = 0
2144 while steps_done < steps_per_epoch:
-> 2145 generator_output = next(output_generator)
2146
2147 if not hasattr(generator_output, '__len__'):

/usr/local/lib/python3.5/dist-packages/keras/utils/data_utils.py in get(self)
768 success, value = self.queue.get()
769 if not success:
--> 770 six.reraise(value.__class__, value, value.__traceback__)

/usr/local/lib/python3.5/dist-packages/six.py in reraise(tp, value, tb)
691 if value.__traceback__ is not tb:
692 raise value.with_traceback(tb)
--> 693 raise value
694 finally:
695 value = None

IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
Can someone help me fix this?

Hey, I have encountered this error while training on my own data with multiple classes. Please help me figure it out.

Epoch 1/30
ERROR:root:Error processing image {'id': 'IMG_20191112_104147.jpg', 'source': 'traffic', 'path': '../../datasets/traffic/val/IMG_20191112_104147.jpg', 'width': 4000, 'height': 3000, 'polygons': [{'name': 'polygon', 'all_points_x': [1024, 1019, 1714, 1689, 1575, 1472, 1189, 1081, 911, 1024], 'all_points_y': [2636, 2646, 2456, 2090, 2008, 1822, 1956, 2106, 2564, 2636]}, {'name': 'polygon', 'all_points_x': [2734, 3017, 3161, 3135, 3027, 2785, 2692, 2764, 2734], 'all_points_y': [2095, 2121, 2075, 1915, 1833, 1838, 1987, 2100, 2095]}, {'name': 'polygon', 'all_points_x': [2270, 2394, 2394, 2270, 2250, 2270], 'all_points_y': [1997, 1997, 1874, 1869, 1956, 1997]}], 'class_ids': [4, 3, 4]}
Traceback (most recent call last):
File "/content/drive/My Drive/Colab Notebooks/mj_pro_mid/mrcnn/model.py", line 1705, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/content/drive/My Drive/Colab Notebooks/mj_pro_mid/mrcnn/model.py", line 1261, in load_image_gt
class_ids = class_ids[_idx]
TypeError: only integer scalar arrays can be converted to a scalar index
ERROR:root:Error processing image {'id': 'IMG_20191112_104159.jpg', 'source': 'traffic', 'path': '../../datasets/traffic/train/IMG_20191112_104159.jpg'
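For reference (my reading of this error, not confirmed in the thread): this `TypeError` typically means `load_mask` returned `class_ids` as a plain Python list instead of a NumPy array, so the boolean indexing in `load_image_gt` fails. A minimal reproduction and fix:

```python
import numpy as np

# `class_ids` returned as a plain Python list -> boolean indexing fails
class_ids = [4, 3, 4]
_idx = np.array([True, False, True])  # boolean mask over instances

try:
    class_ids[_idx]                   # list indexed with a bool array
except TypeError as e:
    print(type(e).__name__)           # TypeError

# Fix: return class_ids from load_mask as a NumPy int32 array
class_ids = np.array([4, 3, 4], dtype=np.int32)
print(class_ids[_idx].tolist())       # [4, 4]
```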

Hello, can you tell me how you fixed your error? Thanks.

/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py:112: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. "
Epoch 1/30
Exception in thread Thread-3:
Traceback (most recent call last):
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/utils/data_utils.py", line 568, in data_generator_task
generator_output = next(self._generator)
ValueError: generator already executing

[]
ERROR:root:Error processing image {'id': 'cat.28.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.28.jpg', 'class_id': [], 'width': 286, 'height': 270, 'polygons': [{'name': 'polyline', 'all_points_x': [42, 43, 57, 69, 73, 74, 88, 99, 115, 134, 174, 199, 221, 231, 264, 277, 261, 242, 233, 227, 219, 200, 210, 222, 197, 167, 134, 141, 122, 117, 111, 110, 77, 7, 14, 19, 27, 37, 44], 'all_points_y': [97, 119, 148, 141, 111, 80, 62, 41, 39, 57, 60, 70, 87, 95, 94, 99, 111, 120, 139, 165, 185, 197, 219, 241, 224, 251, 244, 213, 221, 243, 260, 265, 267, 132, 117, 97, 93, 93, 98]}]}
Traceback (most recent call last):
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'cat.41.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.41.jpg', 'class_id': [], 'width': 333, 'height': 499, 'polygons': [{'name': 'polyline', 'all_points_x': [10, 3, 15, 53, 85, 156, 191, 225, 247, 269, 279, 269, 317, 330, 329, 275, 219, 119, 64, 4, 7], 'all_points_y': [241, 162, 113, 133, 185, 173, 175, 140, 120, 117, 135, 207, 210, 223, 454, 454, 451, 444, 447, 431, 238]}]}
Traceback (most recent call last):
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'cat.44.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.44.jpg', 'class_id': [], 'width': 107, 'height': 102, 'polygons': [{'name': 'polyline', 'all_points_x': [44, 38, 38, 38, 40, 47, 53, 59, 63, 78, 83, 80, 80, 80, 74, 73, 67, 66, 58, 46, 48, 48, 47, 43], 'all_points_y': [48, 39, 32, 27, 24, 28, 26, 22, 28, 23, 29, 40, 51, 71, 68, 60, 68, 79, 80, 83, 74, 68, 59, 48]}]}
Traceback (most recent call last):
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'dog.12498.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/dog.12498.jpg', 'class_id': [], 'width': 499, 'height': 377, 'polygons': [{'name': 'polyline', 'all_points_x': [190, 241, 250, 288, 341, 348, 354, 391, 409, 466, 493, 497, 451, 429, 408, 423, 387, 355, 321, 291, 274, 263, 260, 234, 206, 183, 183, 215], 'all_points_y': [93, 43, 13, 2, 5, 30, 43, 27, 21, 39, 47, 180, 181, 306, 242, 199, 207, 199, 196, 193, 193, 194, 217, 214, 211, 177, 135, 70]}]}
Traceback (most recent call last):
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'dog.12458.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/dog.12458.jpg', 'class_id': [], 'width': 185, 'height': 339, 'polygons': [{'name': 'polyline', 'all_points_x': [137, 136, 134, 125, 119, 128, 112, 96, 94, 88, 100, 112, 101, 74, 55, 47, 41, 32, 34, 26, 36, 50, 37, 28, 131, 143, 153, 151, 173, 156, 138, 138], 'all_points_y': [150, 208, 234, 259, 287, 315, 330, 334, 318, 310, 270, 228, 195, 189, 161, 160, 176, 176, 153, 127, 100, 78, 48, 0, 2, 42, 68, 92, 152, 152, 147, 152]}]}
Traceback (most recent call last):
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'cat.7.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.7.jpg', 'class_id': [], 'width': 495, 'height': 499, 'polygons': [{'name': 'polyline', 'all_points_x': [271, 270, 276, 292, 299, 310, 322, 334, 341, 349, 378, 390, 400, 415, 427, 442, 454, 464, 470, 475, 475, 474, 474, 478, 475, 472, 469, 467, 466, 463, 462, 459, 454, 454, 451, 448, 431, 436, 439, 398, 176, 172, 176, 182, 186, 214, 233, 256, 276, 293, 290, 281, 268, 271], 'all_points_y': [109, 60, 38, 27, 31, 40, 57, 69, 80, 84, 85, 82, 82, 55, 43, 28, 20, 17, 24, 43, 56, 66, 86, 98, 108, 115, 121, 126, 130, 135, 144, 157, 171, 180, 189, 195, 208, 228, 245, 433, 434, 272, 260, 248, 238, 200, 185, 173, 169, 170, 157, 132, 114, 106]}]}
Traceback (most recent call last):
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
Exception in thread Thread-2:
Traceback (most recent call last):
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/utils/data_utils.py", line 568, in data_generator_task
generator_output = next(self._generator)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1

Traceback (most recent call last):
File "catvsdog.py", line 376, in <module>
train(model)
File "catvsdog.py", line 211, in train
layers='heads')
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 2375, in train
use_multiprocessing=False,
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 87, in wrapper
return func(*args, **kwargs)
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/engine/training.py", line 2011, in fit_generator
generator_output = next(output_generator)
StopIteration

WARNING:tensorflow:From C:\Users18964699\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py:158: The name tf.get_default_session is deprecated. Please use tf.compat.v1.get_default_session instead.

WARNING:tensorflow:From C:\Users18964699\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py:163: The name tf.ConfigProto is deprecated. Please use tf.compat.v1.ConfigProto instead.

WARNING:tensorflow:From C:\Users18964699\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py:168: The name tf.Session is deprecated. Please use tf.compat.v1.Session instead.

WARNING:tensorflow:From C:\Users18964699\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py:172: The name tf.global_variables is deprecated. Please use tf.compat.v1.global_variables instead.

WARNING:tensorflow:From C:\Users18964699\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py:181: The name tf.is_variable_initialized is deprecated. Please use tf.compat.v1.is_variable_initialized instead.

WARNING:tensorflow:From C:\Users18964699\AppData\Local\Continuum\anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py:188: The name tf.variables_initializer is deprecated. Please use tf.compat.v1.variables_initializer instead.

Starting at epoch 0. LR=0.001

Checkpoint Path: C:\Users18964699\Desktop\Mask_RCNN-master\logs\shapes20200102T1646\mask_rcnn_shapes_{epoch:04d}.h5
Selecting layers to train
fpn_c5p5 (Conv2D)
fpn_c4p4 (Conv2D)
fpn_c3p3 (Conv2D)
fpn_c2p2 (Conv2D)
fpn_p5 (Conv2D)
fpn_p2 (Conv2D)
fpn_p3 (Conv2D)
fpn_p4 (Conv2D)
In model: rpn_model
rpn_conv_shared (Conv2D)
rpn_class_raw (Conv2D)
rpn_bbox_pred (Conv2D)
mrcnn_mask_conv1 (TimeDistributed)
mrcnn_mask_bn1 (TimeDistributed)
mrcnn_mask_conv2 (TimeDistributed)
mrcnn_mask_bn2 (TimeDistributed)
mrcnn_class_conv1 (TimeDistributed)
mrcnn_class_bn1 (TimeDistributed)
mrcnn_mask_conv3 (TimeDistributed)
mrcnn_mask_bn3 (TimeDistributed)
mrcnn_class_conv2 (TimeDistributed)
mrcnn_class_bn2 (TimeDistributed)
mrcnn_mask_conv4 (TimeDistributed)
mrcnn_mask_bn4 (TimeDistributed)
mrcnn_bbox_fc (TimeDistributed)
mrcnn_mask_deconv (TimeDistributed)
mrcnn_class_logits (TimeDistributed)
mrcnn_mask (TimeDistributed)

Starting at epoch 10. LR=0.0001

Checkpoint Path: C:\Users18964699\Desktop\Mask_RCNN-master\logs\shapes20200102T1646\mask_rcnn_shapes_{epoch:04d}.h5
Selecting layers to train
conv1 (Conv2D)
bn_conv1 (BatchNorm)
res2a_branch2a (Conv2D)
bn2a_branch2a (BatchNorm)
res2a_branch2b (Conv2D)
bn2a_branch2b (BatchNorm)
res2a_branch2c (Conv2D)
res2a_branch1 (Conv2D)
bn2a_branch2c (BatchNorm)
bn2a_branch1 (BatchNorm)
res2b_branch2a (Conv2D)
bn2b_branch2a (BatchNorm)
res2b_branch2b (Conv2D)
bn2b_branch2b (BatchNorm)
res2b_branch2c (Conv2D)
bn2b_branch2c (BatchNorm)
res2c_branch2a (Conv2D)
bn2c_branch2a (BatchNorm)
res2c_branch2b (Conv2D)
bn2c_branch2b (BatchNorm)
res2c_branch2c (Conv2D)
bn2c_branch2c (BatchNorm)
res3a_branch2a (Conv2D)
bn3a_branch2a (BatchNorm)
res3a_branch2b (Conv2D)
bn3a_branch2b (BatchNorm)
res3a_branch2c (Conv2D)
res3a_branch1 (Conv2D)
bn3a_branch2c (BatchNorm)
bn3a_branch1 (BatchNorm)
res3b_branch2a (Conv2D)
bn3b_branch2a (BatchNorm)
res3b_branch2b (Conv2D)
bn3b_branch2b (BatchNorm)
res3b_branch2c (Conv2D)
bn3b_branch2c (BatchNorm)
res3c_branch2a (Conv2D)
bn3c_branch2a (BatchNorm)
res3c_branch2b (Conv2D)
bn3c_branch2b (BatchNorm)
res3c_branch2c (Conv2D)
bn3c_branch2c (BatchNorm)
res3d_branch2a (Conv2D)
bn3d_branch2a (BatchNorm)
res3d_branch2b (Conv2D)
bn3d_branch2b (BatchNorm)
res3d_branch2c (Conv2D)
bn3d_branch2c (BatchNorm)
res4a_branch2a (Conv2D)
bn4a_branch2a (BatchNorm)
res4a_branch2b (Conv2D)
bn4a_branch2b (BatchNorm)
res4a_branch2c (Conv2D)
res4a_branch1 (Conv2D)
bn4a_branch2c (BatchNorm)
bn4a_branch1 (BatchNorm)
res4b_branch2a (Conv2D)
bn4b_branch2a (BatchNorm)
res4b_branch2b (Conv2D)
bn4b_branch2b (BatchNorm)
res4b_branch2c (Conv2D)
bn4b_branch2c (BatchNorm)
res4c_branch2a (Conv2D)
bn4c_branch2a (BatchNorm)
res4c_branch2b (Conv2D)
bn4c_branch2b (BatchNorm)
res4c_branch2c (Conv2D)
bn4c_branch2c (BatchNorm)
res4d_branch2a (Conv2D)
bn4d_branch2a (BatchNorm)
res4d_branch2b (Conv2D)
bn4d_branch2b (BatchNorm)
res4d_branch2c (Conv2D)
bn4d_branch2c (BatchNorm)
res4e_branch2a (Conv2D)
bn4e_branch2a (BatchNorm)
res4e_branch2b (Conv2D)
bn4e_branch2b (BatchNorm)
res4e_branch2c (Conv2D)
bn4e_branch2c (BatchNorm)
res4f_branch2a (Conv2D)
bn4f_branch2a (BatchNorm)
res4f_branch2b (Conv2D)
bn4f_branch2b (BatchNorm)
res4f_branch2c (Conv2D)
bn4f_branch2c (BatchNorm)
res4g_branch2a (Conv2D)
bn4g_branch2a (BatchNorm)
res4g_branch2b (Conv2D)
bn4g_branch2b (BatchNorm)
res4g_branch2c (Conv2D)
bn4g_branch2c (BatchNorm)
res4h_branch2a (Conv2D)
bn4h_branch2a (BatchNorm)
res4h_branch2b (Conv2D)
bn4h_branch2b (BatchNorm)
res4h_branch2c (Conv2D)
bn4h_branch2c (BatchNorm)
res4i_branch2a (Conv2D)
bn4i_branch2a (BatchNorm)
res4i_branch2b (Conv2D)
bn4i_branch2b (BatchNorm)
res4i_branch2c (Conv2D)
bn4i_branch2c (BatchNorm)
res4j_branch2a (Conv2D)
bn4j_branch2a (BatchNorm)
res4j_branch2b (Conv2D)
bn4j_branch2b (BatchNorm)
res4j_branch2c (Conv2D)
bn4j_branch2c (BatchNorm)
res4k_branch2a (Conv2D)
bn4k_branch2a (BatchNorm)
res4k_branch2b (Conv2D)
bn4k_branch2b (BatchNorm)
res4k_branch2c (Conv2D)
bn4k_branch2c (BatchNorm)
res4l_branch2a (Conv2D)
bn4l_branch2a (BatchNorm)
res4l_branch2b (Conv2D)
bn4l_branch2b (BatchNorm)
res4l_branch2c (Conv2D)
bn4l_branch2c (BatchNorm)
res4m_branch2a (Conv2D)
bn4m_branch2a (BatchNorm)
res4m_branch2b (Conv2D)
bn4m_branch2b (BatchNorm)
res4m_branch2c (Conv2D)
bn4m_branch2c (BatchNorm)
res4n_branch2a (Conv2D)
bn4n_branch2a (BatchNorm)
res4n_branch2b (Conv2D)
bn4n_branch2b (BatchNorm)
res4n_branch2c (Conv2D)
bn4n_branch2c (BatchNorm)
res4o_branch2a (Conv2D)
bn4o_branch2a (BatchNorm)
res4o_branch2b (Conv2D)
bn4o_branch2b (BatchNorm)
res4o_branch2c (Conv2D)
bn4o_branch2c (BatchNorm)
res4p_branch2a (Conv2D)
bn4p_branch2a (BatchNorm)
res4p_branch2b (Conv2D)
bn4p_branch2b (BatchNorm)
res4p_branch2c (Conv2D)
bn4p_branch2c (BatchNorm)
res4q_branch2a (Conv2D)
bn4q_branch2a (BatchNorm)
res4q_branch2b (Conv2D)
bn4q_branch2b (BatchNorm)
res4q_branch2c (Conv2D)
bn4q_branch2c (BatchNorm)
res4r_branch2a (Conv2D)
bn4r_branch2a (BatchNorm)
res4r_branch2b (Conv2D)
bn4r_branch2b (BatchNorm)
res4r_branch2c (Conv2D)
bn4r_branch2c (BatchNorm)
res4s_branch2a (Conv2D)
bn4s_branch2a (BatchNorm)
res4s_branch2b (Conv2D)
bn4s_branch2b (BatchNorm)
res4s_branch2c (Conv2D)
bn4s_branch2c (BatchNorm)
res4t_branch2a (Conv2D)
bn4t_branch2a (BatchNorm)
res4t_branch2b (Conv2D)
bn4t_branch2b (BatchNorm)
res4t_branch2c (Conv2D)
bn4t_branch2c (BatchNorm)
res4u_branch2a (Conv2D)
bn4u_branch2a (BatchNorm)
res4u_branch2b (Conv2D)
bn4u_branch2b (BatchNorm)
res4u_branch2c (Conv2D)
bn4u_branch2c (BatchNorm)
res4v_branch2a (Conv2D)
bn4v_branch2a (BatchNorm)
res4v_branch2b (Conv2D)
bn4v_branch2b (BatchNorm)
res4v_branch2c (Conv2D)
bn4v_branch2c (BatchNorm)
res4w_branch2a (Conv2D)
bn4w_branch2a (BatchNorm)
res4w_branch2b (Conv2D)
bn4w_branch2b (BatchNorm)
res4w_branch2c (Conv2D)
bn4w_branch2c (BatchNorm)
res5a_branch2a (Conv2D)
bn5a_branch2a (BatchNorm)
res5a_branch2b (Conv2D)
bn5a_branch2b (BatchNorm)
res5a_branch2c (Conv2D)
res5a_branch1 (Conv2D)
bn5a_branch2c (BatchNorm)
bn5a_branch1 (BatchNorm)
res5b_branch2a (Conv2D)
bn5b_branch2a (BatchNorm)
res5b_branch2b (Conv2D)
bn5b_branch2b (BatchNorm)
res5b_branch2c (Conv2D)
bn5b_branch2c (BatchNorm)
res5c_branch2a (Conv2D)
bn5c_branch2a (BatchNorm)
res5c_branch2b (Conv2D)
bn5c_branch2b (BatchNorm)
res5c_branch2c (Conv2D)
bn5c_branch2c (BatchNorm)
fpn_c5p5 (Conv2D)
fpn_c4p4 (Conv2D)
fpn_c3p3 (Conv2D)
fpn_c2p2 (Conv2D)
fpn_p5 (Conv2D)
fpn_p2 (Conv2D)
fpn_p3 (Conv2D)
fpn_p4 (Conv2D)
In model: rpn_model
rpn_conv_shared (Conv2D)
rpn_class_raw (Conv2D)
rpn_bbox_pred (Conv2D)
mrcnn_mask_conv1 (TimeDistributed)
mrcnn_mask_bn1 (TimeDistributed)
mrcnn_mask_conv2 (TimeDistributed)
mrcnn_mask_bn2 (TimeDistributed)
mrcnn_class_conv1 (TimeDistributed)
mrcnn_class_bn1 (TimeDistributed)
mrcnn_mask_conv3 (TimeDistributed)
mrcnn_mask_bn3 (TimeDistributed)
mrcnn_class_conv2 (TimeDistributed)
mrcnn_class_bn2 (TimeDistributed)
mrcnn_mask_conv4 (TimeDistributed)
mrcnn_mask_bn4 (TimeDistributed)
mrcnn_bbox_fc (TimeDistributed)
mrcnn_mask_deconv (TimeDistributed)
mrcnn_class_logits (TimeDistributed)
mrcnn_mask (TimeDistributed)
Can anyone help, please? It's stuck here...

@mangalbhaskar I am not sure this solves 100% of the problem: what happens if the annotations are rotated but remain within the image bounds?

In that case, the proposed mechanism would not pick it up, I think.

I notice that VGG VIA uses a different method to define / load an image's H, W with respect to mask.shape[0] and mask.shape[1]; see here: https://gitlab.com/vgg/via/issues/145

A solution would have to make sure those are the same.

@noamnav Yes, you have a point, and it's a pretty interesting case.

It is also a dangerous situation if two different tools (VGG VIA and the Python code) load an image in different orientations. When the orientation in Python differs from that in VGG VIA, or any other tool, because of EXIF metadata, there is no reliable image orientation to work with; by convention, every image viewer or loader should respect the image metadata during loading.

Thanks for raising this. It would be a great help to the community if you could test this case and provide a better fix that works in all conditions.

As a quick-and-dirty solution, simply comparing the dimensions an image reports under two different loading libraries lets us identify which images are problematic:

from PIL import Image
import skimage.io

def get_image_dimensions(filepath):
    """
    Extract image dimensions (in pixels).
    Note: some pictures return a different value when tested with PIL
    vs skimage; these relate to each image's EXIF orientation value
    (see: https://gitlab.com/vgg/via/issues/145).
    """
    width_image, height_image = Image.open(filepath).size
    image = skimage.io.imread(filepath)
    height_skimage, width_skimage = image.shape[:2]

    if width_image != width_skimage:
        print('image rotation problem')
        return (0, 0)

    return width_image, height_image

From here a basic filtering will do the trick. The complete solution may be a geometric one:

if (problem raised):
     rotate image
     translate all coordinates as per rotation

I will publish a full solution on this page once it is written.

@noamnav Can you confirm your findings: if that problematic image is loaded using this code, does it have H and W interchanged? Also, if you could add OpenCV to your test case it would be insightful, as cv2, skimage, and PIL are all very commonly used. Which of these libraries respects EXIF-metadata-based orientation?

@mangalbhaskar Yes, I can confirm that, using the code below, PIL provides interchanged H, W values for images with 'corrupt' / problematic EXIF data. For 'good' images all three inspection methods (PIL, skimage and, per your request, cv2) give the same result.

It appears that skimage and OpenCV consistently give the same result, whilst PIL provides the interchanged values:

from PIL import Image
import skimage.io
import cv2

def get_image_dimensions(filepath):
    """
    Extract image dimensions (in pixels).
    Note: some pictures return a different value when tested with PIL
    vs skimage & OpenCV; these relate to each image's EXIF value
    (see: https://gitlab.com/vgg/via/issues/145).
    For now those are simply skipped.
    """
    cv2_image = cv2.imread(filepath)
    height_cv2, width_cv2, channels = cv2_image.shape
    width_image, height_image = Image.open(filepath).size
    image = skimage.io.imread(filepath)
    height_skimage, width_skimage = image.shape[:2]

    print('PIL width:', width_image, ' PIL height:', height_image)
    print('skimage width:', width_skimage, ' skimage height:', height_skimage)
    print('openCV width:', width_cv2, ' openCV height:', height_cv2)

    if (width_image != width_skimage) or (width_image != width_cv2):
        print('image rotation problem')
        return (0, 0)

    return width_image, height_image

Until we have a more elegant solution for identifying these images, this is actually an advantage, as it means we can easily locate and remove the problem images.

In the following examples, the first two images would raise an error. The 3rd image is good.

(three example images attached)

hey, I have encountered this error while training with my own data of multiple classes. please help me to figure it out.

Epoch 1/30
ERROR:root:Error processing image {'id': 'IMG_20191112_104147.jpg', 'source': 'traffic', 'path': '../../datasets/traffic/val/IMG_20191112_104147.jpg', 'width': 4000, 'height': 3000, 'polygons': [{'name': 'polygon', 'all_points_x': [1024, 1019, 1714, 1689, 1575, 1472, 1189, 1081, 911, 1024], 'all_points_y': [2636, 2646, 2456, 2090, 2008, 1822, 1956, 2106, 2564, 2636]}, {'name': 'polygon', 'all_points_x': [2734, 3017, 3161, 3135, 3027, 2785, 2692, 2764, 2734], 'all_points_y': [2095, 2121, 2075, 1915, 1833, 1838, 1987, 2100, 2095]}, {'name': 'polygon', 'all_points_x': [2270, 2394, 2394, 2270, 2250, 2270], 'all_points_y': [1997, 1997, 1874, 1869, 1956, 1997]}], 'class_ids': [4, 3, 4]}
Traceback (most recent call last):
File "/content/drive/My Drive/Colab Notebooks/mj_pro_mid/mrcnn/model.py", line 1705, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/content/drive/My Drive/Colab Notebooks/mj_pro_mid/mrcnn/model.py", line 1261, in load_image_gt
class_ids = class_ids[_idx]
TypeError: only integer scalar arrays can be converted to a scalar index
ERROR:root:Error processing image {'id': 'IMG_20191112_104159.jpg', 'source': 'traffic', 'path': '../../datasets/traffic/train/IMG_20191112_104159.jpg'

Hello, can you tell me how do you fixed your error. Thanks
/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py:112: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. "
Epoch 1/30
Exception in thread Thread-3:
Traceback (most recent call last):
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 864, in run
self._target(self._args, *self._kwargs)
File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/utils/data_utils.py", line 568, in data_generator_task
generator_output = next(self._generator)
ValueError: generator already executing
[]
ERROR:root:Error processing image {'id': 'cat.28.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.28.jpg', 'class_id': [], 'width': 286, 'height': 270, 'polygons': [{'name': 'polyline', 'all_points_x': [42, 43, 57, 69, 73, 74, 88, 99, 115, 134, 174, 199, 221, 231, 264, 277, 261, 242, 233, 227, 219, 200, 210, 222, 197, 167, 134, 141, 122, 117, 111, 110, 77, 7, 14, 19, 27, 37, 44], 'all_points_y': [97, 119, 148, 141, 111, 80, 62, 41, 39, 57, 60, 70, 87, 95, 94, 99, 111, 120, 139, 165, 185, 197, 219, 241, 224, 251, 244, 213, 221, 243, 260, 265, 267, 132, 117, 97, 93, 93, 98]}]}
Traceback (most recent call last):
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
use_mini_mask=config.USE_MINI_MASK)
File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
[]
ERROR:root:Error processing image {'id': 'cat.41.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.41.jpg', 'class_id': [], 'width': 333, 'height': 499, 'polygons': [{'name': 'polyline', 'all_points_x': [10, 3, 15, 53, 85, 156, 191, 225, 247, 269, 279, 269, 317, 330, 329, 275, 219, 119, 64, 4, 7], 'all_points_y': [241, 162, 113, 133, 185, 173, 175, 140, 120, 117, 135, 207, 210, 223, 454, 454, 451, 444, 447, 431, 238]}]}
[]
ERROR:root:Error processing image {'id': 'cat.44.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.44.jpg', 'class_id': [], 'width': 107, 'height': 102, 'polygons': [{'name': 'polyline', 'all_points_x': [44, 38, 38, 38, 40, 47, 53, 59, 63, 78, 83, 80, 80, 80, 74, 73, 67, 66, 58, 46, 48, 48, 47, 43], 'all_points_y': [48, 39, 32, 27, 24, 28, 26, 22, 28, 23, 29, 40, 51, 71, 68, 60, 68, 79, 80, 83, 74, 68, 59, 48]}]}
[]
ERROR:root:Error processing image {'id': 'dog.12498.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/dog.12498.jpg', 'class_id': [], 'width': 499, 'height': 377, 'polygons': [{'name': 'polyline', 'all_points_x': [190, 241, 250, 288, 341, 348, 354, 391, 409, 466, 493, 497, 451, 429, 408, 423, 387, 355, 321, 291, 274, 263, 260, 234, 206, 183, 183, 215], 'all_points_y': [93, 43, 13, 2, 5, 30, 43, 27, 21, 39, 47, 180, 181, 306, 242, 199, 207, 199, 196, 193, 193, 194, 217, 214, 211, 177, 135, 70]}]}
[]
ERROR:root:Error processing image {'id': 'dog.12458.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/dog.12458.jpg', 'class_id': [], 'width': 185, 'height': 339, 'polygons': [{'name': 'polyline', 'all_points_x': [137, 136, 134, 125, 119, 128, 112, 96, 94, 88, 100, 112, 101, 74, 55, 47, 41, 32, 34, 26, 36, 50, 37, 28, 131, 143, 153, 151, 173, 156, 138, 138], 'all_points_y': [150, 208, 234, 259, 287, 315, 330, 334, 318, 310, 270, 228, 195, 189, 161, 160, 176, 176, 153, 127, 100, 78, 48, 0, 2, 42, 68, 92, 152, 152, 147, 152]}]}
[]
ERROR:root:Error processing image {'id': 'cat.7.jpg', 'source': 'catvsdog', 'path': '/home/taosw/Mask_RCNN/samples/catvsdog/train/cat.7.jpg', 'class_id': [], 'width': 495, 'height': 499, 'polygons': [{'name': 'polyline', 'all_points_x': [271, 270, 276, 292, 299, 310, 322, 334, 341, 349, 378, 390, 400, 415, 427, 442, 454, 464, 470, 475, 475, 474, 474, 478, 475, 472, 469, 467, 466, 463, 462, 459, 454, 454, 451, 448, 431, 436, 439, 398, 176, 172, 176, 182, 186, 214, 233, 256, 276, 293, 290, 281, 268, 271], 'all_points_y': [109, 60, 38, 27, 31, 40, 57, 69, 80, 84, 85, 82, 82, 55, 43, 28, 20, 17, 24, 43, 56, 66, 86, 98, 108, 115, 121, 126, 130, 135, 144, 157, 171, 180, 189, 195, 208, 228, 245, 433, 434, 272, 260, 248, 238, 200, 185, 173, 169, 170, 157, 132, 114, 106]}]}
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/home/taosw/anaconda3/envs/tf/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/utils/data_utils.py", line 568, in data_generator_task
    generator_output = next(self._generator)
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1710, in data_generator
    use_mini_mask=config.USE_MINI_MASK)
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 1266, in load_image_gt
    class_ids = class_ids[_idx]
IndexError: boolean index did not match indexed array along dimension 0; dimension is 0 but corresponding boolean dimension is 1
Traceback (most recent call last):
  File "catvsdog.py", line 376, in <module>
    train(model)
  File "catvsdog.py", line 211, in train
    layers='heads')
  File "/home/taosw/Mask_RCNN/mrcnn/model.py", line 2375, in train
    use_multiprocessing=False,
  File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 87, in wrapper
    return func(*args, **kwargs)
  File "/home/taosw/anaconda3/envs/tf/lib/python3.6/site-packages/keras/engine/training.py", line 2011, in fit_generator
    generator_output = next(output_generator)
StopIteration

How did you solve your errors? I am also getting the same errors.
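One thing worth checking: every failing annotation in the logs above has `'name': 'polyline'` in its `shape_attributes`, while the balloon-sample code this is usually adapted from builds masks only from regions whose shape is `'polygon'`, which leaves `class_id` empty. Regions drawn with VIA's polyline tool (instead of the polygon tool) would produce exactly this. A hedged sketch of a validator for the parsed VIA export (the 1.x layout the balloon sample uses is an assumption; adjust for your own dataset class):

```python
def find_bad_regions(annotations):
    """Return (filename, shape_name) pairs for regions not drawn as 'polygon'.

    `annotations` is the parsed VIA export (json.load of the project file).
    Assumes the VIA 1.x layout used by the balloon sample, where each entry
    has a 'regions' dict (VIA 2.x exports a list instead).
    """
    bad = []
    for entry in annotations.values():
        regions = entry.get('regions', {})
        # VIA 1.x: dict of regions; VIA 2.x: list of regions
        region_list = regions.values() if isinstance(regions, dict) else regions
        for region in region_list:
            shape = region['shape_attributes']
            if shape['name'] != 'polygon':
                bad.append((entry.get('filename', '?'), shape['name']))
    return bad

# Example: one region accidentally drawn with the polyline tool
demo = {
    "cat.28.jpg12345": {
        "filename": "cat.28.jpg",
        "regions": {
            "0": {"shape_attributes": {"name": "polyline",
                                       "all_points_x": [42, 43, 57],
                                       "all_points_y": [97, 119, 148]}}
        },
    }
}
print(find_bad_regions(demo))  # -> [('cat.28.jpg', 'polyline')]
```

Re-drawing the flagged regions with the polygon tool (or treating `'polyline'` the same as `'polygon'` in your `load_mask`) should stop `class_id` from coming back empty.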
