Mask_rcnn: changing the anchor ratios to more than three elements, such as (0.2, 0.3, 1, 3, 5)

Created on 6 May 2019  ·  1 comment  ·  Source: matterport/Mask_RCNN

I am training on my own dataset and changed `RPN_ANCHOR_RATIOS` in config.py so that it has more than three elements. When I run the following code:

```python
model = modellib.MaskRCNN(mode='training', config=config, model_dir=ROOT_DIR)

# Exclude the last layers because they require a matching
# number of classes
model.load_weights(COCO_MODEL_PATH, by_name=True, exclude=[
    "mrcnn_class_logits", "mrcnn_bbox_fc",
    "mrcnn_bbox", "mrcnn_mask"])
```

I get this error:

```
ValueError: Layer #359 (named "rpn_model"), weight has shape (1, 1, 512, 10), but the saved weight has shape (6, 512, 1, 1).
```

Do the COCO pre-trained weights not match the new anchors in the FPN layers (P2, P3, ...)? I also removed the cache created by previous training runs.

Most helpful comment

@Emma-uestc Do not load the RPN weights when you load the pre-trained model. You can modify your training code like this:

```python
model.load_weights(
    weights_path,
    by_name=True,
    exclude=[
        "mrcnn_class_logits", "mrcnn_bbox_fc", "mrcnn_bbox", "mrcnn_mask",
        "rpn_model",  # excluded because the anchor ratios have been changed
    ])
```
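The shape mismatch in the error message can be reproduced by counting channels: in this repo's RPN, the classification conv outputs 2 channels (foreground/background scores) per anchor ratio at each location, so the 3 default ratios give 6 channels (matching the saved COCO weight `(6, 512, 1, 1)`), while 5 custom ratios give 10 (matching the new layer's `(1, 1, 512, 10)`). A minimal sketch of that arithmetic, assuming the default `RPN_ANCHOR_RATIOS` of `[0.5, 1, 2]` (the helper name here is hypothetical, not part of the repo):

```python
def rpn_class_channels(ratios):
    """Output channels of the RPN classification conv: 2 scores per anchor ratio."""
    return 2 * len(ratios)

default_ratios = [0.5, 1, 2]          # Mask R-CNN's default RPN_ANCHOR_RATIOS
custom_ratios = [0.2, 0.3, 1, 3, 5]   # the ratios from this issue

print(rpn_class_channels(default_ratios))  # 6  -> saved COCO weight (6, 512, 1, 1)
print(rpn_class_channels(custom_ratios))   # 10 -> new layer weight (1, 1, 512, 10)
```

This is why excluding `"rpn_model"` works: the RPN layers are rebuilt from scratch with the new channel count and trained fresh, while the backbone and head weights still load by name.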

