Darknet: mAP at testing

Created on 28 Dec 2018 · 9 comments · Source: AlexeyAB/darknet

Hi, I have read the detector.c code, and it seems that the mAP calculation used by ./darknet detector map ... computes the old mAP metric (taking the precision at recall values of 0.1, 0.2, 0.3, ...). I have some questions:

1) In the YOLOv3 paper, the new mAP metric (from COCO) is shown as "AP" in Table 3, along with the old mAP metric, shown as AP50 and AP75. Are AP50 and AP75 the values you get from "./darknet detector map" with thresholds 0.50 and 0.75? How is the "AP" calculated? Is there an already implemented command for calculating it?

2) For the PR curve: this repository gives you the precision at every recall point for every class. If I want the overall PR curve, do I take the mean over classes at each recall point?

Thanks!

question

Most helpful comment

@gnoya Hi,

  1. Yes, ./darknet detector map ... calculates mAP@IoU=0.50 by default.
    If you want to calculate mAP@IoU=0.75, use ./darknet detector map ... -iou_thresh 0.75.
    mAP@IoU=0.50 is the average of the per-class APs, each calculated at IoU threshold = 0.5 (50%). How mAP is calculated: https://medium.com/@jonathan_hui/map-mean-average-precision-for-object-detection-45c121a31173

  2. You can take the mean over classes at every recall point, but it will not be the same as a PR curve built independently of the classes.


  • There is no old and new mAP; by default, mAP means mAP@IoU=0.50

    • There is mAP for Pascal VOC and ImageNet (mAP@IoU=0.50, or simply mAP)
    • There is AP@IoU=0.50 for MS COCO (the same as mAP for Pascal VOC and ImageNet)
    • There is AP@IoU=0.75 for MS COCO (or mAP@IoU=0.75)
    • There is AP@[.5, .95] for MS COCO (the average of the mAPs AP@IoU=0.50, AP@IoU=0.55, ..., AP@IoU=0.95)
  • mAP is used in Pascal VOC and ImageNet, and it is the same as AP@IoU=0.50 in MS COCO: http://homepages.inf.ed.ac.uk/ckiw/postscript/ijcv_voc09.pdf
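As an editorial illustration (not code from this repo), the Pascal-VOC-style 11-point interpolated AP for one class can be sketched in Python; the PR points below are made-up toy values:

```python
def eleven_point_ap(recalls, precisions):
    """VOC-style 11-point interpolated AP: average, over recall levels
    0.0, 0.1, ..., 1.0, of the best precision reached at recall >= level."""
    total = 0.0
    for level in (i / 10.0 for i in range(11)):
        candidates = [p for r, p in zip(recalls, precisions) if r >= level]
        total += max(candidates) if candidates else 0.0
    return total / 11.0

# Toy PR points for a single class (recall is non-decreasing)
recalls = [0.1, 0.2, 0.4, 0.6, 0.8]
precisions = [1.0, 0.9, 0.8, 0.7, 0.5]
ap = eleven_point_ap(recalls, precisions)
print(ap)  # 6.9 / 11 ≈ 0.627
```

mAP@IoU=0.50 is then just the mean of these per-class APs, where a detection counts as a true positive when its IoU with a ground-truth box is at least 0.5.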

But yes, the authors of MS COCO bring confusion: http://cocodataset.org/#detection-eval

  1. AP is averaged over all categories. Traditionally, this is called "mean average precision" (mAP). We make no distinction between AP and mAP (and likewise AR and mAR) and assume the difference is clear from context.

Also, Jonathan Hui calls AP@[.5, .95] mAP@[.5, .95]: https://medium.com/@jonathan_hui/object-detection-speed-and-accuracy-comparison-faster-r-cnn-r-fcn-ssd-and-yolo-5425656ae359

FPN and Faster R-CNN* (using ResNet as the feature extractor) have the highest accuracy (mAP@[.5:.95]).
...
If mAP is calculated with one single IoU only, use mAP@IoU=0.75.

All 9 comments


@AlexeyAB Thank you! Is there a way to calculate AP@[.5, .95] with the current commit? If not, will it work if I change lines 938 and 939 so that point goes from 0.5 to 0.95, and also change line 953 to divide by the new number of iterated points?

Thanks!

@gnoya

You should run several commands:
```
./darknet detector map obj.data yolo-obj.cfg yolo-obj.weights -iou_thresh 0.50
./darknet detector map obj.data yolo-obj.cfg yolo-obj.weights -iou_thresh 0.55
./darknet detector map obj.data yolo-obj.cfg yolo-obj.weights -iou_thresh 0.60
./darknet detector map obj.data yolo-obj.cfg yolo-obj.weights -iou_thresh 0.65
...
./darknet detector map obj.data yolo-obj.cfg yolo-obj.weights -iou_thresh 0.95
```

And then manually calculate the average of these 10 mAPs to get AP@[.5, .95].
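For the averaging step, a trivial sketch (the per-threshold mAP values below are placeholders; substitute the numbers your ten runs print):

```python
# Placeholder mAP values, one per run at IoU thresholds 0.50 ... 0.95
maps = {0.50: 0.61, 0.55: 0.58, 0.60: 0.54, 0.65: 0.49, 0.70: 0.43,
        0.75: 0.36, 0.80: 0.28, 0.85: 0.19, 0.90: 0.10, 0.95: 0.03}

# AP@[.5, .95] is simply the mean of the ten mAPs
ap_50_95 = sum(maps.values()) / len(maps)
print(ap_50_95)
```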

@AlexeyAB Thanks! Last question: does the -thresh parameter (not -iou_thresh) affect the AP calculation?

@gnoya No, -thresh doesn't affect AP or mAP.

  • -thresh affects IoU, F1, TP/FP/FN, and P/R for the current probability threshold

  • -iou_thresh affects the APs and mAP
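For reference, the IoU that -iou_thresh is compared against is the plain intersection-over-union of a predicted and a ground-truth box; a minimal sketch with boxes given as (x1, y1, x2, y2):

```python
def iou(a, b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Two 2x2 boxes overlapping by half their width
print(iou((0, 0, 2, 2), (1, 0, 3, 2)))  # 2 / (4 + 4 - 2) ≈ 0.333
```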

@AlexeyAB @gnoya I am working on YOLOv3 object detection for medical image analysis. I want to plot the P-R curve for my results. How can I get the 11 precision and recall point values? I am using AlexeyAB's repo.

@Fetulhak Uncomment this line, rebuild Darknet, and run the mAP calculation: https://github.com/AlexeyAB/darknet/blob/f14054ec2b49440ad488c3e28612e7a76780bc5f/src/detector.c#L1277

@Fetulhak did you manage to plot the P-R curve? If so, could you please share your approach with us?

@Emirismail As Alexey said, uncomment that print statement and you will get the 11 point values for your evaluation dataset. With those 11 precision values you can plot the curve using the matplotlib library, simply by passing the x and y data. That is what I did to plot the P-R curve for my result analysis.
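A minimal matplotlib sketch of that, assuming you have copied the 11 printed precision values into a list (the numbers below are placeholders):

```python
# Recall grid used by the 11-point evaluation, and placeholder precision
# values (replace with the ones printed by the uncommented line in detector.c)
recall = [i / 10.0 for i in range(11)]
precision = [1.0, 0.98, 0.95, 0.91, 0.86, 0.80, 0.72, 0.61, 0.47, 0.30, 0.12]

try:
    import matplotlib
    matplotlib.use("Agg")  # render without a display
    import matplotlib.pyplot as plt

    plt.plot(recall, precision, marker="o")
    plt.xlabel("Recall")
    plt.ylabel("Precision")
    plt.title("P-R curve")
    plt.savefig("pr_curve.png")
except ImportError:
    pass  # matplotlib not installed; the (recall, precision) pairs are still usable
```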

