TensorFlow Object Detection API: does mAP score behave strangely?

I am training an object detector for my own data using the Tensorflow Object Detection API. I follow the (big) Dat Tran tutorial https://towardsdatascience.com/how-to-train-your-own-object-detector-with-tensorflows-object-detector-api-bec72ecfe1d9 . I use the ssd_mobilenet_v1_coco model of the pre-prepared model checkpoint as a starting point for training. I have only one class of objects.

I exported the prepared model, launched it on the estimated data and looked at the given limiting fields. The trained model worked well; I would say that if there were 20 objects, then, as a rule, on the predicted bounding boxes there were 13 objects with spots ("true positive results"); 7 where no objects were detected ("false negatives"); 2 cases when problems arose, there were two or more objects close to each other: the bounding fields are extracted between the objects in some of these cases ("false positives" <- of course, causing these "false positives", etc., inaccurate, but it is only for me to understand the concept of accuracy here). There are almost no other “false positives”. This seems like a much better result thanwhat did I hope to get, and although this kind of visual control does not give the actual mAP (which is calculated based on the overlap of the predicted and labeled bounding boxes?), I would roughly rate mAP as something like 13 / (13 + 2)> 80%.

However, when I run assessment ( eval.py) (on two different sets of ratings), I get the following mAP graph (0.7 smooth): mAP during training

This would mean a huge variation in mAP and a level of about 0.3 at the end of training, which is much worse than I expected, based on how well the boundary fields are displayed when I use the exported output_inference_graph.pbon the evaluation set.

Here is the complete loss schedule for training: total loss during training

200 20 ( labelImg); , . - 1200x900, 600x450 . ( eval.py , ) , 50 20 , ( 30 30 ).

1: mAP , ? , mAP ? , ( : Tensorflow api - , ?)

2: (1200x700 600x450)? , ? ( , , , ).

3: , , ( , , , " "?)

( 4: , ​​ , 10000 timesteps MAP, , , .)

, , !:)

+4
1

1: ... -, , , mAP, . , , :

  • , , "True positive" "False positive"; , "True positive", "False Negative".

  • ( ) . (TP/(TP+FP)) (TP/(TP+FN)), ( , ) TP FP. (acc, recc), .

  • , , :

  • , mAP .

: - . acc/rec : 100% - (13/20), 13/20 <100%; mAP=AP(category 1)=13/20=0.65. , - , .

, :

  • , , , - , . , (IoU) ( ) 0.5. , ; script, , ( , FP, FN).

  • , 10 . , : 1. , , , , . 2. , , , , , , , , , , .

2:, mobilenet_v1_coco-model, ( , ) 300x300 , .

3: , , .

4: , , . 10k , , , , , , , ...

+2

Source: https://habr.com/ru/post/1690374/


All Articles