Evaluating Performance

Intersection over union (IoU) measures the overlap between the segmentations the model predicts and the labels you put in.

A well-performing model would predict your labels exactly, and the overlap would be complete.

IoU = area of overlap / area of union, where the union is the total area covered by both objects together (the overlapping area is counted only once).
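As a minimal sketch of the formula, assuming the prediction and label are boolean NumPy masks of the same shape (the names here are illustrative, not from any particular library):

```python
import numpy as np

def iou(pred: np.ndarray, label: np.ndarray) -> float:
    """Overlap divided by union of two boolean masks."""
    overlap = np.logical_and(pred, label).sum()
    union = np.logical_or(pred, label).sum()  # overlapping area counted once
    return float(overlap / union) if union > 0 else 0.0
```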

The IoU of a well-performing model should be close to 1; for a poorly performing model it would be close to 0.

IoU is calculated at different thresholds. If a predicted object and the object you labeled overlap with an IoU of 0.5, that counts as a match at the 0.5 threshold, but not at 0.95.
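Continuing the sketch above, a hypothetical pair of masks shows how the same prediction can count as a match at one threshold but not another:

```python
# Two 10x10 masks: each covers 60 pixels, 50 of them shared.
pred = np.zeros((10, 10), dtype=bool)
pred[0:6, :] = True
label = np.zeros((10, 10), dtype=bool)
label[1:7, :] = True

score = iou(pred, label)  # 50 / 70, roughly 0.71
print(score >= 0.50)      # True: counted as a match at the 0.5 threshold
print(score >= 0.95)      # False: not a match at the 0.95 threshold
```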

Average precision (AP) measures how accurate the model is: the proportion of predicted objects that match the labels you put in, computed at different thresholds and averaged over all the objects in that group.

Mean over IoU averages these scores across the different IoU thresholds.
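A much-simplified sketch of both ideas, reusing the iou helper above: count the fraction of predicted objects that match a not-yet-matched label at a given threshold, then average that fraction over a set of thresholds. (Real AP implementations rank predictions by confidence and integrate a precision-recall curve; the function names and thresholds here are illustrative assumptions.)

```python
def precision_at_threshold(preds, labels, threshold):
    """Fraction of predicted objects that match an unmatched label at this IoU threshold."""
    matched = [False] * len(labels)
    hits = 0
    for pred in preds:
        for i, lab in enumerate(labels):
            if not matched[i] and iou(pred, lab) >= threshold:
                matched[i] = True  # each label can match at most one prediction
                hits += 1
                break
    return hits / len(preds) if preds else 0.0

def mean_over_thresholds(preds, labels, thresholds=(0.5, 0.75, 0.95)):
    """Average the per-threshold scores across the IoU thresholds."""
    return sum(precision_at_threshold(preds, labels, t) for t in thresholds) / len(thresholds)
```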

For objects, bbox refers to a bounding box drawn tightly around the object, whereas mask is the actual polygon you drew and that the model predicted.

For region or semantic classes, average precision is determined on a per-pixel basis.
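A hedged sketch of the per-pixel view for a single class, again assuming boolean masks: of all the pixels the model predicted for the class, what fraction are also labeled.

```python
import numpy as np

def pixel_precision(pred: np.ndarray, label: np.ndarray) -> float:
    """Of all pixels predicted for this class, the fraction that are labeled."""
    predicted = pred.sum()
    correct = np.logical_and(pred, label).sum()
    return float(correct / predicted) if predicted > 0 else 0.0
```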
