In this letter, we predict object locations as probability distributions for the task of image object detection. We adopt the Kullback-Leibler divergence as the regression loss to train the deep neural networks. Since most existing benchmarks label objects with rectangular bounding boxes, we propose the Nearest Distribution Converter, which finds the uniform distribution closest to each predicted distribution. Our method improves detection accuracy, measured in mAP, by 0.57%, 0.75%, and 0.48% on YOLOv3, YOLOv4-tiny, and YOLOv4, respectively.
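The two core ingredients named above can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it assumes box coordinates are discretized into bins, computes the KL divergence between two discrete distributions, and approximates a "nearest distribution converter" by a brute-force search over contiguous bin intervals for the uniform distribution with the smallest KL divergence from the prediction. The function names, the KL direction, and the exhaustive search are all assumptions for illustration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions over the same bins.

    A small eps avoids log(0); both inputs are renormalized.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def nearest_uniform(p):
    """Hypothetical converter: brute-force search for the uniform
    distribution, supported on a contiguous bin interval [lo, hi),
    that minimizes KL(p || uniform). Returns the interval and the
    divergence, so the interval can serve as a box edge estimate.
    """
    n = len(p)
    best, best_div = None, np.inf
    for lo in range(n):
        for hi in range(lo + 1, n + 1):
            u = np.zeros(n)
            u[lo:hi] = 1.0 / (hi - lo)  # uniform mass on the interval
            d = kl_divergence(p, u)
            if d < best_div:
                best, best_div = (lo, hi), d
    return best, best_div
```

For a prediction concentrated uniformly on bins 2-4 of an 8-bin axis, the search recovers the interval `(2, 5)` with near-zero divergence; the brute-force scan is quadratic in the number of bins, which is acceptable only for coarse discretizations.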