Distractor-aware deep regression for visual tracking

Date

2019-01-18

Journal Title

Sensors

Publisher

MDPI

Type

Article

ISSN

1424-8220

Citation

Ming Du, Yan Ding, Xiuyun Meng, et al. Distractor-aware deep regression for visual tracking. Sensors, 2019, Volume 19, Issue 2, Article 387.

Abstract

In recent years, regression trackers have drawn increasing attention in the visual object tracking community due to their favorable performance and ease of implementation. These trackers directly learn a mapping from dense samples around the target object to Gaussian-like soft labels. However, in many real applications, the extremely imbalanced distribution of training samples usually hinders the robustness and accuracy of regression trackers on test data. In this paper, we propose a novel and effective distractor-aware loss function that alleviates this imbalance by highlighting the significant domain and severely penalizing the pure background. In addition, we introduce a fully differentiable, hierarchy-normalized concatenation connection to exploit abstractions across multiple convolutional layers. Extensive experiments were conducted on five challenging benchmark tracking datasets: OTB-13, OTB-15, TC-128, UAV-123, and VOT17. The experimental results are promising and show that the proposed tracker performs significantly better than nearly all of the compared state-of-the-art approaches.

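The abstract does not give the exact form of the distractor-aware loss or of the hierarchy-normalized concatenation, so the following is only a minimal Python/PyTorch sketch of how such components are commonly realized: a per-pixel weighted L2 regression loss on Gaussian-like soft labels that up-weights the significant (target) domain and heavily penalizes background locations where the network still responds, plus channel-wise L2 normalization of multi-layer features before concatenation. All names, thresholds, and weights below (distractor_aware_loss, normalized_concat, bg_thresh, and so on) are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

def distractor_aware_loss(pred, label, bg_thresh=0.05,
                          fg_weight=2.0, bg_weight=0.5, distractor_weight=4.0):
    # Weighted L2 regression loss on Gaussian-like soft labels.
    # pred, label: tensors of shape (N, 1, H, W) with values in [0, 1].
    err = (pred - label) ** 2
    fg = (label >= bg_thresh).float()   # significant (target) domain
    bg = 1.0 - fg                       # background
    # Background locations where the network still fires act as distractors
    # and receive the largest penalty; quiet pure background is down-weighted.
    distractor = bg * (pred.detach() >= bg_thresh).float()
    pure_bg = bg - distractor
    weight = fg_weight * fg + bg_weight * pure_bg + distractor_weight * distractor
    return (weight * err).mean()

def normalized_concat(features, size):
    # Resize feature maps from several convolutional layers to a common
    # resolution, L2-normalize each along the channel dimension so that no
    # single layer dominates, then concatenate along channels.
    outs = []
    for f in features:
        f = F.interpolate(f, size=size, mode='bilinear', align_corners=False)
        f = F.normalize(f, p=2, dim=1)
        outs.append(f)
    return torch.cat(outs, dim=1)

In this sketch, the weighted loss would stand in for a plain mean-squared-error objective when training the regression network, while normalized_concat would fuse activations from several convolutional layers before the final regression layer.
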
Keywords

object tracking, deep regression networks, data imbalance, distractor-aware

Rights

Creative Commons Attribution 4.0 International (CC BY 4.0)
