Occlusion-Aware Fragment-based Tracking with Spatial-Temporal Consistency

Disclaimer: The code links provided for this paper are external. Science Nest has no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its authors.

Authors Chong Sun, Dongming Wang, Huchuan Lu
Journal/Conference Name IEEE Transactions on Image Processing
Paper Abstract In this paper, we present a robust tracking method by exploiting a fragment-based appearance model with consideration of both temporal continuity and discontinuity information. From the perspective of probability theory, the proposed tracking algorithm can be viewed as a two-stage optimization problem. In the first stage, by adopting the estimated occlusion state as a prior, the optimal state of the tracked object can be obtained by solving an optimization problem, where the objective function is designed based on the classification score, occlusion prior, and temporal continuity information. In the second stage, we propose a discriminative occlusion model, which exploits both foreground and background information to detect possible occlusions, and also models the consistency of occlusion labels across frames. In addition, a simple yet effective training strategy is introduced during the model training (and updating) process, with which the effects of spatial-temporal consistency are properly weighted. The proposed tracker is evaluated on a recent benchmark dataset, on which the results demonstrate that it performs favorably against other state-of-the-art tracking algorithms.
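The two-stage procedure described in the abstract can be sketched roughly as follows. This is an illustrative simplification, not the paper's actual objective: the function names, the linear weighting of fragment scores by the occlusion prior, the Euclidean continuity penalty, and all parameters (`alpha`, `gamma`, `lam`) are assumptions made for the sketch. (The authors' released code is in MATLAB; Python with NumPy is used here purely for illustration.)

```python
import numpy as np

def stage1_estimate_state(candidates, clf_scores, occ_prior, prev_state,
                          alpha=1.0, gamma=0.1):
    """Stage 1 (sketch): pick the candidate state that maximizes the
    fragment classification scores, down-weighted by the occlusion prior,
    minus a temporal-continuity penalty (distance to the previous state).

    candidates : (n_candidates, 2) candidate object positions
    clf_scores : (n_candidates, n_fragments) per-fragment classifier scores
    occ_prior  : (n_fragments,) estimated probability each fragment is occluded
    """
    # Occluded fragments should contribute less evidence to the state estimate.
    weighted = clf_scores @ (1.0 - occ_prior)               # (n_candidates,)
    continuity = np.linalg.norm(candidates - prev_state, axis=1)
    objective = alpha * weighted - gamma * continuity
    best = int(np.argmax(objective))
    return candidates[best], float(objective[best])

def stage2_update_occlusion(frag_fg_scores, frag_bg_scores, prev_occ, lam=0.3):
    """Stage 2 (sketch): label a fragment as occluded when background
    evidence exceeds foreground evidence, then blend with the previous
    frame's labels to model temporal consistency of occlusion labels."""
    evidence = frag_bg_scores - frag_fg_scores   # > 0 suggests occlusion
    occ = (evidence > 0.0).astype(float)
    return (1.0 - lam) * occ + lam * prev_occ
```

A per-frame tracking loop would alternate the two stages: estimate the object state given the current occlusion labels, then re-estimate the occlusion labels given the new state.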
Date of publication 2016
Code Programming Language MATLAB

Copyright Researcher 2021