Tang X M, Chen Z G, Fu Y. Anti-occlusion and re-tracking of real-time moving target based on kernelized correlation filter[J]. Opto-Electron Eng, 2020, 47(1): 190279. doi: 10.12086/oee.2020.190279

Anti-occlusion and re-tracking of real-time moving target based on kernelized correlation filter

    Fund Project: Supported by National Natural Science Foundation of China (61502203), Natural Science Foundation of Jiangsu Province (BK20150122), Natural Science Research Project of Jiangsu Higher Education Institutions (17KJB520039), and Scientific Research Project of "333 High-level Talent Cultivation Project" in Jiangsu Province (BRA2018147)
  • The correlation filtering algorithm determines the target position from the similarity between the template and the detection target. Since the correlation filtering concept was introduced to target tracking it has attracted wide attention, and the kernelized correlation filter (KCF) pushed the concept to a new height, becoming a research hotspot for its high speed, high precision, and high robustness. However, the kernelized correlation filter performs poorly when the target is occluded. This paper improves the anti-occlusion performance of the kernelized correlation filter and proposes an improved KCF algorithm based on the Sobel edge binary mode algorithm. The Sobel edge binary mode algorithm is used to fuse the target features by weighting, the peak-to-sidelobe ratio of the target response is used to judge whether the detection target is lost, and the Kalman algorithm is used as the strategy for handling target occlusion. The results show that the proposed method not only has better robustness against occlusion, but also satisfies real-time requirements and can accurately re-track the target.
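The feature-fusion step can be pictured with a short sketch. This is a minimal illustration under assumptions, not the authors' implementation: it assumes the "Sobel edge binary mode" feature reduces to a binarized Sobel gradient-magnitude map, that the HOG-based response has already been collapsed to a single-channel map, and that the function names, the weight `alpha`, and the threshold `thresh` are illustrative choices.

```python
# Sketch only: binarized Sobel edge map fused with a HOG response by weighting.
# The threshold and fusion weight are illustrative, not values from the paper.
import cv2
import numpy as np

def sobel_edge_binary(patch, thresh=50.0):
    """Binary edge map from the Sobel gradient magnitude of a BGR patch."""
    gray = cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)
    return (mag > thresh).astype(np.float32)

def fused_feature(patch, hog_map, alpha=0.6):
    """Weighted fusion of a single-channel HOG map with the binary edge map."""
    edge = sobel_edge_binary(patch)
    # Resize the edge map to the HOG map's grid before mixing the two cues.
    edge = cv2.resize(edge, (hog_map.shape[1], hog_map.shape[0]))
    return alpha * hog_map + (1.0 - alpha) * edge
```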
    [1] Wu X J, Xu T Y, Xu W B. Review of target tracking algorithms in video based on correlation filter[J]. Command Information System and Technology, 2017, 8(3): 1–5. doi: 10.15908/j.cnki.cist.2017.03.001

    [2] Henriques J F, Caseiro R, Martins P, et al. Exploiting the circulant structure of tracking-by-detection with kernels[C]// Proceedings of the 12th European Conference on Computer Vision, 2012: 702–715.

    [3] Comaniciu D, Meer P. Mean shift: a robust approach toward feature space analysis[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, 24(5): 603–619. doi: 10.1109/34.1000236

    [4] Mei X, Ling H B. Robust visual tracking and vehicle classification via sparse representation[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2011, 33(11): 2259–2272. doi: 10.1109/TPAMI.2011.66

    [5] Danelljan M, Häger G, Khan F S, et al. Accurate scale estimation for robust visual tracking[C]//Proceedings of the British Machine Vision Conference, 2014.

    [6] Grabner H, Grabner M, Bischof H. Real-time tracking via on-line boosting[C]//Proceedings of the British Machine Vision Conference, 2006, 1: 47–56.

    [7] Wang W, Wang C P, Fu Q, et al. Patch-based scale adaptive CSK tracking method[J]. Electronics Optics & Control, 2017, 24(2): 25–29. doi: 10.3969/j.issn.1671-637X.2017.02.005

    [8] Henriques J F, Caseiro R, Martins P, et al. High-speed tracking with kernelized correlation filters[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(3): 583–596. doi: 10.1109/TPAMI.2014.2345390

    [9] Danelljan M, Khan F S, Felsberg M, et al. Adaptive color attributes for real-time visual tracking[C]//Proceedings of 2014 IEEE Conference on Computer Vision and Pattern Recognition, 2014: 1090–1097.

    [10] Danelljan M, Häger G, Khan F S, et al. Learning spatially regularized correlation filters for visual tracking[C]//Proceedings of 2015 IEEE International Conference on Computer Vision, 2015: 4310–4318.

    [11] Bertinetto L, Valmadre J, Golodetz S, et al. Staple: complementary learners for real-time tracking[C]//Proceedings of 2016 IEEE Conference on Computer Vision and Pattern Recognition, 2016: 1401–1409.

    [12] Valmadre J, Bertinetto L, Henriques J, et al. End-to-end representation learning for correlation filter based tracking[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition, 2017: 5000–5008.

    [13] Danelljan M, Bhat G, Khan F S, et al. Eco: efficient convolution operators for tracking[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition, 2017: 6931–6939.

    [14] Bao X A, Zhan X J, Wang Q, et al. Anti occlusion target tracking algorithm based on KCF and SIFT feature[J]. Computer Measurement & Control, 2018, 26(5): 148–152. doi: 10.16526/j.cnki.11-4762/tp.2018.05.037

    [15] Yan H, Zhang Y, Yang X L, et al. A kernelized correlation filter with occlusion handling[J]. Journal of Optoelectronics·Laser, 2018, 29(6): 647–652. doi: 10.16136/j.joel.2018.06.0286

    [16] Li F, Tian C, Zuo W M, et al. Learning spatial-temporal regularized correlation filters for visual tracking[C]//Proceedings of 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2018: 4904–4913.

    [17] Li Y, Zhu J K, Hoi S C H, et al. Robust estimation of similarity transformation for visual object tracking[C]//Proceedings of the AAAI Conference on Artificial Intelligence, 2019, 33: 8666–8673.

    [18] Xu T Y, Feng Z H, Wu X J, et al. Learning adaptive discriminative correlation filters via temporal consistency preserving spatial feature selection for robust visual tracking[C]//Proceedings of 2019 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019.

    [19] Li B, Yan J J, Wu W, et al. High performance visual tracking with Siamese region proposal network[C]//Proceedings of 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2018: 8971–8980.

    [20] Li B, Wu W, Wang Q, et al. SiamRPN++: evolution of Siamese visual tracking with very deep networks[C]//Proceedings of 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019.

    [21] Shan Q W, Zheng X B, He X H, et al. Fast object detection and recognition algorithm based on improved multi-scale feature maps[J]. Laser & Optoelectronics Progress, 2019, 56(2): 55–62. doi: 10.3788/LOP56.021002

    [22] Fan H, Lin L T, Yang F, et al. LaSOT: a high-quality benchmark for large-scale single object tracking[C]//Proceedings of 2019 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019.

    [23] Faragher R. Understanding the basis of the Kalman filter via a simple and intuitive derivation[lecture notes][J]. IEEE Signal Processing Magazine, 2012, 29(5): 128–132. doi: 10.1109/MSP.2012.2203621

    [24] Bolme D S, Beveridge J R, Draper B A, et al. Visual object tracking using adaptive correlation filters[C]//Proceedings of 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2010: 2544–2550.

    [25] Li Y, Zhu J K. A scale adaptive kernel correlation filter tracker with feature integration[C]//Computer Vision-ECCV 2014 Workshops, 2014: 254–265.

  • Overview: Target tracking is a topic that has been discussed in depth by the current academic community. Its applications are broad, spanning surveillance, motion analysis, medical imaging, behavior recognition, and human-computer interaction. When the tracking target is occluded, the accuracy of current algorithms is not high, so target tracking remains an important topic in computer vision. The kernelized correlation filter is one of the most effective target tracking methods and has become a research hotspot for its high speed, high precision, and high robustness, with more and more researchers committed to optimizing the existing features so that the improved algorithms achieve good experimental results. The kernelized correlation filter mainly uses the histogram of oriented gradients (HOG) for feature extraction and determines the target position from the similarity between the template and the detection target. However, the inherent nature of the gradient makes the HOG of the target very sensitive to noise, and the target cannot be tracked by this algorithm when it is occluded. To overcome these shortcomings, this paper proposes an improved kernelized correlation filter that incorporates the Sobel edge binary mode algorithm. Firstly, the Sobel edge binary mode algorithm and the histogram of oriented gradients are used to fuse the target features by weighting, which strengthens the edge information in the HOG representation and makes the tracked target more distinct. Secondly, so that the Kalman prediction algorithm can accurately locate the target after it is occluded, the target position obtained by the kernelized correlation filter during unoccluded tracking is continuously fused with the target position obtained by the Kalman algorithm. Finally, the peak-to-sidelobe ratio of the target response is calculated to judge whether the detection target is lost; combined with the Kalman algorithm, the position of the target in the next frame can then be predicted from its state before it was lost, as sketched below. Six occlusion test videos are selected from the public Visual Tracker Benchmark for experiments. To verify the effectiveness of the proposed algorithm, it is implemented in MATLAB 2018b, and DSST, ECO, KCF, LDES, SRDCF, SAMF, and STRCF, all of which perform well, are selected as comparison algorithms. The final experimental results show that the proposed method improves accuracy when the target is occluded.
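The occlusion-handling strategy described above (peak-to-sidelobe test plus Kalman prediction) can be sketched as follows. This is a minimal illustration under assumptions, not the authors' code: the function names, the PSR exclusion window, the occlusion threshold `psr_thresh`, and the constant-velocity state model are illustrative choices, and the noise covariances are placeholder values.

```python
# Sketch only: PSR-based occlusion test with a constant-velocity Kalman fallback.
import cv2
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=11):
    """PSR = (peak - sidelobe mean) / sidelobe std, with a small window
    around the peak excluded from the sidelobe statistics."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    half = exclude // 2
    mask[max(0, py - half):py + half + 1, max(0, px - half):px + half + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-6)

def make_cv_kalman():
    """Constant-velocity Kalman filter over (x, y, vx, vy) with (x, y) measurements."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    kf.errorCovPost = np.eye(4, dtype=np.float32)
    return kf

def update_position(kf, response, kcf_pos, psr_thresh=8.0):
    """Trust the KCF position while PSR is high (and keep correcting the
    Kalman filter with it); fall back to the Kalman prediction otherwise."""
    predicted = kf.predict()[:2].ravel()
    if peak_to_sidelobe_ratio(response) >= psr_thresh:
        kf.correct(np.array(kcf_pos, np.float32).reshape(2, 1))
        return np.asarray(kcf_pos, np.float32), False   # target considered visible
    return predicted, True                               # target treated as occluded
```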

