Show simple item record

dc.contributor.author	She, Ying	en_US
dc.contributor.author	Yi, Yang	en_US
dc.contributor.author	Gu, Jialiang	en_US
dc.contributor.editor	Eisemann, Elmar and Jacobson, Alec and Zhang, Fang-Lue	en_US
dc.description.abstract	Correlation filters (CF) achieve excellent performance in visual tracking but suffer from undesired boundary effects. A significant number of approaches focus on enlarging the search region to compensate for this shortcoming. However, this introduces excessive background noise and misleads the filter into learning from ambiguous information. In this paper, we propose a novel target-adaptive correlation filter (TACF) that incorporates context and spatial-temporal regularizations into the CF framework, thus learning a more robust appearance model in the case of large appearance variations. Besides, it can be effectively optimized via the alternating direction method of multipliers (ADMM), thus achieving a globally optimal solution. Finally, an adaptive updating strategy is presented to discriminate unreliable samples and alleviate the contamination caused by these training samples. Extensive evaluations on the OTB-2013, OTB-2015, VOT-2016, VOT-2017 and TC-128 datasets demonstrate that our TACF is very promising for various challenging scenarios compared with several state-of-the-art trackers, with real-time performance of 20 frames per second (fps).	en_US
dc.publisher	The Eurographics Association and John Wiley & Sons Ltd.	en_US
dc.subject	Computing methodologies
dc.subject	Motion capture
dc.subject	Appearance and texture representations
dc.title	Learning Target-Adaptive Correlation Filters for Visual Tracking	en_US
dc.description.seriesinformation	Computer Graphics Forum
dc.description.sectionheaders	Tracking and Saliency
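The abstract above builds on the discriminative correlation-filter (CF) framework. As a point of reference for that framework (not the paper's TACF formulation, whose context and spatial-temporal regularization terms and ADMM solver are not reproduced here), a minimal single-channel MOSSE-style correlation filter can be sketched as follows; the function names and the regularization weight `lam` are illustrative assumptions:

```python
import numpy as np

def train_filter(patch, target_response, lam=1e-2):
    """Closed-form ridge-regression CF in the Fourier domain (MOSSE-style).

    Illustrative sketch of the basic CF framework only; the paper's TACF
    adds context and spatial-temporal regularization solved via ADMM.
    """
    F = np.fft.fft2(patch)             # image patch spectrum
    G = np.fft.fft2(target_response)   # desired (e.g. Gaussian) response spectrum
    # Regularized least-squares solution for the conjugate filter H*:
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def detect(patch, H_conj):
    """Correlation response map: element-wise product in the Fourier domain."""
    F = np.fft.fft2(patch)
    return np.real(np.fft.ifft2(H_conj * F))

# Usage sketch: train on a patch with a Gaussian label centered on the
# target; the response peak on the same patch recovers that location.
rng = np.random.default_rng(0)
patch = rng.standard_normal((32, 32))
yy, xx = np.mgrid[0:32, 0:32]
label = np.exp(-((yy - 16) ** 2 + (xx - 16) ** 2) / (2 * 2.0 ** 2))
H = train_filter(patch, label)
response = detect(patch, H)
peak = np.unravel_index(np.argmax(response), response.shape)
```

The periodic assumption of the FFT-based correlation is the source of the boundary effects the abstract refers to: the filter implicitly treats the patch as tiling the plane, which spatially regularized methods such as TACF are designed to counteract.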

This item appears in the following Collection(s)

  • 39-Issue 7
    Pacific Graphics 2020 - Symposium Proceedings
