Learning Target-Adaptive Correlation Filters for Visual Tracking

Authors: She, Ying; Yi, Yang; Gu, Jialiang
Editors: Eisemann, Elmar; Jacobson, Alec; Zhang, Fang-Lue
Date: 2020 (accessioned/available 2020-10-29)
ISSN: 1467-8659
DOI: 10.1111/cgf.14153 (https://doi.org/10.1111/cgf.14153)
Handle: https://diglib.eg.org:443/handle/10.1111/cgf14153
Pages: 387-397

Abstract: Correlation filters (CF) achieve excellent performance in visual tracking but suffer from undesired boundary effects. Many approaches enlarge the search region to compensate for this shortcoming, but doing so introduces excessive background noise and misleads the filter into learning from ambiguous information. In this paper, we propose a novel target-adaptive correlation filter (TACF) that incorporates context and spatial-temporal regularizations into the CF framework, thus learning a more robust appearance model in the presence of large appearance variations. Moreover, the model can be optimized efficiently via the alternating direction method of multipliers (ADMM), yielding a globally optimal solution. Finally, an adaptive updating strategy is presented to identify unreliable samples and alleviate the contamination they introduce into training. Extensive evaluations on the OTB-2013, OTB-2015, VOT-2016, VOT-2017 and TC-128 datasets demonstrate that TACF is highly promising in a variety of challenging scenarios compared with several state-of-the-art trackers, with real-time performance of 20 frames per second (fps).

CCS Concepts: Computing methodologies → Tracking; Motion capture; Appearance and texture representations
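To make the CF machinery referenced in the abstract concrete, below is a minimal NumPy sketch of a generic Fourier-domain correlation filter with a ridge regularizer and a confidence-gated model update, in the spirit of MOSSE-style trackers. It is an illustrative baseline only, not the paper's TACF: the context and spatial-temporal regularization terms and the ADMM solver are omitted, and the names and values here (`psr`, `psr_min`, `eta`) are assumptions rather than anything specified by the authors.

```python
import numpy as np

def gaussian_label(shape, sigma=2.0):
    """Desired response: a Gaussian peak at the target centre, FFT-shifted
    so the peak sits at (0, 0) to match the FFT's circular convention."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    g = np.exp(-((ys - h // 2) ** 2 + (xs - w // 2) ** 2) / (2.0 * sigma ** 2))
    return np.fft.fft2(np.fft.ifftshift(g))

def train_filter(patch, label_f, lam=1e-2):
    """Closed-form ridge-regression CF in the Fourier domain:
    H = (G * conj(F)) / (F * conj(F) + lambda)."""
    F = np.fft.fft2(patch)
    return (label_f * np.conj(F)) / (F * np.conj(F) + lam)

def detect(filt_f, patch):
    """Correlate the filter with a search patch; return the response map and
    its peak-to-sidelobe ratio (PSR), a common tracking-confidence measure."""
    F = np.fft.fft2(patch)
    resp = np.real(np.fft.ifft2(filt_f * F))
    peak = resp.max()
    sidelobe = resp[resp < peak]  # crude sidelobe: everything but the peak
    psr = (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)
    return resp, psr

def update_model(filt_f, new_filt_f, psr, eta=0.02, psr_min=6.0):
    """Adaptive update: blend in the new filter only when the response is
    confident, so unreliable samples do not contaminate the model.
    (eta and psr_min are illustrative values, not from the paper.)"""
    if psr < psr_min:
        return filt_f  # skip the update on an unreliable frame
    return (1.0 - eta) * filt_f + eta * new_filt_f
```

For a grayscale patch `x`, `H = train_filter(x, gaussian_label(x.shape))` learns the filter; `detect(H, z)` localizes the target in a new patch `z` via the response peak, and the PSR gate mirrors, at a much simpler level, the abstract's idea of discriminating unreliable training samples before updating the appearance model.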