Real-time and robust visual tracking with scene-perceptual memory
Affiliation: 1. Department of Electrical Engineering, University of Souk Ahras, 41000 Souk Ahras, Algeria; 2. Department of Computer Science, University of Souk Ahras, 41000 Souk Ahras, Algeria; 3. Department of Electronics, Faculty of Engineering Sciences, Badji Mokhtar-Annaba University, Annaba 23000, Algeria
Abstract: Unmanned aerial vehicle (UAV) based aerial visual tracking is one of the research hotspots in computer vision. However, mainstream UAV trackers still have two shortcomings: (1) the accuracy of correlation filter trackers improves greatly only with more complex models, which impedes the accuracy-speed trade-off; (2) object occlusion and camera motion in aerial tracking scenes also seriously restrict the application of aerial tracking. To address these problems, and inspired by the AutoTrack tracker, we propose Fast-AutoTrack, an aerial correlation filtering tracker based on scene-perceptual memory. First, to perceive and judge tracking anomalies such as object occlusion and camera motion, a confidence score inspired by the peak-to-sidelobe ratio and AutoTrack is designed, which perceives and remembers the changing trend of the confidence and the local historical confidence. Second, after a tracking anomaly occurs, several search regions are predicted for object re-detection based on the local object motion trend and spatio-temporal context information. Finally, to accelerate model updating, a perceptual hashing algorithm (PHA) is used to measure the similarity of the search regions between two adjacent frames. On the typical aerial tracking datasets UAVDT, UAV123@10fps, and DTB70, Fast-AutoTrack runs 71.4% faster than AutoTrack with almost equal accuracy, showing a favorable accuracy-speed trade-off.
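The perceptual-hashing step mentioned in the abstract can be illustrated with the classic average-hash variant: downsample a grayscale patch to an 8x8 grid, threshold each cell against the mean, and compare two patches by the Hamming distance of their bit strings. This is a minimal sketch, not the paper's implementation; the function names, the 8x8 hash size, and the list-of-lists grayscale input are all assumptions for illustration.

```python
# Hedged sketch of an average-hash perceptual similarity check between
# the search regions of two adjacent frames. Input frames are assumed to
# be grayscale images given as 2D lists of intensities (hypothetical format).

def average_hash(gray, size=8):
    """Downsample to size x size by block averaging, then threshold at the mean."""
    h, w = len(gray), len(gray[0])
    blocks = []
    for by in range(size):
        for bx in range(size):
            ys = range(by * h // size, (by + 1) * h // size)
            xs = range(bx * w // size, (bx + 1) * w // size)
            vals = [gray[y][x] for y in ys for x in xs]
            blocks.append(sum(vals) / len(vals))
    mean = sum(blocks) / len(blocks)
    return [1 if v > mean else 0 for v in blocks]

def hash_similarity(hash_a, hash_b):
    """1.0 when the two hashes match bit-for-bit, 0.0 when every bit differs."""
    dist = sum(a != b for a, b in zip(hash_a, hash_b))
    return 1.0 - dist / len(hash_a)
```

If adjacent search regions hash to near-identical bit strings, the scene has barely changed, so an expensive model update could be skipped; a threshold on `hash_similarity` would govern that decision.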
Keywords: Visual tracking; Correlation filter; Scene-perceptual memory; Unmanned aerial vehicle; Aerial object tracking
This document is indexed in ScienceDirect and other databases.
|