Underwater multi-target tracking using imaging sonar

Cite this article: Dan-xiang JING, Jun HAN, Zhi-wei XU, Ying CHEN. Underwater multi-target tracking using imaging sonar [J]. Journal of Zhejiang University (Engineering Science), 2019, 53(4): 753-760.
Authors: Dan-xiang JING (荆丹翔), Jun HAN (韩军), Zhi-wei XU (徐志伟), Ying CHEN (陈鹰)
Affiliation: Ocean College, Zhejiang University, Zhoushan 316021, Zhejiang, China

Abstract: An efficient target tracking algorithm based on imaging sonar was proposed to solve the problem of underwater multi-target tracking. According to the imaging characteristics of the sonar, an echo signal model based on signal intensity was established for each pixel of the acoustic image in order to extract the individual targets from the images. Sequential Monte Carlo probability hypothesis density (SMC-PHD) filtering was applied to the target states, and the Auction track recognition algorithm was used to associate the filtered target states with the confirmed tracks, so that multi-target tracking was realized. Simulation analysis showed that the proposed method was much more computationally efficient than multi-target tracking algorithms based on data association, e.g., joint probabilistic data association (JPDA) and multiple hypothesis tracking (MHT). A field experiment was conducted to collect sonar data, and the tracking trajectories of all the targets were obtained after target extraction and tracking.
Keywords: imaging sonar; target extraction; multi-target tracking; track recognition; target trajectory
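
The two processing stages named in the abstract, SMC-PHD filtering of the per-frame detections and Auction-based association of the filtered states with confirmed tracks, are standard techniques in the multi-target tracking literature. The Python sketch below is only a minimal, generic illustration of that combination and is not the authors' implementation: the constant-velocity motion model, the Gaussian position likelihood, the uniform birth region, and all numeric parameters (survival and detection probabilities, clutter intensity, particle counts) are assumptions chosen for the example.

import numpy as np

# ---- assumed models and parameters (illustrative only) ----------------------
DT = 1.0                      # frame interval [s]
F = np.array([[1.0, 0.0, DT, 0.0],      # constant-velocity transition for [x, y, vx, vy]
              [0.0, 1.0, 0.0, DT],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
Q_STD = 0.5                   # process-noise standard deviation
R_STD = 1.0                   # measurement-noise standard deviation on (x, y)
P_SURVIVE = 0.99              # target survival probability
P_DETECT = 0.95               # detection probability
CLUTTER = 1e-3                # clutter intensity kappa(z)
BIRTH_RATE = 0.1              # expected number of newborn targets per frame
N_BIRTH = 200                 # birth particles spawned per frame

def smc_phd_step(particles, weights, measurements, area):
    """One SMC-PHD predict/update step.
    particles: (N, 4) array, weights: (N,) array, measurements: (M, 2) array,
    area: ((xmin, xmax), (ymin, ymax)) surveillance region used for uniform births."""
    # Predict: propagate surviving particles through the motion model.
    pred = particles @ F.T + np.random.normal(0.0, Q_STD, size=particles.shape)
    w_pred = P_SURVIVE * weights
    # Birth: spawn particles uniformly over the surveillance region.
    (xmin, xmax), (ymin, ymax) = area
    births = np.column_stack([np.random.uniform(xmin, xmax, N_BIRTH),
                              np.random.uniform(ymin, ymax, N_BIRTH),
                              np.zeros(N_BIRTH), np.zeros(N_BIRTH)])
    pred = np.vstack([pred, births])
    w_pred = np.concatenate([w_pred, np.full(N_BIRTH, BIRTH_RATE / N_BIRTH)])
    # Update: Gaussian likelihood of every measurement for every particle position.
    d2 = ((measurements[:, None, :] - pred[None, :, :2]) ** 2).sum(-1)       # (M, N)
    lik = np.exp(-0.5 * d2 / R_STD ** 2) / (2.0 * np.pi * R_STD ** 2)
    denom = CLUTTER + P_DETECT * (lik * w_pred).sum(axis=1)                  # (M,)
    w_upd = ((1.0 - P_DETECT) + (P_DETECT * lik / denom[:, None]).sum(axis=0)) * w_pred
    # Resample to a bounded particle count; the total weight estimates the target number.
    n_targets = int(round(w_upd.sum()))
    n_keep = 500 * max(n_targets, 1)
    idx = np.random.choice(len(pred), size=n_keep, p=w_upd / w_upd.sum())
    return pred[idx], np.full(n_keep, w_upd.sum() / n_keep), n_targets

def auction_assign(benefit, eps=1e-3):
    """Forward auction algorithm (Bertsekas) for the assignment problem.
    benefit[i, j] is the gain of assigning state estimate i to track j;
    the number of rows must not exceed the number of columns."""
    n, m = benefit.shape
    prices = np.zeros(m)
    owner = -np.ones(m, dtype=int)        # track j -> estimate currently holding it
    assignment = -np.ones(n, dtype=int)   # estimate i -> track assigned to it
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        values = benefit[i] - prices
        j = int(np.argmax(values))
        best = values[j]
        second = np.partition(values, -2)[-2] if m > 1 else best
        if owner[j] >= 0:                 # outbid and release the current owner
            assignment[owner[j]] = -1
            unassigned.append(owner[j])
        owner[j] = i
        assignment[i] = j
        prices[j] += best - second + eps  # raise the price by the bidding increment
    return assignment

In a pipeline following the abstract, the point detections extracted from each sonar image would be passed to smc_phd_step, and auction_assign would then be called with a benefit matrix such as the negative distance between the filtered state estimates and the predicted positions of the existing tracks.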