Broadband Signal Beamforming Algorithm Based on Wavelet Time Delay Estimation
Citation: LIU Ying, XIE Chi. Broadband Signal Beamforming Algorithm Based on Wavelet Time Delay Estimation[J]. Journal of University of Electronic Science and Technology of China (Natural Science Edition), 2018, 47(2): 183-188.
Authors: LIU Ying, XIE Chi
Affiliations: 1. School of Energy Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731; 2. College of Electronics and Information Engineering, Sichuan University, Chengdu 610065
Funding: National Natural Science Foundation of China (61401075)
Abstract: In array signal reception, the multipath effect causes the array antenna to receive the desired signal with different time delays, which makes high-accuracy beamforming a difficult problem in signal processing. This paper proposes a beamforming algorithm that takes these time delays into account. The algorithm first estimates the time delays of the wideband received signal with a wavelet algorithm, compares the estimates against prestored delay values to select the desired signal direction, and then iteratively optimizes the spatial multi-beams close to that direction, finally forming an overall adaptive beam in space. Simulation results verify the effectiveness of the improved algorithm.

Keywords: array signal; beamforming; broadband signal; time delay estimation; wavelet algorithm
Received: 2017-05-04
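
The abstract outlines a pipeline of wavelet-domain time delay estimation, comparison of the estimates against prestored values, and iterative optimization of the beams near the desired direction. Since the full paper is not reproduced here, the following Python sketch only illustrates the first and last ideas in their simplest form: delays are estimated by cross-correlating wavelet approximation coefficients (using the PyWavelets and SciPy libraries), and the beamformer is a plain delay-and-sum. The function names, the db4 wavelet, the array geometry, and all parameters are illustrative assumptions, not the authors' implementation; the prestored-value comparison and the iterative multi-beam optimization are paper-specific and omitted.

    import numpy as np
    import pywt
    from scipy.signal import correlate

    def estimate_delay(x, ref, wavelet="db4", level=3):
        # Hypothetical wavelet-domain delay estimator: cross-correlate the
        # approximation coefficients of the two channels, then undo the
        # 2**level decimation of the wavelet transform.
        ax = pywt.wavedec(x, wavelet, level=level)[0]
        ar = pywt.wavedec(ref, wavelet, level=level)[0]
        c = correlate(ax, ar, mode="full")
        lag = int(np.argmax(np.abs(c))) - (len(ar) - 1)
        return lag * 2 ** level  # delay in samples, coarse resolution

    def delay_and_sum(channels, delays):
        # Broadband delay-and-sum: advance each channel by its estimated
        # delay and average. np.roll wraps around, acceptable for a toy.
        out = np.zeros_like(channels[0])
        for x, d in zip(channels, delays):
            out += np.roll(x, -d)
        return out / len(channels)

    # Toy usage: a wideband chirp reaching a 4-sensor line array with
    # per-sensor delays of 0, 8, 16, 24 samples plus weak noise.
    rng = np.random.default_rng(0)
    fs, n = 8000, 4096
    t = np.arange(n) / fs
    ref = np.sin(2 * np.pi * (300.0 + 200.0 * t) * t)
    true_delays = [0, 8, 16, 24]
    channels = [np.roll(ref, d) + 0.05 * rng.standard_normal(n)
                for d in true_delays]
    estimates = [estimate_delay(x, channels[0]) for x in channels]
    y = delay_and_sum(channels, estimates)  # aligned, averaged output

With accurate estimates the delayed copies add coherently after compensation; in the paper, the beams near the selected direction would then be refined iteratively rather than simply averaged.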
