Two-bit correlation: an adaptive time delay estimation
Authors: Liang-Min Wang; K.K. Shung; O.L. Camps
Affiliation: Dept. of Comput. Sci. & Eng., Pennsylvania State Univ., University Park, PA
Abstract: Time delay estimation is an important operation in ultrasound time-domain flow mapping and in correcting the phase aberration of an array transducer. As interest grows in applying one-and-a-half-dimensional (1.5-D) and two-dimensional (2-D) array transducers to improve image quality and to enable three-dimensional (3-D) imaging, the need for simple, fast, and sufficiently accurate algorithms for real-time time delay estimation becomes critical. In this paper, we present an adaptive time delay estimation algorithm that reduces the noise sensitivity associated with one-bit correlation while retaining its simplicity of implementation. The algorithm converts each sample into a two-bit representation consisting of the sign of the sample and a bit indicating whether its magnitude exceeds an adaptively selected threshold. A bit-pattern correlation is then applied to find the time delay between the two signals. Using misregistration as the performance criterion, we show that the proposed algorithm is less susceptible to noise than one-bit correlation. Analytical results show that the reduction in misregistration achieved by two-bit correlation over its one-bit counterpart is consistent over a wide range of noise levels; this is achieved by adaptively adjusting the threshold to accommodate signal corruption due to noise. The analytical results are corroborated by simulations that model blood as a random distribution of red blood cells. Finally, we present a memory-based architecture for implementing the two-bit correlation algorithm whose computation time does not depend on the time delay between the signals to be correlated.
Keywords:
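The abstract describes the two-bit scheme only at a high level. The Python sketch below illustrates one plausible reading of it: each sample is reduced to a sign bit plus a bit flagging whether its magnitude exceeds an adaptively chosen threshold, and the delay estimate is the lag that maximizes bit-pattern agreement between the two quantized signals. The threshold rule (a multiple of the mean absolute amplitude), the matching score, and the function names are illustrative assumptions, not the implementation described in the paper.

```python
import numpy as np

def two_bit_quantize(x, k=1.0):
    """Reduce each sample to two bits: a sign bit and a magnitude bit.

    The magnitude bit flags samples whose absolute value exceeds an
    adaptively selected threshold.  Here the threshold is k * mean(|x|);
    the paper's actual adaptation rule is not given in the abstract.
    """
    threshold = k * np.mean(np.abs(x))            # assumed adaptive rule
    sign_bit = (x >= 0).astype(np.int8)           # 1 if non-negative, else 0
    mag_bit = (np.abs(x) > threshold).astype(np.int8)
    return sign_bit, mag_bit

def two_bit_correlate(ref, sig, max_lag):
    """Estimate the delay of `sig` relative to `ref` by bit-pattern matching.

    For each candidate lag, the score is the fraction of overlapping samples
    whose sign and magnitude bits both agree; the lag with the highest score
    is returned.  This matching rule is an illustrative choice only.
    """
    ref_s, ref_m = two_bit_quantize(ref)
    sig_s, sig_m = two_bit_quantize(sig)
    n = len(ref)
    best_lag, best_score = 0, -1.0
    for lag in range(-max_lag, max_lag + 1):
        lo, hi = max(0, -lag), min(n, n - lag)    # compare ref[i] with sig[i + lag]
        if hi <= lo:
            continue
        a, b = slice(lo, hi), slice(lo + lag, hi + lag)
        score = np.mean((ref_s[a] == sig_s[b]) & (ref_m[a] == sig_m[b]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Toy example: a windowed sinusoidal echo delayed by 7 samples plus noise.
rng = np.random.default_rng(0)
t = np.arange(256)
ref = np.sin(2 * np.pi * t / 32) * np.exp(-((t - 128) / 40) ** 2)
sig = np.roll(ref, 7) + 0.2 * rng.standard_normal(t.size)
print(two_bit_correlate(ref, sig, max_lag=20))    # should recover a delay near 7
```

Because only the two-bit codes enter the correlation, the per-lag score reduces to bitwise comparisons and a population count, which is what makes a simple memory-based hardware realization of the kind mentioned in the abstract attractive.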