Linear LMS Compensation for Timing Mismatch in Time-Interleaved ADCs

Abstract: The time-interleaved architecture permits the implementation of high-frequency analog-to-digital converters (ADCs) by multiplexing the outputs of several time-shifted low-frequency ADCs. A key issue in the design of a time-interleaved ADC is the compensation of timing mismatch, i.e., the difference between the ideal and actual sampling instants. In this paper, we propose a compensation method that, as opposed to existing approaches, does not assume that the input signal is band limited; instead, it assumes that the input has a known stationary power spectrum. The compensation is then designed in a statistically optimal sense. This greatly reduces the compensation order required to achieve a given reconstruction accuracy. Moreover, under the band-limited assumption, the proposed method achieves perfect reconstruction if no constraint is imposed on the compensation order. Simulation results show that a rough estimate of the input spectrum can be used without significant performance loss, indicating that accurate knowledge of the input spectrum is not strictly required.
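To make the idea concrete, the sketch below (ours, not taken from the paper) designs an FIR corrector for a single sub-ADC channel with a fractional timing offset: the taps are fit, in a weighted least-squares sense, to a fractional-delay response, with the fitting error weighted by an assumed input power spectrum so that accuracy is concentrated where the signal has power. The function name design_timing_corrector, the tap count, and the example low-pass spectrum are illustrative assumptions; the paper's actual compensation structure may differ.

```python
import numpy as np

def design_timing_corrector(delta, n_taps=11, n_freq=512, psd=None):
    """Sketch of a spectrum-weighted FIR corrector for one sub-ADC channel.

    `delta` is the timing offset in sampling periods (positive = sampled late).
    The taps approximate a fractional delay of `delta` samples (on top of the
    nominal group delay of the FIR), with the approximation error weighted by
    an assumed input power spectrum `psd(omega)`.  All names and defaults here
    are illustrative assumptions, not the paper's construction.
    """
    omega = np.linspace(0, np.pi, n_freq, endpoint=False)              # frequency grid
    w = np.ones_like(omega) if psd is None else np.sqrt(psd(omega))    # spectral weights

    center = (n_taps - 1) / 2                                          # nominal FIR group delay
    n = np.arange(n_taps)

    A = np.exp(-1j * np.outer(omega, n))                               # per-tap frequency response
    d = np.exp(-1j * omega * (center + delta))                         # target: delay by center + delta

    # Weighted least-squares fit; stack real/imaginary parts so the taps are real.
    Aw = A * w[:, None]
    dw = d * w
    A_ri = np.vstack([Aw.real, Aw.imag])
    d_ri = np.concatenate([dw.real, dw.imag])
    h, *_ = np.linalg.lstsq(A_ri, d_ri, rcond=None)
    return h

# Example: correct a +2% sample-period skew, assuming a low-pass input spectrum.
taps = design_timing_corrector(delta=0.02, psd=lambda w: 1.0 / (1.0 + (w / 1.5) ** 4))
```

With a flat psd this reduces to a conventional unweighted fractional-delay design; weighting by the input spectrum is one way to see how exploiting spectral knowledge can lower the filter order needed for a given reconstruction accuracy.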
Keywords: