Abstract: Time-interleaved analog-to-digital converters (TI-ADCs) offer high sampling rates by distributing the input signal across C parallel low-rate ADCs. We can achieve C times the sampling rate of a single ADC if the time shifts between the channels are identical. In practice, however, mismatch among the shifts cannot be avoided. Moreover, the samples are also subject to jitter noise. In this paper, we propose a blind method to mitigate the joint effects of sampling jitter and shift mismatch in the TI-ADC structure. We assume the input signal to be bandlimited and incorporate the jitter via a stochastic model. Next, we derive an approximate model based on a first-order Taylor series and use an iterative maximum likelihood estimator to reconstruct the uniform samples of the input signal. The simulation results show that, at the cost of a slight increase in the mean square error, we obtain a fast blind compensation algorithm.
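To make the sampling-error model concrete, the sketch below simulates a small TI-ADC whose channels suffer static shift mismatch and random jitter, and compares the exact non-uniform samples of a bandlimited tone with a first-order Taylor approximation around the ideal uniform grid. This is a minimal illustration under assumed values (C, the nominal period T, the mismatch and jitter levels, and the sinusoidal test input are all hypothetical); it shows only the kind of first-order sample model the abstract refers to, not the paper's reconstruction algorithm.

```python
import numpy as np

# Minimal sketch: TI-ADC samples with shift mismatch and jitter versus a
# first-order Taylor model. All parameter values below are illustrative
# assumptions, not values from the paper.

rng = np.random.default_rng(0)

C = 4                      # number of interleaved sub-ADCs
T = 1e-6                   # nominal aggregate sampling period
N = 64                     # samples per channel
f0 = 40e3                  # bandlimited test tone, well below 1/(2T)

x  = lambda t: np.sin(2 * np.pi * f0 * t)                   # input signal
dx = lambda t: 2 * np.pi * f0 * np.cos(2 * np.pi * f0 * t)  # its derivative

delta = rng.normal(0, 0.02 * T, size=C)   # static shift mismatch per channel
sigma_jitter = 0.01 * T                   # jitter standard deviation

mse_per_channel = []
for c in range(C):
    n = np.arange(N)
    t_ideal = (n * C + c) * T                    # ideal uniform instants, channel c
    xi = rng.normal(0, sigma_jitter, size=N)     # random sampling jitter
    exact = x(t_ideal + delta[c] + xi)           # what the sub-ADC actually samples
    taylor = x(t_ideal) + (delta[c] + xi) * dx(t_ideal)  # first-order Taylor model
    mse_per_channel.append(np.mean((exact - taylor) ** 2))

print("per-channel MSE of first-order Taylor model:", mse_per_channel)
```

For a bandlimited input with small timing perturbations, the first-order term captures most of the sampling error, which is what makes a linearized model a reasonable basis for the subsequent maximum likelihood reconstruction.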