Article ID | Journal | Published Year | Pages
---|---|---|---
541986 | Microelectronics Journal | 2010 | 8 Pages
Abstract
Dither-based digital background calibration algorithms have been used to eliminate the influence of linear and nonlinear errors in pipelined ADCs. However, such algorithms suffer from two disadvantages: slow convergence and a reduction of the transmitted signal's amplitude in the analog circuits due to dither injection. In this paper, an input-dependent variable-amplitude dither-based algorithm is used to overcome both disadvantages. The proposed algorithm is implemented in a 14-bit, 100 MHz sample-rate pipelined ADC. Simulation results show a signal-to-noise-and-distortion ratio (SINAD) of 76.56 dB after calibration of linear and nonlinear errors, and the convergence speed is substantially improved.
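To make the mechanism concrete, the following is a minimal numerical sketch of the correlation principle behind dither-based gain calibration with an input-dependent dither amplitude. It is an assumed illustration, not the paper's exact algorithm: the single-stage model, the headroom-based scaling rule for the dither amplitude, and the interstage gain value of 1.98 are all hypothetical choices for demonstration.

```python
import numpy as np

# Minimal sketch (assumed model, not the paper's exact scheme) of dither-based
# gain calibration for one pipelined-ADC stage, with an input-dependent dither
# amplitude so the injected dither never eats into the signal's full-scale range.

rng = np.random.default_rng(1)
n = 2_000_000

g_true = 1.98                       # actual interstage gain (ideal would be 2.0)
vin = rng.uniform(-1.0, 1.0, n)     # stage input samples, full scale +/-1
pn = rng.choice([-1.0, 1.0], n)     # known pseudo-random dither sign sequence

# Input-dependent amplitude: scale the dither to the headroom the signal
# leaves, so |vin + d| never exceeds full scale (no signal-range reduction).
amp = 0.25 * (1.0 - np.abs(vin))
d = amp * pn

residue = g_true * (vin + d)        # analog residue seen by the backend stages

# Because vin and pn are uncorrelated, correlating the residue against the
# known dither isolates the gain: E[residue * pn * amp] = g_true * E[amp^2].
g_est = np.mean(residue * pn * amp) / np.mean(amp**2)

# Digital correction: divide out the estimated gain, subtract the known dither.
corrected = residue / g_est - d

print(f"g_est = {g_est:.4f} (true {g_true})")
print(f"max |vin + d| = {np.max(np.abs(vin + d)):.3f} (within full scale)")
print(f"rms output error = {np.sqrt(np.mean((corrected - vin)**2)):.2e}")
```

Because the dither amplitude scales with the headroom the input leaves, the dithered signal stays within full scale, addressing the amplitude-reduction disadvantage; and since a larger average dither lowers the variance of the correlation estimate, it also suggests the intuition behind the claimed convergence-speed improvement.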
Related Topics
Physical Sciences and Engineering
Computer Science
Hardware and Architecture
Authors
Shuo Yang, Jun Cheng, Pei Wang