ADC Signal-to-Noise Ratio (SNR)

If an Alternating Current (AC) signal is applied to an ideal Analog-to-Digital Converter (ADC), the only noise present in the digitized output is quantization error. For an ideal converter, the maximum error for any given input is ±½ Least Significant Bit (LSB). If a linear ramp signal is applied to the converter input and the output error is plotted for all analog inputs, the result is a sawtooth waveform with a peak-to-peak value of 1 LSB, as shown in the figure below:

adc-snr.PNG
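
The sawtooth behavior is easy to reproduce numerically. The short Python sketch below quantizes a linear ramp with an ideal rounding quantizer and confirms that the error spans roughly 1 LSB peak-to-peak; the 8-bit resolution and 0 to 1 V input range are illustrative assumptions, not values from the text.

```python
import numpy as np

# Assumed parameters for illustration only (not specified in the text)
n_bits = 8                       # ADC resolution in bits
full_scale = 1.0                 # input range 0..1 V
lsb = full_scale / 2**n_bits     # size of one LSB

ramp = np.linspace(0.0, full_scale - lsb, 100_000)  # slow linear ramp input
codes = np.round(ramp / lsb)                        # ideal rounding quantizer
error = ramp - codes * lsb                          # quantization error (sawtooth)

print("peak-to-peak error:", error.max() - error.min())   # approximately 1 LSB
print("1 LSB:", lsb)
```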

The Root-Mean-Square (RMS) amplitude of the error output can be approximated by the equation below.

(1)
\begin{align} ERROR_{RMS} = \frac{1}{\sqrt{12}} \cdot 1\,\mathrm{LSB} \end{align}
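
As a brief check of equation (1): over one period of the sawtooth the error e sweeps uniformly from -½ LSB to +½ LSB, so its mean-square value is the average of e² over that interval, giving

\begin{align} ERROR_{RMS} = \sqrt{\frac{1}{1\,\mathrm{LSB}} \int_{-\frac{1}{2}\mathrm{LSB}}^{+\frac{1}{2}\mathrm{LSB}} e^2 \, de} = \frac{1\,\mathrm{LSB}}{\sqrt{12}} \approx 0.289\,\mathrm{LSB} \end{align}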

The maximum theoretical Signal-to-Noise Ratio (SNR) for an ADC can be determined from the RMS quantization error derived above. If a Full-Scale (FS) sine wave is applied to the input of the ADC, the maximum theoretical SNR is given by the equation below, where N is the resolution of the ADC in bits. This formula assumes that the noise is measured over the entire usable bandwidth of the ADC (0 to fs/2, where fs is the sampling frequency). In the case of oversampling, where the signal bandwidth is less than the Nyquist bandwidth, the theoretical SNR of the ADC increases by 3 dB each time fs is doubled.

(2)
\begin{equation} SNR = 6.02 \cdot N + 1.76\,\mathrm{dB} \end{equation}
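
As a sanity check of equation (2), the Python sketch below compares the formula against a direct ratio of the RMS of a full-scale sine wave to the RMS quantization error from equation (1). The ±1 V full-scale range and the oversampling ratio are assumptions for illustration; the oversampling gain is applied as 10·log10(OSR), equivalent to the 3 dB per doubling of fs stated above.

```python
import numpy as np

# Assumed full-scale range of +/-1 V (sine amplitude 1 V); not taken from the text.
for n_bits in (8, 12, 16):
    lsb = 2.0 / 2**n_bits                 # LSB size for a 2 V full-scale range
    signal_rms = 1.0 / np.sqrt(2)         # RMS of a full-scale sine wave
    noise_rms = lsb / np.sqrt(12)         # RMS quantization error, equation (1)
    snr_direct = 20 * np.log10(signal_rms / noise_rms)
    snr_formula = 6.02 * n_bits + 1.76    # equation (2)
    print(f"N={n_bits:2d}: direct {snr_direct:6.2f} dB, formula {snr_formula:6.2f} dB")

# Oversampling: each doubling of fs adds ~3 dB, i.e. 10*log10(OSR) in total.
osr = 4                                   # assumed oversampling ratio
print(f"N=16, OSR={osr}: {6.02 * 16 + 1.76 + 10 * np.log10(osr):6.2f} dB")
```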