- The full-range input of a 12-bit, successive-approximation type ADC is 1 volt. Determine:
a) the maximum input change required to give a one-bit change in the output of the ADC
The Attempt at a Solution
The ADC is 12-bit, so it will produce a reading between 000000000000 (12 zeros) and 111111111111 (12 ones):
output 000000000000 being 0 in decimal, 0V input
output 111111111111 being 4095 in decimal, 1V input
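That code-to-voltage mapping can be written down directly. A minimal sketch, using the convention above that code 4095 corresponds to 1 V (the function name `code_to_volts` is my own):

```python
FULL_SCALE_V = 1.0      # full-range input from the problem
MAX_CODE = 2**12 - 1    # 4095, twelve ones

def code_to_volts(code):
    """Map a 12-bit output code to the input voltage it represents."""
    return FULL_SCALE_V * code / MAX_CODE

print(code_to_volts(0))         # → 0.0
print(code_to_volts(MAX_CODE))  # → 1.0
```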
So looking at the output, the largest output change that could occur by changing one bit is flipping the MSB, i.e. 100000000000 (a one followed by 11 zeros).
Output 100000000000 is 2048 in decimal.
So 2048 / 4095 = 0.500122
1V * 0.500122 = 0.500122V
So I initially thought that was the answer: the maximum input change to give a one-bit change in output would be 0.500122 V.
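The MSB-flip arithmetic above can be sanity-checked in a few lines of Python (just reproducing the working so far, using the 4095 ↔ 1 V convention from this post):

```python
FULL_SCALE_V = 1.0    # full-range input from the problem
MAX_CODE = 2**12 - 1  # 4095, twelve ones
MSB_CODE = 2**11      # 2048, i.e. 100000000000

# Input change corresponding to flipping the MSB alone
msb_step_v = FULL_SCALE_V * MSB_CODE / MAX_CODE
print(round(msb_step_v, 6))  # → 0.500122
```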
However, this ADC rounds. My understanding is that the DAC inside the ADC produces an analogue value from a trial binary code, and this estimate is checked against the measured input. If the estimate is less than or equal to the measured input, the comparator outputs a 1; if it is greater, it outputs a 0.
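That comparison loop can be sketched in Python. This is a hypothetical simulation of my understanding of the successive-approximation process, not the actual hardware; the 4095 ↔ 1 V mapping follows the working above:

```python
def sar_adc(vin, bits=12, full_scale=1.0):
    """Successive approximation: test each bit from MSB down to LSB."""
    code = 0
    for i in range(bits - 1, -1, -1):
        trial = code | (1 << i)                        # tentatively set bit i
        estimate = full_scale * trial / (2**bits - 1)  # internal DAC output
        if estimate <= vin:                            # comparator: keep the bit
            code = trial
    return code

print(sar_adc(0.0))  # → 0
print(sar_adc(1.0))  # → 4095
```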
So, I think we are now concerned with the last bit in the chain, the 2^0 bit. If the measured input is 0.99 of a count or less above the estimate, the ADC will output a zero for the last bit. So we need to convert this 0.99 of a count onto our 1 V scale.
So 0.99 / 4095 = 0.000242.
1 V * 0.000242 = 0.000242 V
So 0.500122 + 0.000242 = 0.500364 V.
So my answer now is 0.500364 V, meaning the maximum the input can change to give a one-bit change in output is 0.500364 V.
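Reproducing that combined figure numerically, as a quick check of my own arithmetic (the reading of the 0.99 as 0.99 of a count is my assumption):

```python
FULL_SCALE_V = 1.0    # full-range input from the problem
MAX_CODE = 2**12 - 1  # 4095

msb_part_v = FULL_SCALE_V * 2048 / MAX_CODE  # flipping the MSB
lsb_part_v = FULL_SCALE_V * 0.99 / MAX_CODE  # 0.99 of a count, just below the last-bit threshold
print(round(msb_part_v + lsb_part_v, 6))     # → 0.500364
```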
Is this in the ballpark, or am I way off on a tangent?