
12 Bit ADC Output Change

  1. May 26, 2016 #1
    1. The problem statement, all variables and given/known data
    1. The full range input of a 12-bit, successive-approximation type ADC is 1 volt. Determine:

      a) the maximum input change required to give a one bit change in output of the ADC

    2. Relevant equations
    None

    3. The attempt at a solution
    The ADC is 12-bit, so it will produce a reading from 000000000000 (12 zeros) to 111111111111 (12 ones).

    output 000000000000 being 0 in decimal, 0V input
    output 111111111111 being 4095 in decimal, 1V input

    So looking at the output, the maximum output change that could occur by changing 1 of the bits is 100000000000 (1 one and 11 zeros)

    output 100000000000 is 2048.

    So 2048 / 4095 = 0.500122

    1V * 0.500122 = 0.500122V

    So I initially thought that's the answer: the maximum input change to give a 1-bit change in output would be 0.500122V.
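
    If it helps to sanity-check that arithmetic, here's a minimal Python sketch of the numbers above (the constant names are just mine, and I'm assuming the 0-to-4095 scaling used above):

    Code:
    N_BITS = 12
    FULL_SCALE = 1.0            # volts, full-range input
    max_code = 2**N_BITS - 1    # 4095, i.e. twelve ones
    msb_code = 2**(N_BITS - 1)  # 2048, i.e. 100000000000 in binary

    # input change corresponding to the MSB alone, on the 0..4095 scale
    msb_change = FULL_SCALE * msb_code / max_code
    print(round(msb_change, 6))  # 0.500122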

    However, this ADC is rounding. So I understand that the successive-approximation logic in the ADC produces a binary number. This is then converted back to an analogue value by the internal DAC and checked against the measured value. If this estimated value is equal to or less than the measured input, then the comparator would output a 1; if it's more, it will output a 0.
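
    To picture that comparator behaviour, here is a rough Python sketch of the successive-approximation search as described above (the "keep the bit if the estimate is equal to or less than the input" rule is taken from that description; the divide-by-4096 internal DAC scaling is my own assumption):

    Code:
    N_BITS = 12
    FULL_SCALE = 1.0  # volts

    def sar_convert(v_in):
        # successive approximation: try each bit from the MSB (2^11) down to the LSB (2^0)
        code = 0
        for bit in range(N_BITS - 1, -1, -1):
            trial = code | (1 << bit)                  # tentatively set this bit
            estimate = trial * FULL_SCALE / 2**N_BITS  # internal DAC output for the trial code
            if estimate <= v_in:                       # comparator: keep the bit
                code = trial
        return code

    print(bin(sar_convert(0.5000)))  # 0b100000000000 (2048)
    print(bin(sar_convert(0.5003)))  # 0b100000000001 (2049) - only the last bit changes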

    So, I think now we are concerned with the last bit in the chain, being 2^0. So, if the measured input is 0.99 of a count or less, then the ADC will output a zero for the last bit. So we need to convert this 0.99 into our 1V scale.

    So 0.99/4095 = 0.000242.

    1V * 0.000242 = 0.000242V

    So 0.500122 + 0.000242 = 0.500364V.

    So my answer now is 0.50036V, meaning the max the input can change to give a 1-bit change in output is 0.50036V.

    Is this in the ballpark, or am I way off on a tangent?

    Kr
    Craig
     
  2. May 26, 2016 #2
    Your interpretation of the question is interesting. Commonly, a 1-bit change in the output of an ADC refers to the LSB, not the MSB. But if the question is taken literally, you are right - the voltage change would be about 0.500VDC (or, as you state, 0.500122VDC).

    As far as the additional error is concerned, it could be off by as much as twice the non-linearity specified for the part.
     
  3. May 26, 2016 #3
    Hey Scott
    Thanks for your reply. It is strange. The worked example in the literature is for a minimum change in output. What's thrown me is that the book specifically says the least amount of change to produce a 1-bit change, but the question here clearly states maximum change in input to produce a 1-bit change in output.

    But I agree - what is the purpose of knowing what maximum change would produce a 1-bit change? Unless the question is trying to get us to show our understanding of the binary system.

    I have asked the uni to confirm whether they want MSB or LSB. It may be the least significant bit, but the catch might be in taking the rounding errors into account.

    Has anyone else done this Q?
     
  4. May 28, 2016 #4

    rude man

    Homework Helper, Gold Member

    I read the question differently. It asks for the maximum voltage change necessary to change the output by 1 bit. They mean 1 LSB.

    So we start with some DC input voltage and gradually increase (or decrease) it until the change is enough to move the output up (or down) by 1 LSB. That maximum voltage change is 1V/4096. (The worst-case starting point is just above the last trip point, before the input is raised.)
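
    If it helps, here is a quick numerical check of that 1 LSB figure, using an ideal quantiser (the names and the exact trip-point convention are my own assumptions; a real part would add the non-linearity mentioned above):

    Code:
    N_BITS = 12
    FULL_SCALE = 1.0                  # volts
    lsb = FULL_SCALE / 2**N_BITS      # 1 V / 4096, about 244 microvolts

    def adc_code(v):
        # ideal quantiser with trip points spaced 1 LSB apart
        return min(int(v / lsb), 2**N_BITS - 1)

    v_start = 10 * lsb + 1e-9                          # just above a trip point (worst case)
    print(adc_code(v_start), adc_code(v_start + lsb))  # 10 11 -> one count change after ~1 LSB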
     