
12 Bit ADC Output Change

  • Thread starter cjm181
  • #1

Homework Statement


  1. The full range input of a 12-bit, successive-approximation type ADC is 1 volt. Determine:

    a) the maximum input change required to give a one bit change in output of the ADC

Homework Equations


None

The Attempt at a Solution


The ADC is 12-bit, so it will produce a reading between 000000000000 (12 zeros) and 111111111111 (12 ones).

output 000000000000 being 0 in decimal, 0V input
output 111111111111 being 4095 in decimal, 1V input

So looking at the output, the maximum output change that could occur by changing 1 of the bits is 100000000000 (1 one and 11 zeros)

output 100000000000 is 2048.

So 2048 / 4095 = 0.500122

1V * 0.500122 = 0.500122V

So I initially thought that's the answer: the maximum input change to give a 1-bit change in output would be 0.500122 V.
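A quick numeric check of that arithmetic (a Python sketch, assuming the 1 V full scale and the 4095-count denominator used above):

```python
# Sanity check of the MSB interpretation: a 12-bit ADC spans codes
# 0..4095 over a 1 V full-scale input.
FULL_SCALE_V = 1.0
N_BITS = 12
MAX_CODE = 2**N_BITS - 1                 # 4095

msb_code = 2**(N_BITS - 1)               # 100000000000 in binary = 2048
msb_change_v = FULL_SCALE_V * msb_code / MAX_CODE
print(f"Input change for an MSB-only change in code: {msb_change_v:.6f} V")
# prints 0.500122 V
```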

However, this ADC is rounding. As I understand it, the ADC produces a trial binary number, which its DAC converts back to an analogue value that is then checked against the measured input. If this estimated value is equal to or less than the measured input, the comparator outputs a 1; if it is more, it outputs a 0.
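For reference, the successive-approximation loop described here can be sketched in a few lines of Python. This is a simplified model, not any particular part's behaviour; it assumes the internal DAC produces code/4096 of full scale and that the comparator keeps a trial bit when that estimate is at or below the input:

```python
def sar_convert(v_in, full_scale=1.0, n_bits=12):
    """Simplified successive-approximation conversion.

    Each bit is tried from MSB to LSB: set the bit, compare the DAC
    estimate against the input, and keep the bit only if the estimate
    does not exceed the input.
    """
    code = 0
    for bit in range(n_bits - 1, -1, -1):
        trial = code | (1 << bit)
        dac_estimate = full_scale * trial / 2**n_bits
        if dac_estimate <= v_in:     # comparator: estimate is not too high
            code = trial
    return code

print(sar_convert(0.0))        # 0
print(sar_convert(0.500122))   # 2048 (mid-scale)
print(sar_convert(1.0))        # 4095 (full scale)
```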

So I think we are now concerned with the last bit in the chain, 2^0. If the measured input sits 0.99 of a count or less above the estimate, the ADC will still output a zero for that last bit. So we need to convert this 0.99 of a count onto our 1 V scale.

so 0.99 / 4095 = 0.000242.

1 V * 0.000242 = 0.000242 V

So 0.500122 + 0.000242 = 0.500364 V.

So my answer now is about 0.50036 V, meaning the maximum the input can change to give a 1-bit change in output is roughly 0.50036 V.

Is this in the ballpark, or am I way off on a tangent?

Kr
Craig
 

Answers and Replies

  • #2
.Scott
Homework Helper
Your interpretation of the question is interesting. Commonly, a 1-bit change in the output of an ADC refers to the LSB, not the MSB. But if the question is taken literally, you are right - the voltage change would be about 0.500 VDC (or, as you state, 0.500122 VDC).

As far as the additional error is concerned, it could be off by as much as twice the non-linearity specified for the part.
 
  • #3
cjm181
Hey Scott
Thanks for your reply. It is strange. The worked example in the literature is for a minimum change in output. What's thrown me is that the book specifically says the least amount of change to produce a 1-bit change, but the question here clearly states the maximum change in input to produce a 1-bit change in output.

But I agree: what is the purpose of knowing what maximum change would produce a 1-bit change, unless the question is trying to get us to show our understanding of the binary system?

I have asked the uni to confirm whether they want MSB or LSB. It may be the least significant bit, but the catch might be taking the rounding errors into account.

Has anyone else done this Q?
 
  • #4
rude man
Homework Helper
Insights Author
Gold Member
I read the question differently. It asks for the maximum voltage necessary to change the output by 1 bit. They mean 1 LSB.

So we start with some DC input voltage and gradually increase (or decrease) it until the change is enough to move the output up (or down) by 1 LSB. That maximum voltage change is 1 V / 4096. (The worst-case starting point is just above the last trip point before the input is raised.)
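A small numeric illustration of that worst case (a Python sketch using an idealised quantiser, code = floor(4096 * Vin / 1 V), which is an assumption rather than anything stated in the thread):

```python
import math

FULL_SCALE_V = 1.0
LEVELS = 2**12                       # 4096 quantisation levels
LSB_V = FULL_SCALE_V / LEVELS        # 1 V / 4096, about 244 uV

def code(v_in):
    """Idealised 12-bit quantiser: which step the input lands on."""
    return min(math.floor(v_in * LEVELS / FULL_SCALE_V), LEVELS - 1)

# Worst case: start just above a trip point, e.g. just past the 0 -> 1 transition.
v_start = LSB_V + 1e-9
print(code(v_start), code(v_start + LSB_V))   # 1 2  (a full LSB of change is needed)
print(f"1 LSB = {LSB_V * 1e6:.1f} uV")        # 244.1 uV
```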
 
