Max Input Change for 1 Bit Output in 12-Bit ADC: 0.50036V

In summary, the maximum input change required to give a one-bit change in the output of the ADC is 0.50036 V.
  • #1
cjm181

Homework Statement


  1. The full range input of a 12-bit, successive-approximation type ADC is 1 volt. Determine:

    a) the maximum input change required to give a one bit change in output of the ADC

Homework Equations


None

The Attempt at a Solution


The ADC is 12-bit, so it will produce a reading between 000000000000 (12 zeros) and 111111111111 (12 ones).

output 000000000000 being 0 in decimal, 0 V input
output 111111111111 being 4095 in decimal, 1 V input

So looking at the output, the maximum output change that changing one bit could produce is flipping the MSB: 100000000000 (one 1 followed by 11 zeros).

output 100000000000 is 2048 in decimal.

So 2048 / 4095 = 0.500122

1V * 0.500122 = 0.500122V

So I initially thought that was the answer: the maximum input change to give a 1-bit change in output would be 0.500122 V.
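To double-check that arithmetic, a minimal sketch (using the 0-to-4095 code scale described above):

```python
BITS = 12
FULL_SCALE_V = 1.0
MAX_CODE = 2**BITS - 1          # 4095, the all-ones output code

msb_code = 2**(BITS - 1)        # 100000000000 binary = 2048

# Fraction of full scale represented by flipping the MSB alone
msb_change_v = FULL_SCALE_V * msb_code / MAX_CODE
print(round(msb_change_v, 6))   # → 0.500122
```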

However, this ADC is rounding. My understanding is that the successive-approximation logic in the ADC produces a trial binary number. The DAC converts this back to an analogue value, which is checked against the measured input: if this estimate is equal to or less than the measured input, the comparator outputs a 1; if it is more, it outputs a 0.

So I think we are now concerned with the last bit in the chain, 2^0. If the measured input is 0.99 (of a step) or less, the ADC will output a zero for that last bit. So we need to convert this 0.99 onto our 1 V scale.

so 0.99/4095 = 0.000242.

1 V * 0.000242 = 0.000242 V

So 0.500122 + 0.000242 = 0.500364, or about 0.50036 V.

So my answer now is 0.50036 V, meaning the maximum the input can change to give a 1-bit change in output is 0.50036 V.
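The comparator loop described above can be sketched in a few lines. This is a minimal simulation of an ideal successive-approximation conversion; the function name and the code-to-voltage convention (`code * vref / 4096`) are my assumptions, not from the question:

```python
def sar_convert(vin, vref=1.0, bits=12):
    """Ideal successive approximation: try each bit MSB-first,
    keeping it only if the DAC's estimate stays at or below vin."""
    code = 0
    for i in range(bits - 1, -1, -1):
        trial = code | (1 << i)             # tentatively set this bit
        v_dac = trial * vref / (1 << bits)  # DAC converts trial code to volts
        if v_dac <= vin:                    # comparator outputs 1: keep the bit
            code = trial
    return code

print(sar_convert(0.0))   # → 0
print(sar_convert(0.5))   # → 2048
```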

Is this in the ballpark, or am I way off on a tangent?

Kr
Craig
 
  • #2
Your interpretation of the question is interesting. Commonly, a 1-bit change in the output of an ADC refers to the LSB, not the MSB. But if the question is taken literally, you are right: the voltage change would be about 0.500 VDC (or, as you state, 0.500122 VDC).

As far as the additional error is concerned, it could be off by as much as twice the non-linearity specified for the part.
 
  • #3
Hey Scott,
Thanks for your reply. It is strange. The worked example in the literature is for a minimum change in output. What's thrown me is that the book specifically says the least amount of change to produce a 1-bit change, but the question here clearly states the maximum change in input to produce a 1-bit change in output.

But I agree: what is the purpose of knowing what maximum change would produce a 1-bit change? Unless the question is trying to get us to show our understanding of the binary system.

I have asked uni to confirm whether they want the MSB or LSB. It may be the least significant bit, but the catch might be taking the rounding errors into account.

Has anyone else done this Q?
 
  • #4
I read the question differently. It asks for the maximum voltage necessary to change the output by 1 bit. They mean 1 LSB.

So we start with some DC input voltage and gradually increase (or decrease) it until the change suffices to move the output up (or down) by 1 LSB. That maximum voltage is 1 V/4096. (The worst-case starting point is just above the last trip point.)
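On that reading, the number works out as follows (a quick sketch, assuming the 1 V full scale and 4096-step convention used in this post):

```python
FULL_SCALE_V = 1.0
STEPS = 4096                  # 2**12 code steps

lsb_v = FULL_SCALE_V / STEPS  # voltage change for a 1-LSB step
print(lsb_v)                  # → 0.000244140625
```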
 

1. What is a 12-bit ADC output change?

A 12-bit ADC output change refers to the change in the digital output of an analog-to-digital converter (ADC) that has a resolution of 12 bits. This means the analog input signal is divided into 4096 discrete digital levels, with each level representing a small change in the input signal.

2. How accurate is a 12-bit ADC output change?

A 12-bit ADC has a resolution of 4096 levels, which means it can resolve changes in the input signal down to 1/4096th of the full-scale range. This gives a high level of accuracy, making 12-bit ADCs common in scientific and engineering applications.

3. What factors can affect a 12-bit ADC output change?

Several factors can affect the output change of a 12-bit ADC, including noise, temperature, and power-supply stability. Noise can introduce errors in the ADC output, while temperature and power-supply fluctuations can cause the ADC to operate outside its specified range, leading to inaccurate readings.

4. How is a 12-bit ADC output change calculated?

The output change of a 12-bit ADC is calculated by dividing the full-scale range of the ADC by 4096, the number of levels. For example, if the ADC has a full-scale range of 5 volts, the output change would be 5 V/4096 ≈ 0.00122 volts per level.
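That example can be checked directly (a sketch using the 5 V figure from the answer above):

```python
full_scale_v = 5.0
levels = 4096

step_v = full_scale_v / levels  # volts per output level
print(round(step_v, 5))         # → 0.00122
```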

5. What are the advantages of a 12-bit ADC output change?

A 12-bit ADC offers a higher resolution than lower-bit ADCs, allowing more accurate measurements. It also provides a wider dynamic range, meaning it can detect small changes in low-level signals as well as large changes in high-level signals. Additionally, 12-bit ADCs are more cost-effective than higher-bit ADCs, making them a popular choice for many applications.
