# Analog to Digital Converter (Estimation of Input Voltage)

1. Sep 15, 2008

### GreenLRan

1. The problem statement, all variables and given/known data

A 0-10 V, 10-bit A/D converter displays an output in straight binary code of 1010110111. Estimate the input voltage to within 1 LSB (Least Significant Bit).

2. Relevant equations

Resolution = E_FSR / 2^M (where E_FSR is the full-scale range, 10 V here, and M = 10 bits)

Eout = X/2^M (where Eout is the output voltage and X is the input binary number converted to decimal. I'm not sure this equation is valid, either -- I found it in the digital-to-analog converter section rather than the analog-to-digital one.)

3. The attempt at a solution

I calculated the resolution and got 10 V / 2^10 = 9.77 mV. I am unsure of what to do next (my book is terrible). I was thinking of converting the binary output (which I believe represents the output voltage) to base 10, but then I didn't know whether there was an equation for the input voltage as a function of the output.

I also thought about taking the quantization error to be 1/2 the resolution, then somehow getting the input voltage from that.

Honestly, I am lost. Any help would be great. Thanks!

2. Sep 16, 2008

### chroot

Staff Emeritus
You're making this too hard.

You found the resolution, which is 9.77 mV per code. An input of 0 V to 9.77 mV would produce an output of 00 0000 0000. An input of 9.77 mV to 19.54 mV would produce an output of 00 0000 0001. Each additional step represents 9.77 mV.

Convert the 10-bit output code to decimal -- that's the number of codes above zero -- and multiply it by 9.77 mV. That's the boundary between the given 10-bit code and the next one up.
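The steps above can be sketched in a few lines of Python (the variable names are my own, not from the problem):

```python
# Estimate the ADC input voltage from a straight-binary output code.
# Full-scale range and bit count are taken from the problem statement.
E_FSR = 10.0        # full-scale range, volts
M = 10              # number of bits

code = 0b1010110111           # the 10-bit output code from the problem
resolution = E_FSR / 2**M     # volts per code, i.e. 1 LSB

# The decimal value of the code is the number of resolution steps above zero.
low = code * resolution        # lower edge of the code's input range
high = low + resolution        # boundary with the next code up

print(f"code = {code}")
print(f"1 LSB = {resolution * 1e3:.2f} mV")
print(f"input is between {low:.4f} V and {high:.4f} V")
```

Running this gives a code of 695, a 9.77 mV LSB, and an input somewhere between about 6.787 V and 6.797 V, which pins the answer down to within 1 LSB as the problem asks.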

You might want to draw a picture of the transfer function (not all 2^10 codes of course) to help you visualize the behavior.

- Warren