1. The problem statement, all variables and given/known data

A 0-10 V, 10-bit A/D converter displays an output in straight binary code of 1010110111. Estimate the input voltage to within 1 LSB (least significant bit).

2. Relevant equations

Resolution = Efsr / 2^M (where M = 10 in this case)

Eout = X / 2^M (where Eout is the output voltage and X is the input binary number converted to base 10. I'm also not sure if this equation is valid; I found it under the digital-to-analog converter section rather than analog-to-digital.)

3. The attempt at a solution

I calculated the resolution and got 10 V / 2^10 = 9.77 mV. I am unsure of what to do next (my book is terrible). I was thinking of converting the output binary code (which I believe represents the output voltage) to base 10, but then I didn't know if there is an equation giving the input voltage as a function of the output voltage. I also thought about taking the quantization error to be 1/2 the resolution, then somehow getting the input voltage from that. Honestly, I am lost. Any help would be great. Thanks!
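To sanity-check the arithmetic in my attempt, here is a quick Python sketch. It assumes the simple relation V ≈ X × resolution (the DAC-style formula above run in reverse), which I'm not certain is the right model for this converter, so treat the numbers as a guess rather than a confirmed answer:

```python
# Sketch of my attempted approach; the V = X * resolution model is my assumption.
code = "1010110111"        # straight binary output from the ADC
M = 10                     # number of bits
E_fsr = 10.0               # full-scale range in volts (0-10 V converter)

X = int(code, 2)           # binary code converted to base 10
resolution = E_fsr / 2**M  # volts per count, i.e. 1 LSB

V_estimate = X * resolution

print(f"X = {X}")                                  # base-10 value of the code
print(f"resolution = {resolution * 1000:.2f} mV")  # ~9.77 mV, matching my calculation
print(f"V_estimate = {V_estimate:.4f} V")
```

If this model is right, the estimate carries an uncertainty of about 1 LSB (or 1/2 LSB of quantization error, depending on convention), which is where my "within 1 LSB" confusion comes in.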