Hi, I've got a bit of a mental block with something I'm working on and was hoping someone could help out. I'm doing data acquisition with a National Instruments PCI-6025E board using LabVIEW (I don't have much LabVIEW experience, by the way). The board is supposed to be 12-bit, and the analog input range is supposed to be -10 to +10 V. If I do the math, my resolution should be 20/(2^12) ≈ 4.9 mV, right? Why, then, when I read data with my program, does the step size appear to be only half of that? The steps in the recorded data are spaced ~2.5 mV apart, as if the resolution were twice as fine as it should be. Any hints on what I'm doing wrong? Thanks.
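For reference, here's the arithmetic I'm doing, written out as a quick Python sketch (the function name is just mine, for illustration):

```python
def lsb_volts(span_volts: float, bits: int) -> float:
    """Ideal quantization step: full-scale span divided by 2**bits."""
    return span_volts / (2 ** bits)

# 12-bit converter over a -10 V to +10 V range (span = 20 V):
step = lsb_volts(20.0, 12)
print(f"{step * 1000:.2f} mV per code")  # prints "4.88 mV per code"
```

That 4.88 mV is what I expected to see between adjacent codes, but the recorded data shows steps about half that size.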