Hi - I am by no means an EE, so please be aware that I may have some fundamental gaps in my knowledge.

I have a battery-powered isolation unit that takes a voltage in and provides either a voltage or a current source output. I am giving it 1 V as an input, and according to the conversion ratio I should be getting 1 mA out, but I am having trouble measuring that current.

Here is my setup: I took a 100 kOhm resistor and connected the two output leads of the isolation unit (red and black) to it, so the resistor is now in series with the output of the isolation unit. Then I took another set of leads (red and black, ending in a BNC connector), ran them from the resistor to my oscilloscope, and set the scope's input impedance to 50 Ohm. This gives me a measurement of the voltage across the resistor. By V = IR, I should be able to take the voltage shown on the oscilloscope, divide it by 100 kOhm, and get my current.

However, the scope reads 50 mV, which works out to a current of 0.0005 mA - not the 1 mA I expect. I checked the 1 V input I am feeding the isolation unit, and it shows up as a clean 1 V signal on the scope, so the input side looks fine. I am confident I am doing something wrong in measuring this current, but I am not sure what.
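To make the arithmetic concrete, here is a quick Python sketch of the numbers above. R_sense and R_scope are just the values from my setup; the parallel-combination check at the end is my own unconfirmed guess at where the discrepancy might come from (the scope's 50 Ohm input sitting in parallel with my 100 kOhm resistor), not something I know to be the explanation.

```python
# Sanity-checking my current measurement (values from my setup above).
V_measured = 50e-3   # volts shown on the oscilloscope
R_sense = 100e3      # my 100 kOhm resistor
R_scope = 50.0       # the scope's input impedance setting, in Ohms

# The calculation I did: assume all the current flows through the 100 kOhm resistor.
I_naive = V_measured / R_sense
print(f"I = V / R_sense = {I_naive * 1e3:.4f} mA")  # 0.0005 mA -- the number that confused me

# My guess: the scope's 50 Ohm input is in parallel with the resistor,
# so the actual load the isolation unit drives is much smaller than 100 kOhm.
R_parallel = (R_sense * R_scope) / (R_sense + R_scope)
I_parallel = V_measured / R_parallel
print(f"R_parallel = {R_parallel:.2f} Ohm, I = {I_parallel * 1e3:.4f} mA")  # roughly 1 mA
```

Help/advice?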