How Does Measurement Sensitivity Affect Current Detection in a Circuit?

AI Thread Summary
Measurement sensitivity refers to the ratio of the change in an instrument's output to the change in its input, which is distinct from resolution, the smallest change an instrument can register. In the example discussed, the current reads 1 A and the instrument is specified as "1 part in 100". Read as a resolution, the ammeter cannot register changes smaller than 0.01 A, so a change from 1 A to 1.001 A would not produce a visible change in the reading. Read as a sensitivity, that same 0.001 A change in input produces only a 0.001/100 A change at the output. The thread's back-and-forth shows how easily the two terms are conflated, and why accurate interpretation of both is crucial for effective circuit analysis.
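To make the distinction concrete, here is a minimal Python sketch of the two ideas. The function names, the linear-response model, and the specific numbers are illustrative assumptions, not anything stated in the thread:

```python
# Minimal sketch: resolution vs. sensitivity for an idealized ammeter.
# All names and numbers here are illustrative assumptions, not a real API.

def displayed_change(input_change_A, sensitivity):
    """Sensitivity = change in output / change in input (ideal linear model)."""
    return input_change_A * sensitivity

def is_detectable(input_change_A, resolution_A):
    """Resolution = smallest input change the display can register."""
    return abs(input_change_A) >= resolution_A

# The thread's "1 part in 100", read both ways:
resolution_A = 0.01      # as resolution: the display moves in 0.01 A steps
sensitivity = 1 / 100    # as sensitivity: output change = input change / 100

print(is_detectable(0.001, resolution_A))      # False: 0.001 A < 0.01 A step
print(displayed_change(0.001, sensitivity))    # 1e-05: tiny output change
```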
Andy_ToK
Hi,
I have a question about the sensitivity of measurements in experiments.
E.g., we measure the current in a circuit and it's 1 A. If the sensitivity is 1 part in 100, does that mean we can't distinguish currents between (1 - 1/100) A and (1 + 1/100) A?

Thanks
 
Thanks, that's what I meant.
 
Whoops, I told you the wrong thing. What I described there was resolution, not sensitivity.

Sensitivity is the change in output divided by the change in input: sensitivity = Δoutput/Δinput.

So if the sensitivity is 1/100 and the input to your circuit changes by 1 V, the instrument will show a change of 1/100 V.
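A one-line check of that voltage example (a sketch, assuming an ideal linear instrument; the variable names are mine):

```python
# Sketch of the voltage example above, assuming an ideal linear instrument
# with sensitivity = (change in output) / (change in input) = 1/100.
sensitivity = 1 / 100
input_change_V = 1.0
output_change_V = input_change_V * sensitivity
print(output_change_V)  # 0.01, i.e. the instrument shows a change of 1/100 V
```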
 
Ya, so it can't tell when the change is smaller than 1/100 V, right?
 
No, that's not what sensitivity is. It's the ratio of the change in output to the change in input. What you're describing is resolution, the thing I mistakenly told you about at first.
 
Sorry, but I'm confused now.
Let's just stick to the example I mentioned. The initial current is 1 A and the sensitivity of the ammeter is 1 part in 100. If the current changes from 1 A to (1 + 1/1000) A, what value will be displayed on the ammeter?
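For what it's worth, here is a hedged worked answer under the Δoutput/Δinput definition given earlier in the thread; the linearity assumption and the alternative resolution reading are mine, not anything the thread confirms:

```python
# Sketch of the ammeter example, under two assumptions that are mine,
# not the thread's: an ideal linear response, and "1 part in 100" read
# as sensitivity = (change in output) / (change in input) = 1/100.
sensitivity = 1 / 100
input_change_A = 1.001 - 1.0          # current rises from 1 A to 1.001 A
display_change_A = input_change_A * sensitivity
print(display_change_A)               # ~1e-05 A change at the output

# If "1 part in 100" is instead read as a 0.01 A resolution, the display
# would not register this 0.001 A change at all.
```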
 
Anyone?
Thanks
 