
DC milliammeter as a voltmeter

  1. Sep 29, 2016 #1
    1. The problem statement, all variables and given/known data
    A DC milliammeter has a full-scale deflection of 10 mA and a resistance of 50 Ω. How would you adapt this to serve as a voltmeter with a full-scale deflection of 150 V? Comment on whether this voltmeter would be suitable for accurately measuring the potential difference across a resistor of about 100 kΩ carrying a current of about 1 mA.

    Answers: 14 950 Ω in series.

    2. The attempt at a solution
    R = V / I = 150 / (10 × 10⁻³) = 15 000 Ω. Less the given 50 Ω, 14 950 Ω are required for this milliammeter to serve as a voltmeter. However, why should this resistance be applied in series? Voltmeters are applied in parallel.
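
    A minimal Python sketch of that arithmetic (variable names are my own, not from the problem):

        # Turning a 10 mA, 50 ohm meter movement into a 150 V full-scale voltmeter.
        I_fs = 10e-3     # full-scale deflection current, A
        R_m = 50.0       # meter coil resistance, ohm
        V_fs = 150.0     # desired full-scale voltage, V

        R_total = V_fs / I_fs        # total resistance needed: 15 000 ohm
        R_series = R_total - R_m     # extra series ("multiplier") resistor: 14 950 ohm
        print(R_total, R_series)     # 15000.0 14950.0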

    For the second part, shouldn't voltmeters have large resistances? And in this case the resistor has 100 000 Ω. I would say that this voltmeter wouldn't be suitable to accurately measure the PD across this resistor.
     
  3. Sep 29, 2016 #2

    gneill


    Staff: Mentor

    What would happen if you connected a resistor in parallel instead and you placed this meter across 150 V? How much current would pass through the meter movement?

    Yes voltmeters should have a large resistance. They're trying to get you to discover why that is. What do you think might be the effect on normal circuit operation in the case given?
     
  4. Sep 29, 2016 #3
    If one connects it in parallel, the current is divided into two different parts, while when it is in series the current is the same throughout the circuit.

    But is it just a rule that voltmeters should be connected in parallel, while things like a DC milliammeter acting as a voltmeter should be placed in series if they are to behave as voltmeters (and in parallel if as ammeters)?

    The resistor has a large resistance while the voltmeter has a smaller one. Therefore the result will be an inaccurate number.
     
  5. Sep 29, 2016 #4

    gneill


    Staff: Mentor

    "Just a rule"? It's not just a strong suggestion, it's a requirement. Think about what is being measured by a voltmeter and an ammeter.

    A voltage is a potential difference between two locations. So you need to place a probe at each location to take the measurement.

    A current is a flow of charge through something (a wire, a component, ...). As such you need to insert the measuring device into the flow and have it pass through the device to "count" it.
    Right. Because the original resistor and the smaller voltmeter resistance will be in parallel, changing how the original circuit operates. If it was designed to operate correctly with a 100K resistor at that location, and it suddenly becomes less than 15K when paralleled by the voltmeter, you'll no longer be measuring what the original circuit was doing.
     
  6. Sep 29, 2016 #5
    What I meant was the following: while voltmeters and ammeters are placed in parallel and in series respectively, are things like a DC milliammeter acting as a voltmeter or an ammeter connected the other way around -- in series and in parallel? Is that a rule as well?

    But why are we talking about a parallel voltmeter? In this situation we have a voltmeter in series. My point was about the given voltmeter in series. Since it has a small resistance, the reading will be inaccurate.
     
  7. Sep 29, 2016 #6

    gneill


    Staff: Mentor

    It's a matter of practical design and circuit laws. If your meter movement is as described in this problem, a milliammeter with a 50 Ohm coil resistance that takes 10 mA for a full-scale reading, the maximum voltage it can read is 0.5 Volts. If you connect it to anything greater it is likely to burn out. So the idea of putting a resistor in series with it to limit the current and form a voltage divider makes sense. Putting a resistor in parallel with it won't work, since both the resistor and the meter would see the same voltage (circuit rule for parallel components).
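
    As an illustrative sketch of that voltage-divider idea (my own Python; the formula just restates the arithmetic above): the bare movement tops out at (10 mA)(50 Ohm) = 0.5 Volts, and a series resistor extends the range.

        # Illustrative: series "multiplier" resistor for a moving-coil meter movement.
        def series_multiplier(v_range, i_fs=10e-3, r_meter=50.0):
            """Series resistance so the meter deflects full scale at v_range volts."""
            return v_range / i_fs - r_meter

        print(10e-3 * 50.0)              # 0.5 V: the most the bare movement can read
        print(series_multiplier(150.0))  # 14950.0 ohm for a 150 V range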

    For an ammeter, you want to bypass some of the current around the meter movement to measure larger currents. That means providing a parallel path around the meter movement, forming a current divider. This also has the benefit of lowering the net resistance of the 'meter', since resistors in parallel give a net resistance smaller than the smallest resistance in the parallel group. An ideal ammeter would have zero resistance.
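
    For comparison, a sketch of the shunt (current-divider) arrangement; the 1 A target range below is an assumed example, not a value from the thread:

        # Illustrative: shunt resistor to bypass current around the meter movement.
        def shunt_resistance(i_range, i_fs=10e-3, r_meter=50.0):
            """Shunt so the combination deflects full scale at i_range amps."""
            i_bypassed = i_range - i_fs          # current diverted around the movement
            return i_fs * r_meter / i_bypassed   # shunt and movement share the same voltage

        r_sh = shunt_resistance(1.0)             # extend the range to 1 A (assumed example)
        r_net = 1 / (1 / r_sh + 1 / 50.0)        # net resistance of the combined 'meter'
        print(r_sh, r_net)                       # ~0.505 ohm and ~0.5 ohm, far below 50 ohm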
    No, the voltmeter consists of a meter movement in series with a resistance (or it could be a more complex voltage divider, but that's just details). It's this overall unit, which we call a voltmeter, that is placed in parallel with something to measure the potential difference.

    [Attached figure: the voltmeter drawn as a meter movement in series with a resistance Rs, with the whole unit connected in parallel across the component whose potential difference V is measured.]
     
  8. Sep 30, 2016 #7
    Well, as I see from your figure, you are talking about a voltmeter that is connected in parallel. However, in our case we have a DC milliammeter that acts like a voltmeter and that we connect in series, which is what you explained in this part:
    And so regarding this part:
    If we use this DC milliammeter that acts as a voltmeter and connect it in series with a resistor of 100 000 Ω carrying a current of about 10⁻³ A -- how accurate will this DC milliammeter be in measuring the potential difference across this resistor? As I said, in this case it will not be as accurate as required, since the resistance of the resistor is 100 000 Ohm, while we have a DC milliammeter that can withstand a resistance of 15 000 Ohm.

    I also think that if we had a generic voltmeter with a resistance of 15 000 Ohm connected in parallel with a resistor of 100 000 Ohm, that generic voltmeter would also give an inaccurate result.

    So my question is: is this logic correct? And if not, what part of it is wrong?
     
  9. Sep 30, 2016 #8

    gneill


    Staff: Mentor

    This is not the scenario that the problem statement poses. The problem statement has you "build" a voltmeter that consists of the given milliammeter with a series resistance of 14 950 Ω. This voltmeter is designed to give a full-scale reading when connected across a 150 V source. It is proposed to use this voltmeter to measure the potential difference across a 100 KΩ resistor that is carrying 1 mA of current. So the circuit, under normal operation, would have (1 mA)(100 KΩ) = 100 V across that resistor.

    The voltmeter is placed in parallel with the 100 KΩ resistor in order to measure the potential difference across it. It is the setup indicated in my figure in post #6.

    The first thing you should note is that the milliammeter requires 10 mA to give a full-scale reading, but the circuit to be measured is only carrying 1 mA. That should send up a warning flag right there! There's no way to get a significant meter deflection without drawing significantly more current than the circuit to be measured carries normally. Placing the voltmeter across the resistor will disturb the normal circuit operation.

    What does "withstand a resistance of 15 000 Ohm" mean?

    Think about what the circuit under test sees when the voltmeter is connected across its resistor. When the voltmeter is placed across the 100 KΩ resistor the circuit "sees" that resistor go from 100 KΩ to less than 15 KΩ (close to 13 KΩ actually). Why might this be a problem for the person testing the circuit?
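
    To put numbers on that loading effect, a minimal sketch (the rest of the circuit driving the resistor isn't specified, so this only shows the parallel combination and the current mismatch):

        # Loading effect of the home-made ~15 kohm voltmeter on the 100 kohm resistor.
        R_resistor = 100e3              # resistor under test, ohm
        R_voltmeter = 50.0 + 14950.0    # meter movement plus series resistor, ohm
        R_seen = R_resistor * R_voltmeter / (R_resistor + R_voltmeter)
        print(R_seen)                   # ~13043 ohm: the circuit now "sees" about 13 kohm

        # Unloaded, the resistor should have about (1 mA)(100 kohm) = 100 V across it,
        # but full-scale deflection of this voltmeter draws 10 mA, ten times the
        # circuit's normal current.
        print(1e-3 * 100e3)             # 100.0 V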
     
  10. Sep 30, 2016 #9
    I don't understand this. What is a series resistance? In the next paragraph you say that the voltmeter is placed in parallel.

    We have a resistor, denoted V, to which a voltmeter is connected in parallel. We also have the voltmeter divided into two parts. By "series resistance" you mean Rs? But in that case it is placed in series with the voltmeter, yet in parallel with the rest of the circuit.

    This point I understand.
     
  11. Sep 30, 2016 #10

    gneill


    Staff: Mentor

    The resistor ##R_s## in the figure is in series with the meter movement inside the voltmeter.
    Yes, exactly.
     
  12. Sep 30, 2016 #11
    So, in sum we can say that the circuit current is too low (the meter needs 10 mA for full-scale deflection, but the circuit carries only about 1 mA) and the resistance of the resistor is too large compared with the voltmeter's (100 kOhm against less than 15 kOhm). And this will lead to inaccurate test results.
     
  13. Sep 30, 2016 #12

    gneill


    Staff: Mentor

    That's it. And to summarize your summary :smile:, low-resistance voltmeters place a load on the circuit being measured, disturbing the normal circuit operation and thus making readings inaccurate.
     