How Can a DC Milliammeter Be Adapted to Measure 150V Accurately?

  • Thread starter: moenste
  • Tags: DC voltmeter
SUMMARY

The discussion focuses on adapting a DC milliammeter with a full-scale deflection of 10 mA and a resistance of 50 Ω to function as a voltmeter capable of measuring 150 V. To achieve this, a series resistance of 14 950 Ω is required: the complete voltmeter must present R = V / I = 150 V / 10 mA = 15 000 Ω, of which the meter itself already supplies 50 Ω. The participants emphasize that a voltmeter is connected in parallel with the component whose potential difference it measures, and that this particular voltmeter would give an inaccurate reading across a 100 kΩ resistor carrying about 1 mA, because its comparatively low resistance loads the circuit and significantly disturbs its normal operation.

PREREQUISITES
  • Understanding of Ohm's Law (V = IR)
  • Knowledge of series and parallel circuit configurations
  • Familiarity with the operation of DC milliammeters and voltmeters
  • Concept of voltage dividers in electrical circuits
NEXT STEPS
  • Research the design and function of voltage dividers in circuits
  • Learn about the characteristics and limitations of DC milliammeters
  • Study the impact of meter loading on circuit measurements
  • Explore the principles of accurate voltage measurement techniques
USEFUL FOR

Electronics students, electrical engineers, and technicians involved in circuit design and measurement, particularly those interested in adapting measuring instruments for specific applications.

moenste

Homework Statement


A DC milliammeter has a full-scale deflection of 10 mA and a resistance of 50 Ω. How would you adapt this to serve as a voltmeter with a full-scale deflection of 150 V? Comment on whether this voltmeter would be suitable for accurately measuring the potential difference across a resistor of about 100 kΩ carrying a current of about 1 mA.

Answers: 14 950 Ω in series.

The attempt at a solution
R = V / I = 150 / (10 × 10⁻³) = 15 000 Ω. Subtracting the given 50 Ω leaves 14 950 Ω required for this milliammeter to serve as a voltmeter. However, why should it be applied in series? Voltmeters are applied in parallel.
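A minimal sketch of the arithmetic above in Python, using only Ohm's law and the values given in the problem statement:

```python
# Sketch of the series-resistor calculation: the finished voltmeter must present
# R_total = V_range / I_full_scale, of which the movement already supplies 50 ohms.
I_FSD = 10e-3     # full-scale deflection current of the movement (A)
R_METER = 50.0    # coil resistance of the movement (ohms)
V_RANGE = 150.0   # desired full-scale voltage (V)

R_total = V_RANGE / I_FSD        # 15 000 ohms in total
R_series = R_total - R_METER     # 14 950 ohms to add in series
V_bare_max = I_FSD * R_METER     # the bare movement alone reads at most 0.5 V

print(R_total, R_series, V_bare_max)   # 15000.0 14950.0 0.5
```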

For the second part, shouldn't voltmeters have large resistances? And in this case the resistor has 100 000 Ω. I would say that this voltmeter wouldn't be suitable to accurately measure the PD across this resistor.
 
moenste said:
R = V / I = 150 / (10 × 10⁻³) = 15 000 Ω. Subtracting the given 50 Ω leaves 14 950 Ω required for this milliammeter to serve as a voltmeter. However, why should it be applied in series? Voltmeters are applied in parallel.
What would happen if you connected a resistor in parallel instead and you placed this meter across 150 V? How much current would pass through the meter movement?
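For scale, a minimal sketch in Python of the numbers behind that question, using only the meter values given in the problem; a resistor wired in parallel cannot limit the movement current, because parallel elements share the same voltage.

```python
# Sketch: current through the bare 50-ohm movement if it were placed straight across 150 V.
# A parallel resistor would not change this, since both branches see the full 150 V.
V_APPLIED = 150.0   # volts
R_METER = 50.0      # ohms, coil resistance of the movement
I_FSD = 10e-3       # amperes, full-scale deflection current

I_movement = V_APPLIED / R_METER   # 3 A through the coil
print(f"{I_movement} A, i.e. {I_movement / I_FSD:.0f}x the full-scale current")  # 300x
```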

moenste said:
For the second part, shouldn't voltmeters have large resistances? And in this case the resistor has 100 000 Ω. I would say that this voltmeter wouldn't be suitable to accurately measure the PD across this resistor.
Yes voltmeters should have a large resistance. They're trying to get you to discover why that is. What do you think might be the effect on normal circuit operation in the case given?
 
gneill said:
What would happen if you connected a resistor in parallel instead and you placed this meter across 150 V? How much current would pass through the meter movement?
If one places it in parallel, the current is divided into two parts, while in series the current is the same throughout the circuit.

But is it just a rule that voltmeters should be connected in parallel, while things like a DC milliammeter acting as a voltmeter should be placed in series (and in parallel if acting as an ammeter)?

gneill said:
Yes voltmeters should have a large resistance. They're trying to get you to discover why that is. What do you think might be the effect on normal circuit operation in the case given?
The resistor has a large resistance while the voltmeter has a smaller one. Therefore the result will be an inaccurate number.
 
moenste said:
If one places it in parallel, the current is divided into two parts, while in series the current is the same throughout the circuit.

But is it just a rule that voltmeters should be connected in parallel, while things like a DC milliammeter acting as a voltmeter should be placed in series (and in parallel if acting as an ammeter)?
"Just a rule"? It's not just a strong suggestion, it's a requirement. Think about what is being measured by a voltmeter and an ammeter.

A voltage is a potential difference between two locations. So you need to place a probe at each location to take the measurement.

A current is a flow of charge through something (a wire, a component, ...). As such you need to insert the measuring device into the flow and have it pass through the device to "count" it.
moenste said:
The resistor has a large resistance while the voltmeter has a smaller one. Therefore the result will be an inaccurate number.
Right. Because the original resistor and the smaller voltmeter resistance will be in parallel, changing how the original circuit operates. If it was designed to operate correctly with a 100K resistor at that location, and it suddenly becomes less than 15K when paralleled by the voltmeter, you'll no longer be measuring what the original circuit was doing.
 
gneill said:
"Just a rule"? It's not just a strong suggestion, it's a requirement. Think about what is being measured by a voltmeter and an ammeter.
What I meant was the following: while voltmeters and ammeters are placed in parallel and in series respectively, are things like a DC milliammeter acting as a voltmeter or an ammeter connected the other way around -- in series and in parallel? Is that a rule as well?

gneill said:
Right. Because the original resistor and the smaller voltmeter resistance will be in parallel, changing how the original circuit operates. If it was designed to operate correctly with a 100K resistor at that location, and it suddenly becomes less than 15K when paralleled by the voltmeter, you'll no longer be measuring what the original circuit was doing.
But why are we talking about a parallel voltmeter? In this situation we have a voltmeter in series. My phrase was about the given voltmeter in series. Since it has a small resistance it will result in an inaccurate number.
 
moenste said:
What I meant was the following: while voltmeters and ammeters are placed in parallel and in series respectively, are things like a DC milliammeter acting as a voltmeter or an ammeter connected the other way around -- in series and in parallel? Is that a rule as well?
It's a matter of practical design and circuit laws. If your meter movement is as described in this problem -- a milliammeter with a 50 Ω coil resistance that takes 10 mA for a full-scale reading -- the maximum voltage it can read is 0.5 V. If you connect it to anything greater it is likely to burn out. So the idea of putting a resistor in series with it to limit the current and form a voltage divider makes sense. Putting a resistor in parallel with it won't work, since both the resistor and the meter will see the same voltage (the circuit rule for parallel components).

For an ammeter, you want to bypass some of the current around the meter movement to measure larger currents. That means providing a parallel path around the meter movement, forming a current divider. This also has the benefit of lowering the net resistance of the 'meter', since resistors in parallel give a net resistance smaller than the smallest resistance in the group. An ideal ammeter would have zero resistance.
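A minimal sketch of that current-divider arithmetic; the 1 A target range below is an assumed example for illustration, not a figure from the problem.

```python
# Sketch: shunt resistor to extend the 10 mA / 50 ohm movement to a larger current range.
I_FSD = 10e-3     # amperes, full-scale current of the movement
R_METER = 50.0    # ohms, coil resistance
I_RANGE = 1.0     # amperes, assumed target range for this example

V_full_scale = I_FSD * R_METER                      # 0.5 V across movement and shunt at full scale
R_shunt = V_full_scale / (I_RANGE - I_FSD)          # ~0.505 ohms carries the bypassed current
R_net = (R_METER * R_shunt) / (R_METER + R_shunt)   # ~0.5 ohms: the net 'ammeter' resistance drops

print(f"R_shunt = {R_shunt:.3f} ohms, net ammeter resistance = {R_net:.3f} ohms")
```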
moenste said:
But why are we talking about a parallel voltmeter? In this situation we have a voltmeter in series. My phrase was about the given voltmeter in series. Since it has a small resistance it will result in an inaccurate number.
No, the voltmeter consists of a meter movement in series with a resistance (or it could be a more complex voltage divider, but that's just details). It's this overall unit, which we call a voltmeter, that is placed in parallel with something to measure the potential difference.

[Figure: a voltmeter built from a meter movement in series with a resistor Rs, connected as a unit in parallel across the component whose potential difference V is measured.]
 
gneill said:
No, the voltmeter consists of a meter movement in series with a resistance (or it could be a more complex voltage divider, but that's just details). It's this overall unit, which we call a voltmeter, that is placed in parallel with something to measure the potential difference.
Well, as I see from your figure, you are talking about a voltmeter that is connected in parallel. However, in our case we have a DC milliammeter that acts like a voltmeter and that we connect in series; that's what you explained in this part:
gneill said:
It's a matter of practical design and circuit laws. If your meter movement is as described in this problem -- a milliammeter with a 50 Ω coil resistance that takes 10 mA for a full-scale reading -- the maximum voltage it can read is 0.5 V. If you connect it to anything greater it is likely to burn out. So the idea of putting a resistor in series with it to limit the current and form a voltage divider makes sense. Putting a resistor in parallel with it won't work, since both the resistor and the meter will see the same voltage (the circuit rule for parallel components).
And so regarding this part:
moenste said:
Comment on whether this voltmeter would be suitable for accurately measuring the potential difference across a resistor of about 100 kΩ carrying a current of about 1 mA.
If we use this DC milliammeter that acts as a voltmeter and that we will connect in series with a resistor of 100 000 Ω and a current of about 10⁻³ A -- how accurate will this DC milliammeter be in measuring the potential difference across this resistor? As I said, in this case -- it will not be as accurate as required since the resistance of the resistor is 100 000 Ohm, while we have a DC milliammeter that can withstand a resistance of 15 000 Ohm.

I also think that if we had a generic voltmeter that is connected in parallel with a resistance of 15 000 Ohm and a resistor of 100 000 Ohm, that generic voltmeter would also give an inaccurate result.

So my question is: is this logic correct? And if not, what part of it is wrong?
 
moenste said:
If we use this DC milliammeter that acts as a voltmeter and that we will connect in series with a resistor of 100 000 Ω and a current of about 10⁻³ A -- how accurate will this DC milliammeter be in measuring the potential difference across this resistor?

This is not the scenario that the problem statement poses. The problem statement has you "build" a voltmeter that consists of the given milliammeter with a series resistance of 14 950 Ω. This voltmeter is designed to give a full-scale reading when connected across a 150 V source. It is proposed to use this voltmeter to measure the potential difference across a 100 kΩ resistor that is carrying 1 mA of current. So the circuit, under normal operation, would have (1 mA)(100 kΩ) = 100 V across that resistor.

The voltmeter is placed in parallel with the 100 kΩ resistor in order to measure the potential difference across it. It is the setup indicated in my figure in post #6.

The first thing you should note is that the milliammeter requires 10 mA to give a full-scale reading, but the circuit to be measured is only carrying 1 mA. That should send up a warning flag right there! There's no way to get a significant meter deflection without drawing significantly more current than the circuit to be measured carries normally. Placing the voltmeter across the resistor will disturb the normal circuit operation.

moenste said:
As I said, in this case -- it will not be as accurate as required since the resistance of the resistor is 100 000 Ohm, while we have a DC milliammeter that can withstand a resistance of 15 000 Ohm.
What does "withstand a resistance of 15 000 Ohm" mean?

Think about what the circuit under test sees when the voltmeter is connected across its resistor. When the voltmeter is placed across the 100 kΩ resistor the circuit "sees" that resistor go from 100 kΩ to less than 15 kΩ (close to 13 kΩ actually). Why might this be a problem for the person testing the circuit?
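A minimal sketch of that loading effect, under the simplifying assumption that the rest of the circuit keeps supplying 1 mA (i.e. behaves like an ideal current source); a real circuit would land somewhere between this figure and the unloaded 100 V.

```python
# Sketch: the 15 kΩ improvised voltmeter placed across the 100 kΩ resistor.
# Assumes, for illustration only, that the rest of the circuit acts like an ideal 1 mA source.
R_LOAD = 100e3       # ohms, resistor under test
R_VOLTMETER = 15e3   # ohms, total voltmeter resistance (50 + 14 950)
I_CIRCUIT = 1e-3     # amperes, normal circuit current
I_FSD = 10e-3        # amperes, meter full-scale current

V_true = I_CIRCUIT * R_LOAD                                  # 100 V, the value we want to measure
R_parallel = R_LOAD * R_VOLTMETER / (R_LOAD + R_VOLTMETER)   # ~13.04 kOhm seen by the circuit
V_loaded = I_CIRCUIT * R_parallel                            # ~13 V under the assumption above
deflection = (V_loaded / R_VOLTMETER) / I_FSD                # ~9% of full scale

print(f"unloaded: {V_true:.0f} V, loaded: {V_loaded:.1f} V, deflection: {deflection:.0%}")
```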
 
gneill said:
a series resistance of 14 950 Ω
I don't understand this. What is a series resistance? In the next paragraph you say that the voltmeter is placed in parallel.

gneill said:
The voltmeter is placed in parallel with the 100 kΩ resistor in order to measure the potential difference across it. It is the setup indicated in my figure in post #6.
gneill said:
[Figure from post #6, quoted above]
We have a resistor, denoted V, to which a voltmeter is connected in parallel. We also have the voltmeter divided into two parts. By "series resistance" you mean Rs? But in that case it is placed in series with the voltmeter but in parallel with the rest of the circuit.

gneill said:
The first thing you should note is that the milliammeter requires 10 mA to give a full-scale reading, but the circuit to be measured is only carrying 1 mA. That should send up a warning flag right there! There's no way to get a significant meter deflection without drawing significantly more current than the circuit to be measured carries normally. Placing the voltmeter across the resistor will disturb the normal circuit operation.
This point I understand.
 
  • #10
moenste said:
I don't understand this. What is a series resistance? In the next paragraph you say that the voltmeter is placed in parallel.
The resistor ##R_s## in the figure is in series with the meter movement inside the voltmeter.
moenste said:
We have a resistor, denoted V, to which a voltmeter is connected in parallel. We also have the voltmeter divided into two parts. By "series resistance" you mean Rs? But in that case it is placed in series with the voltmeter but in parallel with the rest of the circuit.
Yes, exactly.
 
  • #11
gneill said:
Think about what the circuit under test sees when the voltmeter is connected across its resistor. When the voltmeter is placed across the 100 kΩ resistor the circuit "sees" that resistor go from 100 kΩ to less than 15 kΩ (close to 13 kΩ actually). Why might this be a problem for the person testing the circuit?
So, in sum, we can say that the circuit current is too low (10 mA are needed for full deflection but only about 1 mA is available) and that the voltmeter's resistance (less than 15 kΩ) is small compared with the 100 kΩ resistor, so connecting it loads the circuit. And this will lead to inaccurate test results.
 
  • #12
moenste said:
So, in sum, we can say that the circuit current is too low (10 mA are needed for full deflection but only about 1 mA is available) and that the voltmeter's resistance (less than 15 kΩ) is small compared with the 100 kΩ resistor, so connecting it loads the circuit. And this will lead to inaccurate test results.
That's it. And to summarize your summary :smile:, low-resistance voltmeters place a load on the circuit being measured, disturbing the normal circuit operation and thus making readings inaccurate.
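To put that summary in more general terms: if the point being measured is modelled as a Thévenin source, the fraction of the true voltage the meter reads is R_v / (R_th + R_v). The sketch below uses an assumed 100 kΩ source resistance and an assumed 10 MΩ digital multimeter for comparison; neither figure comes from the thread.

```python
# Sketch: fractional reading of a voltmeter of resistance r_v across a Thévenin source
# with source resistance r_th. Values below are illustrative assumptions only.
def reading_fraction(r_th: float, r_v: float) -> float:
    """Fraction of the true (open-circuit) voltage that the loaded meter actually reads."""
    return r_v / (r_th + r_v)

for label, r_v in [("15 kOhm improvised voltmeter", 15e3), ("10 MOhm DMM (assumed)", 10e6)]:
    f = reading_fraction(100e3, r_v)    # assumed 100 kOhm source resistance
    print(f"{label}: reads {f:.1%} of the true voltage")
```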
 
