Changing voltmeter resistor value

Summary
Measuring voltage from a high impedance source with a standard voltmeter (10 MΩ input resistance) can yield inaccurate readings due to loading effects. Simply replacing the internal resistor with a higher value (100 MΩ) is not practical, as the meter's circuitry is designed around a specific impedance and altering it could introduce its own inaccuracies. A 100 MΩ resistor in series with the voltmeter can help, but it divides the voltage and so costs resolution. For precise measurements, a high impedance booster amplifier or a specialized high impedance voltmeter is recommended, although these can be expensive. Knowing the source impedance and using proper techniques (such as correcting for the divider ratio or using a null measurement) can improve accuracy without modifying the voltmeter.
Idea04
I am trying to measure the voltage from a high impedance source with a regular voltmeter that has an input impedance of 10 megohm, but the voltage reading is inaccurate. Is it possible to change the resistor value inside the voltmeter to 100 megohm to get a more accurate reading?
 
No, but if your meter is sensitive enough you can use your 100M resistor as a potential divider in series with your 10M meter to increase the impedance presented to the measurand.

If dividing the voltage to be measured by a factor of about 11 loses too much resolution, you will need to use a high impedance booster amplifier.
 
What you measure as 10 Megohms is actually a string of resistors in series, which add up to 10 Megohms. These give the various voltage ranges of the voltmeter.
This is quite a good meter already as many multimeters only have 1 Megohm total resistance.

So, although you probably could redesign the voltmeter to use larger resistors, there would then be possible problems with the rest of the circuitry.

I have seen a 10 Megohm voltmeter hold a static charge after a measurement and this took seconds to dissipate. It would be worse if the reading was on a 100 Megohm meter.

If you just put a large resistor in series with the meter, the accuracy would decrease because the meter would then be reading 1000 volts on a 100 volt scale (for example), so you would lose one decimal place.

If you know the impedance of your source, and this is comparable to that of the meter, you can calculate the true voltage from the meter reading.
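To make that last point concrete, here is a minimal Python sketch (with made-up example values) of backing the true voltage out of the reading when the source impedance is known:

```python
# Minimal sketch: correcting a voltmeter reading for loading by a known
# source impedance.  The source and the meter's input resistance form a
# voltage divider, so V_read = V_true * R_meter / (R_source + R_meter).

def true_voltage(v_read, r_source, r_meter=10e6):
    """Invert the divider to recover the open-circuit source voltage."""
    return v_read * (r_source + r_meter) / r_meter

# Example (made-up numbers): a 10 megohm meter reading 5.0 V from a source
# with 40 megohm internal resistance implies a 25 V open-circuit voltage.
print(true_voltage(5.0, r_source=40e6))   # -> 25.0
```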
 
If you are measuring DC you could just use the 10 meg voltmeter together with a variable voltage source and adjust the source until the voltmeter reads zero. Put one lead of the voltmeter on the + terminal of each source, with the remaining terminals of the two sources connected together. Of course, you would need a second meter to read your adjustable source. REALLY high input impedance voltmeters do this sort of thing internally, all automatic. After all, how can you load a voltage source when the voltage across your meter is zero? Make sense?
 
I know there are voltmeters out there with really high internal resistance for measuring voltages more accurately. The problem is they are very expensive. The meter I am using, I believe, only has one setting for DC volts and one for AC volts. When I connect it to the source, it reads a higher voltage and then drops rapidly to a lower value. I don't know the internal resistance of the source since my ohmmeter only reads as high as 40 megohm. I was thinking that replacing the 10 megohm resistor with a 100 megohm resistor would solve this problem and I would be able to read a more accurate voltage.
 
Idea04 said:
I was thinking that replacing the 10 megohm resistor with a 100 megohm resistor would solve this problem and I would be able to read a more accurate voltage.
You mean breaking open the meter and replacing the resistors? vk6kro already mentioned this, but there are actually several resistors that add up to the 10 Mohm. Usually they are decade resistors, like 9M + 900K + 90K + 9K + 900 + 100 = 10M. The chip itself usually only has one range, so the decade resistors form a voltage divider that attenuates the signal a specific amount (x10, x100, x1000...) into the range of the chip.
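As an illustration of that ladder (using the resistor values from the post above; real meters differ in the details), the attenuation at each tap works out to successive powers of ten:

```python
# Illustrative sketch of the decade divider described above.  Taps taken
# below each resistor attenuate the input by successive powers of ten.
ladder = [9e6, 900e3, 90e3, 9e3, 900, 100]   # sums to 10 megohms
total = sum(ladder)

for i in range(1, len(ladder)):
    below_tap = sum(ladder[i:])              # resistance from tap to ground
    print(f"tap {i}: input divided by {total / below_tap:g}")
# -> divided by 10, 100, 1000, 10000, 100000
```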

That's with manual range though. I don't know how auto-ranging meters do it. Is yours autoranging?

The problem with replacing the resistors with ones 10× bigger is that the A/D converter chip draws a small amount of current when voltage is applied to it. With resistors in the Mohm range the loading effect is small enough to give an accurate reading, but if the resistors are too high, the loading could make the reading wrong.

That's probably why the higher impedance meters are more expensive; the chip itself needs a much higher input impedance too.

Your best bet, without getting a new meter, is probably what the other guys suggested: take your 100 Mohm resistor (make sure it's a precision part; my multimeter uses 0.5% tolerance, I think), put it in series with what you're measuring, and multiply what your meter reads by 11 (not 10).
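As a quick sanity check of that factor of 11 (ideal values only; this ignores the source's own impedance and the resistor tolerance), a short Python sketch:

```python
# The external 100 Mohm resistor and the meter's 10 Mohm input form an
# 11:1 divider, so the meter sees 1/11 of the source voltage and the
# source sees a 110 Mohm load instead of 10 Mohm.
R_SERIES = 100e6
R_METER = 10e6

def corrected(v_displayed):
    return v_displayed * (R_SERIES + R_METER) / R_METER   # i.e. times 11

print(corrected(1.0))   # a 1.0 V reading implies 11 V at the source
```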
 
If you know the source impedance, you can figure it out. Or, if another range has a different input impedance and still enough accuracy and precision, you might be able to work something out from readings on both ranges.
 
If you are measuring DC voltage, use the 100 MΩ resistor and a current range on the multimeter.
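A rough sketch of that idea, DC only (Ohm's law with the external resistor; it ignores the current meter's own burden voltage, which comes up again further down the thread):

```python
# DC only: drive the meter on a current range through a known resistor and
# recover the voltage as I * R.  Ignores the meter's burden voltage.
R_SERIES = 100e6   # the 100 megohm resistor

def source_voltage(i_measured):
    return i_measured * R_SERIES

print(source_voltage(50e-9))   # 50 nA through 100 megohm -> 5.0 V
```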
 
Idea04 said:
When I connect it to the source, it reads a higher voltage and then drops rapidly to a lower value. I don't know the internal resistance of the source since my ohmmeter only reads as high as 40 megohm.

This meter only has one DC voltage scale. So, it is a dedicated DC and AC electronic voltmeter.

What is the device you are trying to measure the voltage of ?
What voltage do you read initially, and what voltage does it drop to?

If you are measuring high voltage it is still possible to get electrostatic voltmeters which have very high DC resistance.
As an example (only):
http://cgi.ebay.com.au/RAAF-5kV-ELECTROSTATIC-VOLTMETER-c-1944-/180614032568?pt=AU_Militaria&hash=item2a0d6f6cb8
 
  • #10
I see my post wasn't commented on. Perhaps I need to draw a pic. Didn't think it was that complicated.

Adjust the power supply until voltmeter A reads zero. At this point it should be obvious that no current is being drawn from the super duper high impedance source. Now read voltmeter B; that is the voltage you are looking for.
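A minimal numerical sketch of the null idea (the component values are invented for illustration): voltmeter A sits in the loop between the two positive terminals, so the loop current, and therefore the loading on the unknown source, is exactly zero when the adjustable supply matches it.

```python
# Null (potentiometric) measurement: adjust the known supply until
# voltmeter A reads zero; at that point no current is drawn from the
# high-impedance source, so its internal resistance drops no voltage.
# Values below are invented for illustration.

def meter_a_reading(v_source, v_adjust, r_source, r_meter=10e6):
    """Voltage across voltmeter A, which sits in the loop between the
    two positive terminals (negative terminals tied together)."""
    loop_current = (v_source - v_adjust) / (r_source + r_meter)
    return loop_current * r_meter

print(meter_a_reading(12.0, 12.0, r_source=500e6))   # 0.0 -> balanced
print(meter_a_reading(12.0, 10.0, r_source=500e6))   # nonzero -> keep adjusting
```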
 

[Attachment: circuit diagram of the null-balance setup described above]

  • #11
Thanks for all the input, I do appreciate it. I don't have a variable DC voltage supply to compare the two voltages; I am limited in my electrical supplies. I do like the idea of using the current meter with the resistor to measure voltage, since current times resistance equals voltage. The other option I like is to put the resistor in series with the voltmeter and multiply the voltage that is read by 11 to get the proper voltage reading.
Thanks again for the assistance.
 
  • #12
Just so you know, there is a pretty good bet that the current meter with a series resistor is not likely to yield a higher impedance than a plain old voltmeter. Also, when the current meter is on its lowest range, its internal impedance is the highest of all the current ranges. Even though your meter has only one setting, it still likely has ranges internally (autoranging). So you will be given a current reading and apply it to the series resistor. BUT, on the lowest range there will be some voltage dropped across the current meter itself that you have to add in. How will you know how much to add?
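To show where that unknown drop would enter the arithmetic (the shunt value below is an assumption for illustration, not a spec of any particular meter):

```python
# On a current range the meter drops I * R_shunt across its own shunt.
# The voltage you can compute directly is I * R_SERIES; the true total
# also includes the (range-dependent, often unknown) burden drop.
R_SERIES = 100e6          # external series resistor
R_SHUNT = 1e3             # assumed shunt on the lowest current range
i_measured = 48e-9        # example reading: 48 nA

v_from_resistor = i_measured * R_SERIES            # 4.8 V
v_total = i_measured * (R_SERIES + R_SHUNT)        # adds the burden drop
print(v_from_resistor, v_total - v_from_resistor)  # value and burden error
```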
-
You still haven't told us what voltage you are trying to measure. AC? DC? What range? Can you use a battery with a pot for the variable source? You can probably get by with one voltmeter in the setup I described. Once you have zeroed the meter, take it out of circuit and measure the variable source.
 