Changing voltmeter resistor value

In summary, the conversation discusses whether the resistor inside a regular voltmeter with a 10 megohm input impedance can be changed to 100 megohms in order to get a more accurate voltage reading from a high impedance source. It is noted that this may not be a feasible solution, as it could cause problems with the rest of the circuitry and reduce accuracy. Other suggestions include using a potential divider or a high impedance booster amplifier. It is also mentioned that more expensive voltmeters with higher internal resistance exist for more accurate measurements. The conversation also touches on using a variable voltage source and a second meter, as a null method, to measure the voltage accurately.
  • #1
Idea04
I am trying to measure the voltage from a high impedance source with a regular voltmeter that has an input impedance of 10 megohms, but the reading is inaccurate. Is it possible to change the resistor value inside the voltmeter to 100 megohms to get a more accurate reading?
 
  • #2
No, but if your meter is sensitive enough you can use your 100M resistor as a potential divider in series with your 10M meter to increase the impedance presented to the measurand.

If dividing the voltage to be measured by a factor of about 11 loses too much, you will need to use a high impedance booster amplifier.
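As a rough sketch of that divider arithmetic (assuming an ideal 10 Mohm meter and a 100 Mohm series resistor; values are illustrative only):

```python
# Voltage division when a 100 Mohm resistor is placed in series with a 10 Mohm meter.
R_SERIES = 100e6   # external series resistor (ohms)
R_METER = 10e6     # meter input resistance (ohms)

def meter_reading(v_source):
    """Voltage that appears across the meter for a given source voltage."""
    return v_source * R_METER / (R_METER + R_SERIES)

v = 100.0                      # example source voltage (volts)
print(meter_reading(v))        # ~9.09 V, i.e. the source voltage divided by 11
print(v / meter_reading(v))    # division factor: (100M + 10M) / 10M = 11
```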
 
  • #3
What you measure as 10 Megohms is actually a string of resistors in series, which add up to 10 Megohms. These give the various voltage ranges of the voltmeter.
This is quite a good meter already as many multimeters only have 1 Megohm total resistance.

So, although you probably could redesign the voltmeter to use larger resistors, there would then be possible problems with the rest of the circuitry.

I have seen a 10 Megohm voltmeter hold a static charge after a measurement and this took seconds to dissipate. It would be worse if the reading was on a 100 Megohm meter.

If you just put a large resistor in series with the meter, the accuracy would decrease because the meter would then be reading 1000 volts on a 100 volt scale (for example), so you would lose one decimal place.

If you know the impedance of your source, and this is comparable to that of the meter, you can calculate the true voltage from the meter reading.
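A minimal sketch of that correction, assuming the source behaves as an ideal voltage behind a known resistance:

```python
# Correct a meter reading for loading by the source impedance.
# The 10 Mohm meter and the source resistance form a voltage divider,
# so the true (open-circuit) voltage is the reading scaled back up.
def true_voltage(v_read, r_source, r_meter=10e6):
    return v_read * (r_source + r_meter) / r_meter

# Example: a 10 Mohm source read on a 10 Mohm meter shows half the real voltage.
print(true_voltage(v_read=5.0, r_source=10e6))   # 10.0 V
```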
 
  • #4
If you are measuring DC you could just use the 10 meg voltmeter and a variable voltage source, and adjust the source until the voltmeter reads zero. Connect one voltmeter lead to the + terminal of each source, with the remaining leads of the two sources connected together. Of course, you would need a second meter to read your adjustable source. REALLY high input impedance voltmeters will do this sort of thing internally, all automatic. After all, how can you load a voltage source when the voltage across your meter is zero? Make sense?
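A minimal sketch of why the null method avoids loading the source (the component values here are made up for illustration):

```python
# Null (potentiometric) measurement: adjust a known source until the meter
# between it and the unknown source reads zero. At that point no current
# flows through the meter, so the unknown source is not loaded at all.
R_METER = 10e6    # meter resistance (ohms); irrelevant once the null is reached
R_SOURCE = 500e6  # hypothetical high source impedance (ohms)
V_UNKNOWN = 12.0  # hypothetical unknown voltage (volts)

def meter_current(v_adjustable):
    """Current through the meter for a given setting of the adjustable supply."""
    return (V_UNKNOWN - v_adjustable) / (R_SOURCE + R_METER)

print(meter_current(10.0))   # some current flows, so the source is still loaded
print(meter_current(12.0))   # zero current at the null: now read the adjustable supply
```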
 
  • #5
I know there are voltmeters out there with really high internal resistance for measuring voltages more accurately. The problem is they are very expensive. The meter I am using, I believe, only has one setting: one for DC volts and one for AC volts. When I connect it to the source, it reads a higher voltage and then drops rapidly to a lower value. I don't know the internal resistance of the source since my ohmmeter will only read as high as 40M ohm. I was thinking that replacing the 10M ohm resistor with a 100M ohm resistor would solve this problem and I would be able to read a more accurate voltage.
 
  • #6
Idea04 said:
I know there are voltmeters out there with really high internal resistance for measuring voltages more accurately. The problem is they are very expensive. The meter I am using, I believe, only has one setting: one for DC volts and one for AC volts. When I connect it to the source, it reads a higher voltage and then drops rapidly to a lower value. I don't know the internal resistance of the source since my ohmmeter will only read as high as 40M ohm. I was thinking that replacing the 10M ohm resistor with a 100M ohm resistor would solve this problem and I would be able to read a more accurate voltage.
You mean breaking open the meter and replacing the resistors? vk6kro already mentioned this, but there are actually several resistors that add up to the 10 Mohm. Usually they are decade resistors, like 9M + 900K + 90K + 9K + 900 + 100 = 10M. The chip itself usually only has one range, so the decade resistors form a voltage divider that attenuates the signal a specific amount (x10, x100, x1000...) into the range of the chip.
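A sketch of such a decade string and the attenuation seen at each tap (the exact arrangement inside any particular meter may differ):

```python
# Decade divider string totalling 10 Mohm (listed top to bottom).
string = [9e6, 900e3, 90e3, 9e3, 900, 100]   # ohms
total = sum(string)
print(total)                                  # 10000000.0 (10 Mohm)

# Attenuation at each tap = resistance below the tap / total resistance.
for i in range(1, len(string)):
    ratio = sum(string[i:]) / total
    print(f"tap {i}: divide by {1 / ratio:.0f}")   # 10, 100, 1000, 10000, 100000
```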

That's with manual range though. I don't know how auto-ranging meters do it. Is yours autoranging?

The problem with replacing the resistors with ones 10 times bigger is that the A/D converter chip will draw a small amount of current when voltage is applied to it. That current is small enough that, with resistors in the Mohm range, the loading effect is negligible and the reading stays accurate. If the resistors are too large, though, the loading effect could make the reading wrong.
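A rough sketch of that loading error; the converter input current used here is an assumed figure for illustration, not a datasheet value:

```python
# Offset produced by the A/D converter's input current flowing in the divider.
I_INPUT = 1e-9            # assumed converter input current (1 nA) -- illustrative only

def tap_error(r_top, r_bottom):
    """Offset voltage at the tap = input current x Thevenin resistance of the divider."""
    r_thevenin = (r_top * r_bottom) / (r_top + r_bottom)
    return I_INPUT * r_thevenin

print(tap_error(9e6, 1e6))     # ~0.9 mV error with the stock 10 Mohm string
print(tap_error(90e6, 10e6))   # ~9 mV error if every resistor is made 10x larger
```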

That's probably why the higher impedance meters are more expensive; the chip itself needs much higher input impedance too.

Your best bet, without getting a new meter, is probably what the other guys suggested: Take your 100 Mohm resistor (make sure it's precision, my multimeter uses 0.5% tolerance, I think) and put it in series with what you're measuring, and multiply what your meter reads by 11 (not 10).
 
  • #7
If you know the source impedance, you can figure it out. Or if another range has a different input impedance, and still enough accuracy and precision, you might figure out something from that.
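One way to sketch that idea (assuming the source is a fixed voltage behind a fixed resistance, and that two genuinely different input resistances are available; all numbers made up):

```python
# Two readings taken with two different meter input resistances let you
# solve for both the open-circuit voltage and the source resistance.
def solve_source(v1, r1, v2, r2):
    """v1, v2: readings taken with meter resistances r1, r2 (ohms)."""
    r_source = (v2 - v1) / (v1 / r1 - v2 / r2)
    v_open = v1 * (r1 + r_source) / r1
    return v_open, r_source

def reading(v_open, r_source, r_meter):
    """Simulate what a meter of resistance r_meter would display."""
    return v_open * r_meter / (r_meter + r_source)

# Pretend source: 12 V behind 30 Mohm.
v1 = reading(12.0, 30e6, 10e6)            # reading on the 10 Mohm input
v2 = reading(12.0, 30e6, 1e6)             # reading on a hypothetical 1 Mohm input
print(solve_source(v1, 10e6, v2, 1e6))    # recovers roughly (12.0, 30000000.0)
```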
 
  • #8
If you are measuring DC voltage, use a 100 M resistor and a current range on the multimeter.
 
  • #9
Idea04 said:
I know there are voltmeters out there with really high internal resistance for measuring voltages more accurately. The problem is they are very expensive. The meter I am using, I believe, only has one setting: one for DC volts and one for AC volts. When I connect it to the source, it reads a higher voltage and then drops rapidly to a lower value. I don't know the internal resistance of the source since my ohmmeter will only read as high as 40M ohm. I was thinking that replacing the 10M ohm resistor with a 100M ohm resistor would solve this problem and I would be able to read a more accurate voltage.

This meter only has one DC voltage scale. So, it is a dedicated DC and AC electronic voltmeter.

What is the device you are trying to measure the voltage of ?
What voltage do you read initially, and what voltage does it drop to?

If you are measuring high voltage it is still possible to get electrostatic voltmeters which have very high DC resistance.
As an example (only):
http://cgi.ebay.com.au/RAAF-5kV-ELECTROSTATIC-VOLTMETER-c-1944-/180614032568?pt=AU_Militaria&hash=item2a0d6f6cb8
 
  • #10
I see my post wasn't commented on. Perhaps I need to draw a pic. Didn't think it was that complicated.

Adjust the power supply until voltmeter A reads zero. At this point it should be obvious that no current is being drawn from the super duper high impedance source. Now read voltmeter B, that is the voltage you are looking for.
 

Attachments

  • voltmeter_setup.bmp (30.5 KB)
  • #11
Thanks for all the input, I do appreciate it. I don't have a variable DC voltage supply to compare the two voltages; I am limited in my electrical supplies. I do like the idea of using the current meter with the resistor to measure voltage, since current x resistance equals voltage. The other option I like is to put the resistor in series with the voltmeter and multiply the voltage that is read by 11 to get the proper voltage reading.
Thanks again for the assistance.
 
  • #12
Just so you know, it's a pretty good bet that the current meter with a series resistor won't present a higher impedance than a plain old voltmeter. Also, when the current meter is on its lowest range, its internal impedance is the highest of all the current ranges. Even though your meter has only one setting, it still likely has internal ranges (autoranging). So you will be given a current reading and apply it to the series resistor. BUT, on the lowest range some voltage will be dropped across the current meter itself, and you have to add that in. How will you know how much to add?
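A sketch of that correction, with an assumed burden voltage (the real figure depends on the meter and the range in use):

```python
# Measuring voltage via the current range and a known series resistor.
# The voltage dropped across the current meter itself (the "burden") has
# to be added back in, and on the most sensitive range it is largest.
R_SERIES = 100e6      # known series resistor (ohms)
V_BURDEN = 0.2        # assumed drop across the current meter (volts) -- illustrative only

i_read = 0.45e-6      # example current reading: 0.45 microamps
v_source = i_read * R_SERIES + V_BURDEN
print(v_source)       # ~45.2 V; the source's own resistance would add a further drop
```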
-
You still haven't told us what voltage you are trying to measure. AC? DC? What range? Can you use a battery with a pot for the variable source? You can probably get by with one voltmeter in the setup I described. Once you have zeroed the meter, take it out of circuit and measure the variable source.
 

FAQ: Changing voltmeter resistor value

1. How does changing the voltmeter resistor value affect the accuracy of voltage measurements?

Changing the voltmeter resistor value alters the input impedance of the voltmeter. A higher input impedance draws less current from the circuit under test and so gives a more accurate reading from a high impedance source. However, making the internal resistors very large introduces other errors, such as offsets from the converter's input current and increased susceptibility to stray charge and noise.

2. What is the purpose of changing the voltmeter resistor value?

The purpose of changing the voltmeter resistor value is to set the input impedance of the voltmeter so that it is much larger than the source impedance of the circuit being measured. This ensures that the voltmeter draws negligible current and does not significantly disturb the circuit or the accuracy of the measurement.

3. How do I determine the appropriate voltmeter resistor value to use?

An appropriate value can be chosen by comparing the voltmeter's input impedance with the source impedance of the circuit being measured. As a rule of thumb, the voltmeter's input impedance should be at least 100 times the source impedance, so that loading changes the reading by less than about 1%.
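A rough sketch of that comparison (the rule-of-thumb figures are illustrative):

```python
# Fractional loading error when a meter of resistance r_meter is placed
# across a source of resistance r_source.
def loading_error(r_source, r_meter):
    return r_source / (r_source + r_meter)

# For the error to stay below 1%, the meter should be roughly 100x the source resistance.
print(loading_error(r_source=100e3, r_meter=10e6))   # ~0.0099 (about 1% low)
print(loading_error(r_source=10e6, r_meter=10e6))    # 0.5 (reading is half the true voltage)
```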

4. Can changing the voltmeter resistor value damage the circuit being measured?

Changing the voltmeter resistor value should not damage the circuit being measured as long as the voltmeter is properly calibrated and the value stays within a sensible range. The main risk is to the accuracy of the measurement: too low an input impedance loads the circuit, while very high internal resistances make the meter more sensitive to leakage, input currents, and stray pickup.

5. How does changing the voltmeter resistor value affect the sensitivity of the voltmeter?

Changing the voltmeter resistor value changes the range of voltages the meter can measure. In a classic series-multiplier arrangement, a larger resistor extends the full-scale range to higher voltages, but small voltages then produce less deflection, so resolution at the low end is reduced. Very high resistor values also make the meter more susceptible to noise and input-current errors.
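As an illustration for a classic moving-coil movement (a digital meter's input divider behaves differently; the figures below are assumed, not taken from the thread):

```python
# For a moving-coil movement, the series (multiplier) resistor sets the
# full-scale voltage range: V_full_scale = I_full_scale * (R_multiplier + R_coil).
I_FS = 50e-6      # assumed 50 microamp full-scale movement
R_COIL = 2e3      # assumed coil resistance (ohms)

for r_mult in (18e3, 198e3, 1.998e6):
    print(I_FS * (r_mult + R_COIL))   # 1 V, 10 V, 100 V full scale
```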
