ddobre

## Homework Statement

You have a voltmeter with an internal resistance of 10 MΩ. You would like to measure a very large voltage source, but voltages this high overwhelm the voltmeter and give inaccurate readings. You design the circuit shown above as a workaround. When measuring a 15 kV source, you would like your voltmeter to read 50 V. What value of R must you use to achieve this?

## Homework Equations

Vab = (emf) - Ir

V = IR

(Parallel): 1/Req = 1/R1 + 1/R2

(Series): Req = R1 + R2
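As a quick sanity check on those combination rules, here is a minimal sketch (the helper names `parallel` and `series` are my own, not from the problem):

```python
def parallel(*rs):
    """Equivalent resistance of resistors in parallel: 1/Req = sum(1/Ri)."""
    return 1 / sum(1 / r for r in rs)

def series(*rs):
    """Equivalent resistance of resistors in series: Req = sum(Ri)."""
    return sum(rs)

# e.g. the 10 MΩ voltmeter in parallel with a 15 MΩ resistor:
print(parallel(10e6, 15e6))  # 6 MΩ
```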

## The Attempt at a Solution

The first thing I tried was to calculate the current using Vab = (emf) − Ir with Vab = 50 V, emf = 15,000 V, and r = 10 MΩ, which gave I = 0.001495 A. After calculating this I got a little confused about which equation to use next because of the location of the voltmeter. I know that in series the current stays the same, but the voltmeter also acts like a resistor, one that is in parallel with the 15 MΩ resistor.

One attempt was to calculate Req for the voltmeter in parallel with the 15 MΩ resistor, giving Req = 6,000,000 Ω. From there, I tried setting up a series equation with that Req as a single resistor in series with the R I am looking for. But this is where I got confused. I tried equating ratios using V1/R1 = V2/R2: (50 V)/(6,000,000 Ω) = (15,000 V)/R, getting R = 1.8 × 10⁹ Ω. But I think this is out of proportion with the rest of the circuit. Any advice?
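The divider arithmetic above can be sketched numerically. This assumes the circuit is the 15 kV source in series with R, feeding the voltmeter (10 MΩ) in parallel with the 15 MΩ resistor; note the voltage across R is emf − 50 V, not the full emf:

```python
emf = 15_000.0   # source voltage, V
v_meter = 50.0   # desired voltmeter reading, V
r_meter = 10e6   # voltmeter internal resistance, ohms
r_shunt = 15e6   # resistor in parallel with the voltmeter, ohms

# Parallel combination of the voltmeter and the 15 MΩ resistor
r_eq = 1 / (1 / r_meter + 1 / r_shunt)   # 6 MΩ

# Series circuit: the same current flows through R and the parallel pair,
# so I = v_meter / r_eq, and R drops the remaining (emf - v_meter) volts.
current = v_meter / r_eq                 # about 8.33 microamps
R = (emf - v_meter) / current            # about 1.794e9 ohms

print(f"Req = {r_eq:.3e} Ω, I = {current:.3e} A, R = {R:.4e} Ω")
```

This suggests the 1.8 × 10⁹ Ω figure was essentially the total series resistance R + Req; subtracting Req (or equivalently using 14,950 V across R instead of 15,000 V) gives R ≈ 1.794 × 10⁹ Ω.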