# The Effects of Resistance on Voltage

1. Oct 18, 2016

### Mykhalo P

In an open circuit, a voltmeter in parallel with a battery measures the EMF. Once the circuit is closed, the measured voltage drops because of both the internal and the external resistance. My question is: how can the first resistor not affect the voltage the voltmeter measures?

Also, since an ideal voltmeter has infinite resistance, how can it measure voltage without any of the current's electrons passing through it? It's like asking how I can measure the potential energy of a ball without the ball.

Also, if voltage is like potential energy, why does the potential difference drop to zero after the current passes the last resistor, rather than when it reaches the end of the wire?

I'm sorry that these questions are basic. I just hope to gain a decent understanding of the material. Thank you to anybody who is so kind as to help me out.

2. Oct 18, 2016

### BvU

Hi Mykhalo,
Not a very clear question. Could you rephrase?
It's possible: if you use an adjustable voltage source and let it produce the exact same voltage, a meter in between the two voltage sources should show no current.
Gravitational potential energy ? Measure the height and multiply by $mg$ ?

That's a practical consideration: we assume the wire resistance can be ignored (i.e. set to zero for all practical purposes, so there is no voltage drop over the wire). This doesn't work for very low resistances!

3. Oct 18, 2016

### cnh1995

It can't. An ideal voltmeter is a theoretical thing; ideal things are theoretical and do not exist. Practically, a voltmeter has (or should have) a "very high" resistance compared to anything in the circuit, so that when connected in parallel with a component it draws a "negligible" current compared to the actual currents in the circuit. Hence it will not "disturb" the circuit parameters and will not "load" the source. This negligible voltmeter current is often assumed to be zero in practice, because we can neglect its effect in most circuits. In some circuits, the voltmeter resistance needs to be taken into account for extreme accuracy.

4. Oct 18, 2016

### QuantumQuest

Hi Mykhalo P

Have you learned about Ohm's law? This is all about it. When the circuit is open, what is the resistance of the circuit? Plug it into Ohm's law and find the current. In practice the voltmeter has a very big resistance, such that it draws only the small current it needs to function properly. So, what does this mean in the context of measuring the voltage across a voltage source in an open circuit? When you close the circuit, the total resistance changes. What does this imply for the same measurement?

There is no such thing in practice as an ideal voltmeter. In theory we say infinite resistance, but in practice this translates to a very big resistance. So there is always some current drawn by the voltmeter, although it is very small.

Every element of an electrical circuit has some resistance, however small. This has direct consequences for the voltage drop across and the current through the element (again, Ohm's law).
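To make the open-vs-closed-circuit point concrete, here is a small sketch in Python. All the numbers (9 V EMF, 0.5 Ω internal resistance, 10 Ω load) are made up for illustration:

```python
# Hypothetical numbers for a battery with internal resistance.
emf = 9.0      # V, what the voltmeter reads with the circuit open
r_int = 0.5    # ohm, internal resistance of the battery
r_ext = 10.0   # ohm, external load resistor

# Open circuit: no current flows, so there is no drop across r_int
# and the voltmeter reads the full EMF.
v_open = emf

# Closed circuit: Ohm's law gives the current, and the terminal
# voltage is the EMF minus the drop across the internal resistance.
i = emf / (r_int + r_ext)
v_closed = emf - i * r_int

print(f"open-circuit reading:   {v_open:.4f} V")
print(f"current when closed:    {i:.4f} A")
print(f"closed-circuit reading: {v_closed:.4f} V")
```

Closing the circuit changes the total resistance, current starts to flow, and the reading drops from the EMF to the terminal voltage.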

Last edited: Oct 18, 2016
5. Oct 18, 2016

### Bystander

6. Oct 18, 2016

### Aaron Crowl

There seems to be a theme of people asking questions along the lines of "How can this thing I learned about in electricity be ideal? It doesn't make sense!" The simple answer is that nothing is perfect in the real world. You are being taught approximations and it's done for a good reason.

Try this for an exercise. Make up a simple circuit with a 10V source and a 10kOhm resistor. Figure out how much voltage a meter will read on the resistor if it does not have infinite impedance. Try it with a meter that has 1MOhm and 1GOhm internal resistance. Examine the results and see how they differ.
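The exercise as stated leaves the exact topology open; one common reading is that the 10 kΩ resistor sits in series between the source and the meter (e.g. a source with 10 kΩ output resistance), so the meter's internal resistance forms a voltage divider with it. Under that assumption, a quick sketch:

```python
v_src = 10.0     # V, ideal source
r_series = 10e3  # ohm, the 10 kOhm resistor in series with the meter

def meter_reading(r_meter):
    """Voltage divider: the meter reads only its share of the source voltage."""
    return v_src * r_meter / (r_series + r_meter)

# Compare a 1 MOhm meter against a 1 GOhm meter.
for r_meter in (1e6, 1e9):
    print(f"R_meter = {r_meter:.0e} ohm -> reads {meter_reading(r_meter):.5f} V")
```

The 1 MΩ meter reads about 1% low, while the 1 GΩ meter's error is on the order of 0.001%, which is why a high input impedance matters.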

With that same circuit figure out the resistance of the wires that connect the circuit. Make all the wires 14AWG and 10cm long. Solve for the current in the circuit with the wire resistance added. Did the wire resistance make much difference?
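For the wire-resistance part, a sketch using the standard copper resistivity and the nominal 14 AWG diameter (assuming two 10 cm wires, one out and one back):

```python
import math

rho_cu = 1.68e-8    # ohm*m, resistivity of copper at room temperature
d_14awg = 1.628e-3  # m, nominal diameter of 14 AWG wire
length = 0.10       # m, 10 cm per wire
n_wires = 2         # one wire to the resistor, one wire back

# R = rho * L / A for a uniform conductor
area = math.pi * (d_14awg / 2) ** 2
r_wire = rho_cu * length / area     # resistance of one 10 cm wire
r_total_wire = n_wires * r_wire

v_src = 10.0
r_load = 10e3

i_ideal = v_src / r_load                   # wires treated as perfect
i_real = v_src / (r_load + r_total_wire)   # wire resistance included

print(f"one 10 cm wire:        {r_wire * 1e3:.3f} mohm")
print(f"current without wires: {i_ideal * 1e3:.6f} mA")
print(f"current with wires:    {i_real * 1e3:.6f} mA")
```

Each wire comes out under a milliohm, so against a 10 kΩ load the change in current is far below anything a real meter would resolve, which is exactly why the zero-resistance-wire approximation is taught.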

Last edited: Oct 18, 2016