# Why do short circuits have zero voltage?

#### Vellyr

I've searched around and all I can find are people quoting the equations. I see it in the math, but it seems contradictory. Here is my line of thought:

1. A source maintains a voltage difference across its terminals.
2. The difference in voltage causes charge to flow from higher to lower potential around the circuit.
3. When you directly connect the battery terminals, the voltage goes to 0 and current goes to infinity. How can the voltage across the terminals drop below the emf, when that is a physical property of the source? Where does the extra voltage, as rated on the battery, "go"? How can current flow with nothing to "push" it?

I guess I'm also not really clear why voltage depends on resistance. I thought it was entirely based on charge density and its arrangement in space.

Instead of regarding a short circuit as the impossible-to-achieve exactly 0 ohms, choose a more realistic value: as small as is reasonable for the situation. In some situations 10 ohms might be a realistic value; other cases might regard 0.01 ohms as a good value.

It doesn't need to be 0.0000000 ohms, just a low-resistance path that causes the rest of the circuit to be bypassed by most of the current.

OK, but then why is V very small? My intuition says that V across the entire circuit should be constant and equal to the emf (for an ideal battery). When you take a wire and connect the terminals of a battery, why do you only get a tiny voltage? Isn't the voltage determined by the distribution of charge in the battery?

You get a tiny voltage from a battery if you overload it. This causes it to produce internal heat and operate differently to how it was designed.

You can model a practical battery as comprising an ideal constant voltage source together with a resistance. If this battery resistance is 2 ohms it will still allow the battery to power a radio or a small light bulb, but if you hold a screwdriver shaft across the battery terminals, this "short circuit" current will be limited by the battery's 2 ohms. There just won't be a huge current through your low resistance screwdriver, in this example, so there won't be much voltage across it.
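The model above is easy to sketch numerically. Here's a minimal illustration using assumed values (a 1.5 V cell with the 2 ohm internal resistance from the example); the numbers are made up, not measurements of a real battery:

```python
# Model: real battery = ideal EMF in series with an internal resistance R_INT.
# All component values here are assumed for illustration.
EMF = 1.5    # volts, the ideal source inside the battery
R_INT = 2.0  # ohms, assumed internal resistance

def terminal_voltage(r_load):
    """Voltage across the battery terminals for a given external load (ohms)."""
    current = EMF / (R_INT + r_load)  # Ohm's law around the whole loop
    return current * r_load          # drop across the external load

print(terminal_voltage(100.0))  # radio-like load: close to the full EMF
print(terminal_voltage(0.01))   # screwdriver "short": nearly zero volts
```

With a 100 ohm load almost all of the EMF appears at the terminals; with a 0.01 ohm "screwdriver" nearly all of it drops across the internal 2 ohms instead, which is exactly why the terminal voltage collapses.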

• Vellyr
I think I get it. There are no elements in the circuit, so nearly all of the voltage drops across the internal resistance of the battery and there is nearly zero delta-V between any two points along the wire. So say you have a battery with the terminals wired directly to one another. Will a voltmeter attached to this always read 0, or is there any way to measure the battery voltage with it?

A very sensitive meter can read the tiny voltage across any piece of wire you wish to use.

Voltage is analogous to pressure in a pipe. Close a valve in the pipe and "upstream" pressure goes to some maximum value (voltage at infinite resistance). Open the valve and the pressure declines as a function of the flow (flow analogous to current).

The overheating of a battery is due to internal resistance, which has an associated voltage drop. If you include that in the circuit it becomes clearer where the voltage goes.

I just wanted to pitch in with a question:

If you short circuit a battery, the overall resistance of the circuit increases substantially, thus dropping the voltage. As a result does the current increase as to be consistent with the wattage? What's the relationship in the voltage drop over a long distance, i.e. a power line transferring electrical energy to another city?

1. A source maintains a voltage difference across its terminals.
snip
3. When you directly connect the battery terminals, the voltage goes to 0 and current goes to infinity.

1 and 3 are clearly contradictory. It can't both maintain a voltage and go to zero.

The solution is to understand the difference between ideal voltage sources and real world voltage sources. Ideal voltage sources maintain a voltage on their terminals and cannot be short circuited. That's effectively an illegal condition. Just as it is illegal to open circuit an ideal current source (the voltage would rise to infinity).

In the real world, voltage sources are not ideal. They might appear to be ideal but will always have some internal resistance, even if it's only the resistance of the terminals or the plates they're made from. Let's call that internal resistance Rint. If you short circuit one of these real-world voltage sources, the internal EMF stays the same, the external voltage falls to nearly zero, and the current will be I = EMF/Rint. If Rint is very small then the current can be very large.

However there is also no such thing as a short circuit in the real world. If we ignore superconductors for the moment, all wires no matter how thick have some resistance. Let's call that Rshort. So the external voltage may appear to be zero but it will actually depend on the relative magnitudes of Rint and Rshort. You can use the potential divider rule to work out what the external voltage will actually be.
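As a sketch, here is the potential divider calculation with assumed values for Rint and Rshort (the numbers are illustrative only):

```python
# Potential divider: external voltage = EMF * Rshort / (Rint + Rshort).
# All values below are assumed for illustration.
EMF = 1.5        # volts
R_INT = 0.5      # ohms, internal resistance of the cell (assumed)
R_SHORT = 0.01   # ohms, resistance of the shorting wire (assumed)

v_external = EMF * R_SHORT / (R_INT + R_SHORT)  # voltage across the wire
i_short = EMF / (R_INT + R_SHORT)               # current around the loop

print(f"external voltage: {v_external:.4f} V")  # tiny, but not exactly zero
print(f"short-circuit current: {i_short:.2f} A")
```

This also answers the voltmeter question above: a sensitive enough meter across the wire would read this small v_external, not zero.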

If you short circuit a battery, the overall resistance of the circuit increases substantially, thus dropping the voltage.

No that's not the way to think about it. If you short circuit a battery the overall resistance of the circuit REDUCES. Previously there was a high resistance air gap between the terminals, now there is a low resistance wire between the terminals. The reduced resistance causes the current to increase dramatically. The voltage drop across the internal resistance increases from near zero until it more or less equals the EMF.

As a result does the current increase as to be consistent with the wattage?

I don't understand that question.

What's the relationship in the voltage drop over a long distance, i.e. a power line transferring electrical energy to another city?

The voltage drop depends on the current flowing and the resistance of the wire: Vdrop = I*R.

So if the voltage at the generator is Vgen and at the city = Vcity then

Vcity = Vgen-Vdrop
or
Vcity = Vgen-I*R
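A quick numeric sketch of that formula, with made-up line values (not data for any real transmission line):

```python
# Vcity = Vgen - I*R, with illustrative (assumed) numbers.
V_GEN = 110_000.0  # volts at the generator (assumed)
I_LINE = 200.0     # amps flowing in the line (assumed)
R_LINE = 25.0      # ohms total line resistance (assumed)

v_drop = I_LINE * R_LINE  # Vdrop = I*R
v_city = V_GEN - v_drop   # voltage arriving at the city
print(v_city)
```

This is also why long-distance lines run at high voltage: for a given power delivered, a higher voltage means a smaller current, and so a smaller I*R drop (and I^2*R loss) along the wire.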

PS: Your question isn't connected to short circuits so you should start a new thread if you wish to discuss further.