# Voltage vs Fractional Resistance

## Main Question or Discussion Point

Couldn't we just say the voltage could be variable, hold that as a constant, and design circuits which could run off completely safe 12 V systems? Why not use fractional resistors to get the amperage to an arbitrary value?

In other words could you just make a resistor valued at, say, 0.10 ohms, to generate the same amperage as a 1 ohm system running at 10 times the voltage? So why even play with higher voltages?

berkeman
Mentor
> Couldn't we just say the voltage could be variable, hold that as a constant, and design circuits which could run off completely safe 12 V systems? Why not use fractional resistors to get the amperage to an arbitrary value?
>
> In other words could you just make a resistor valued at, say, 0.10 ohms, to generate the same amperage as a 1 ohm system running at 10 times the voltage? So why even play with higher voltages?
You generally use higher voltages to support higher powers, since the currents at low voltages can get large enough to require thick conductors (higher cost, weight, etc.).

Do you have specific examples in mind?

meBigGuy
Gold Member
You don't exactly "generate" amperage in the sense you mean. You really need to think in terms of power delivered, which is usually what you are after in the long run.

Say we have a 1 kΩ load and it requires 100 mA (that's 10 W) to do what we want it to do. We need 100 V to get that current. As you said, if the resistance were 100 Ω, we would only need 10 V to get 100 mA, but that is only 1 W, so the light wouldn't be as bright (or whatever). To still get 10 W we need to go up to 31.6 V, which drives 316 mA.

If we want 10 W at only 10 V, we need to go down to 10 Ω, which means 1 A. Now we need conductors that carry 10× the current, or we accept more power loss.
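The three combinations above follow directly from Ohm's law and the power law (V = I·R, P = V·I). A minimal sketch (not from the thread) that recomputes them:

```python
import math

# For a fixed delivered power, a smaller load resistance means
# less voltage but more current: I = sqrt(P/R), V = I*R.
target_power = 10.0  # watts, as in the example above

for resistance in (1000.0, 100.0, 10.0):  # ohms
    current = math.sqrt(target_power / resistance)
    voltage = current * resistance
    print(f"R = {resistance:6.1f} ohm -> V = {voltage:5.1f} V, I = {current * 1000:6.1f} mA")
```

Running it reproduces the 100 V / 100 mA, 31.6 V / 316 mA, and 10 V / 1 A cases from the post.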

Power loss caused by high currents (proportional to the current squared) is the driving factor behind using 700 kV or more on high-power transmission lines.
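To see why transmission voltage matters so much, here is a sketch with made-up but plausible numbers (100 MW delivered over a line with 10 Ω of conductor resistance; the function name and figures are illustrative, not from the thread):

```python
# Line loss for a given delivered power and transmission voltage:
# I = P/V, and the conductors dissipate P_loss = I^2 * R.
def line_loss(power_w, voltage_v, line_resistance_ohm):
    current = power_w / voltage_v
    return current**2 * line_resistance_ohm

power = 100e6          # 100 MW delivered
resistance = 10.0      # ohms of total line resistance (assumed)

for voltage in (100e3, 700e3):
    loss = line_loss(power, voltage, resistance)
    print(f"{voltage / 1e3:.0f} kV: I = {power / voltage:.0f} A, "
          f"loss = {loss / 1e6:.2f} MW ({100 * loss / power:.2f} %)")
```

Raising the voltage 7× cuts the current 7× and the I²R loss 49×, which is the whole argument for high-voltage transmission.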

Does that help?

What law says there's power loss proportional to current squared?

No, didn't have specific application in mind.

Thanks.

SteamKing
Staff Emeritus
Homework Helper
Derived from P = V * I = (I * R) * I = I^2 * R, so 10× the current means 100× the power lost as heat...

That formula is about power as a function of current and resistance. It says nothing about heat loss at higher current.

Talking about conductors: all of the power lost in them is lost as heat. So for the same amount of loss, 2× the current needs 1/4 of the resistance, which means roughly 4× the copper cross-section.
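The "4× the copper" claim can be checked numerically. Since P_loss = I²·R, holding the loss budget fixed while doubling the current forces the resistance down by 4×; and since R = ρ·L/A, a 4× smaller resistance for the same length means a 4× larger cross-section. A small sketch (the function name and the 1 W budget are illustrative assumptions):

```python
# Resistance allowed for a given current under a fixed loss budget:
# from P_loss = I^2 * R, we get R = P_loss / I^2.
def required_resistance(current_a, allowed_loss_w):
    return allowed_loss_w / current_a**2

base = required_resistance(1.0, 1.0)     # 1 A, 1 W budget
doubled = required_resistance(2.0, 1.0)  # 2 A, same 1 W budget

# Resistance must drop 4x, so (for the same length) the
# conductor cross-section, i.e. the copper, grows ~4x.
print(base / doubled)
```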

meBigGuy
Gold Member
My reference to power lost as the square of current is relative to the system being measured. For example, if I have 1 amp at one volt running through a circuit, it is somehow dissipating 1 watt as a system. But let's say the wire is 0.1 Ω and the load is 0.9 Ω. Then 0.1 watts is dissipated in the wire. If the current increases to 2 amps (by whatever method: load reduced to 0.4 Ω, or voltage doubled), then 0.4 watts is dissipated in the wire. Period! End of story!
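The wire-versus-load split above can be recomputed directly from I²R, using the same 0.1 Ω / 0.9 Ω numbers from the post:

```python
# Dissipation in each part of a series circuit is I^2 * R for
# that part; total dissipation is I^2 * (sum of resistances).
wire_r, load_r = 0.1, 0.9  # ohms, as in the post

for current in (1.0, 2.0):  # amps
    wire_w = current**2 * wire_r
    total_w = current**2 * (wire_r + load_r)
    print(f"I = {current} A: wire dissipates {wire_w:.1f} W "
          f"of {total_w:.1f} W total")
```

Doubling the current quadruples the wire's share (0.1 W to 0.4 W), regardless of how the extra current was obtained.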

The I^2 R dissipation of power is true regardless of what portion of the circuit you look at. Power adds linearly through the system.

The total system power may be dissipated as heat or any combination of work and heat (or sound, or light, etc). The heat and EM dissipation in wiring is generally assigned to the power loss category unless you are building a heater or an antenna (but that is arbitrary).