
Voltage vs Fractional Resistance

  1. Oct 10, 2013 #1
    Couldn't we just treat the voltage as a design choice, hold it constant at a completely safe 12 V, and design circuits to run off that? Why not use fractional resistors to get the amperage to an arbitrary value?

    In other words, couldn't you just make a resistor valued at, say, 0.10 ohms to get the same amperage as a 1 ohm system running at 10 times the voltage? So why even play with higher voltages?
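
    A quick numeric check of that idea (illustrative Python, using the 12 V / 0.10 ohm values above and 120 V / 1 ohm as the "ten times the voltage" case):

        # Same current does not mean same power: P = V * I grows with voltage
        # even when I = V / R is held constant.
        def current(volts, ohms):
            return volts / ohms        # Ohm's law: I = V / R

        def power(volts, ohms):
            return volts ** 2 / ohms   # P = V * I = V^2 / R

        for v, r in [(12.0, 0.10), (120.0, 1.0)]:
            print(f"{v:6.1f} V across {r:4.2f} ohm -> "
                  f"{current(v, r):6.1f} A, {power(v, r):8.1f} W")

        # Output:
        #   12.0 V across 0.10 ohm ->  120.0 A,   1440.0 W
        #  120.0 V across 1.00 ohm ->  120.0 A,  14400.0 W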
     
  3. Oct 10, 2013 #2

    berkeman


    Staff: Mentor

    You generally use higher voltages to support higher powers, since the currents at low voltages can get large enough to require thick conductors (higher cost, weight, etc.).

    Do you have specific examples in mind?
     
  4. Oct 10, 2013 #3

    meBigGuy

    Gold Member

    You don't exactly "generate" amperage in the sense you're using the word. You really need to think in terms of power delivered, which is usually what you're after in the long run.

    Say we had a 1K (1000 ohm) load and it requires 100 mA (that's 10 watts) to do what we wanted it to do. We need 100 V to get that current. As you said, if the resistance were 100 ohms, we would only need 10 V to get 100 mA, but that is only 1 watt, so the light wouldn't be as bright (or whatever). So we need to go up to 31.6 V, which causes 316 mA, for 10 W.

    If we want 10 W with 10 V we need to go down to 10 ohms, which is 1 amp. Now we need 10X bigger conductors or have to accept more power loss.
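
    A short sketch checking those combinations (assuming the same 10 W target; Python just for the arithmetic):

        import math

        # For each load resistance, find the voltage and current that deliver 10 W.
        target_power = 10.0  # watts

        for r_load in [1000.0, 100.0, 10.0]:
            v = math.sqrt(target_power * r_load)   # P = V^2 / R  ->  V = sqrt(P * R)
            i = v / r_load                          # Ohm's law: I = V / R
            print(f"{r_load:6.0f} ohm -> {v:6.1f} V, {i * 1000:7.1f} mA")

        # Output:
        #   1000 ohm ->  100.0 V,   100.0 mA
        #    100 ohm ->   31.6 V,   316.2 mA
        #     10 ohm ->   10.0 V,  1000.0 mA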

    Power loss caused by high currents (proportional to the current squared) is the driving factor behind using 700 kV or more on high-power transmission lines.
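
    A rough illustration with made-up numbers (100 MW delivered over a line with an assumed 10 ohms of resistance):

        # Line loss is I^2 * R_line; for a fixed delivered power, raising the
        # transmission voltage cuts the current, so the loss falls as 1/V^2.
        p_delivered = 100e6   # 100 MW (hypothetical)
        r_line = 10.0         # ohms of line resistance (hypothetical)

        for v in [10e3, 100e3, 700e3]:
            i = p_delivered / v       # current required at this voltage
            loss = i ** 2 * r_line    # power burned in the line itself
            print(f"{v / 1e3:5.0f} kV -> {i:8.1f} A, line loss {loss / 1e6:8.3f} MW")

        # Output:
        #    10 kV ->  10000.0 A, line loss 1000.000 MW
        #   100 kV ->   1000.0 A, line loss   10.000 MW
        #   700 kV ->    142.9 A, line loss    0.204 MW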

    Does that help?
     
  5. Oct 10, 2013 #4
    What law says there's power loss proportional to current squared?

    No, I didn't have a specific application in mind.

    Thanks.
     
  6. Oct 10, 2013 #5

    SteamKing

    Staff Emeritus
    Science Advisor
    Homework Helper

  7. Oct 11, 2013 #6
    Derived from P = V * I = (I * R) * I = I^2 * R, so 10X the current means 100X the watts lost as heat...
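
    For instance, with an assumed 0.1 ohm of wire resistance:

        r_wire = 0.1   # ohms (assumed)
        for i in [1.0, 10.0]:
            print(f"{i:4.0f} A -> {i ** 2 * r_wire:6.1f} W dissipated as heat")
        # Output:
        #    1 A ->    0.1 W dissipated as heat
        #   10 A ->   10.0 W dissipated as heat   (10x the current, 100x the heat)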
     
  8. Oct 11, 2013 #7
    That formula is about power as a function of current and resistance. It says nothing about heat loss at higher current.
     
  9. Oct 11, 2013 #8
    Talking about conductors: all of the power lost in them is heat. So for the same amount of loss, 2x the current needs 1/4 of the resistance... which means roughly 4x the copper.
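
    A quick check of the copper claim (assuming a copper resistivity of 1.68e-8 ohm·m and an arbitrary 100 m run):

        # Wire resistance is R = rho * L / A, so for a fixed length, quartering R
        # means quadrupling the cross-sectional area A.
        rho_cu = 1.68e-8    # ohm * m, resistivity of copper
        length = 100.0      # m (arbitrary example)

        def area_for(r_target):
            return rho_cu * length / r_target   # A = rho * L / R

        r1, i1 = 0.10, 10.0          # original wire resistance and current (example values)
        r2, i2 = r1 / 4, 2 * i1      # quarter the resistance, double the current

        print(f"loss before: {i1**2 * r1:.1f} W, after: {i2**2 * r2:.1f} W")   # both 10.0 W
        print(f"copper needed: {area_for(r2) / area_for(r1):.1f}x")            # 4.0x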
     
  10. Oct 11, 2013 #9

    meBigGuy

    Gold Member

    My reference to power lost as the square of current is relative to the system being measured. For example, if I have 1 amp at one volt running through a circuit, it is somehow dissipating 1 watt as a system. But let's say the wire is 0.1 ohm and the load is 0.9 ohm. Then 0.1 watts is dissipated in the wire. If the current increases to 2 amps (by whatever method: load reduced to 0.4 ohm, or voltage doubled), then 0.4 watts is dissipated in the wire. Period! End of story!
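
    In numbers, with the same 0.1 ohm wire as above:

        r_wire = 0.1   # ohms of wire resistance

        # The wire's dissipation is I^2 * r_wire regardless of how the higher
        # current comes about (smaller load or higher source voltage).
        for i in [1.0, 2.0]:
            print(f"{i:.0f} A -> {i ** 2 * r_wire:.1f} W lost in the wire")

        # Output:
        # 1 A -> 0.1 W lost in the wire
        # 2 A -> 0.4 W lost in the wire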

    The I^2 R dissipation of power is true regardless of what portion of the circuit you look at. Power adds linearly through the system.

    The total system power may be dissipated as heat or any combination of work and heat (or sound, or light, etc). The heat and EM dissipation in wiring is generally assigned to the power loss category unless you are building a heater or an antenna (but that is arbitrary).
     