
Difference in current to load with series/parallel sources

  1. Apr 28, 2011 #1
    I feel this is a very simple question, but I can't seem to wrap my head around it.

    For example, if you have two 1.5 volt batteries in series and a load of 100 ohms, then the total voltage is 3 volts and the current is 30mA because I=V/R. If you have two 1.5 volt batteries in parallel, then the total voltage is 1.5 volts and the current is doubled, so if you have a 100 ohm load then each battery supplies 15mA and the total current is still 30mA, right? I've read a lot of posts discussing how series increases voltage and parallel increases current, but that's with an open circuit. When you connect a load, doesn't it all amount to the same current being delivered in the end despite the difference in voltage? I guess I don't understand the benefit or advantage of the different setups.
     
  3. Apr 28, 2011 #2
    Batteries are constant voltage sources. There's usually no point to hooking them up in parallel, just like you would never hook current sources in series. It just doesn't make sense.

    If you hook the two batteries up in parallel, assuming that their voltages are very closely matched, they will give 1.5V to the load, putting 15mA through it. This current will be divided between the batteries, with each battery providing 7.5mA.

    Resistors are Ohmic devices. Under normal operation, their current is always proportional to their voltage (V = IR), with no exception (barring really extreme circumstances).
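    The arithmetic above can be sketched directly. This is a minimal illustration assuming ideal batteries (zero internal resistance) and the 1.5V cells and 100 ohm load from the question:

```python
# Series vs. parallel 1.5V batteries into a 100 ohm load (ideal sources assumed).
V_CELL = 1.5    # volts per battery
R_LOAD = 100.0  # ohms

# Series: voltages add; one current flows through everything.
v_series = 2 * V_CELL
i_series = v_series / R_LOAD          # I = V/R

# Parallel: the load still sees one cell's voltage, so the load current
# is set by that voltage alone, then split evenly between the batteries.
v_parallel = V_CELL
i_parallel = v_parallel / R_LOAD
i_per_battery = i_parallel / 2

print(f"series:   {i_series * 1000:.1f} mA through the load")
print(f"parallel: {i_parallel * 1000:.1f} mA through the load, "
      f"{i_per_battery * 1000:.2f} mA per battery")
```

    This gives 30mA for the series case and 15mA (7.5mA per battery) for the parallel case, matching the answer above.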
     
  4. Apr 28, 2011 #3
    So two 1.5 volt batteries in parallel will supply 15 mA, not 30, because the current depends solely on the voltage, if I understand you correctly. So why is it that car batteries are sometimes connected in parallel to deliver larger amounts of current (this is what I've heard people say)? Car batteries are generally 12V and supply DC, so why would this supply more current? Shouldn't it supply less, because with any load the current is directly proportional to the voltage, so batteries in series should supply more current?
     
  5. Apr 28, 2011 #4
    Not solely on the voltage, on the resistance too.

    The problem of paralleling car batteries or other voltage sources is that if the batteries have slightly different voltages or internal resistances, one battery will supply more current than the other. If you put a very heavy load on a battery so that the internal resistance is limiting the current to the load, then adding a second battery may help. Both batteries should be fairly evenly matched however.
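    The heavy-load case described above can be sketched numerically. The values below are illustrative assumptions (a 12V battery with a small internal resistance driving a very low-resistance load, as a starter motor would be), not figures from the thread:

```python
# Why paralleling matched batteries helps when internal resistance limits
# the current. All values are illustrative assumptions.
V_EMF = 12.0   # volts, battery EMF
R_INT = 0.05   # ohms, internal resistance per battery (assumed)
R_LOAD = 0.10  # ohms, a very heavy load, e.g. a starter motor (assumed)

# One battery: its internal resistance is in series with the load.
i_one = V_EMF / (R_INT + R_LOAD)

# Two matched batteries in parallel: the internal resistances parallel
# to R_INT / 2, so less voltage is lost inside the batteries.
i_two = V_EMF / (R_INT / 2 + R_LOAD)

print(f"one battery:  {i_one:.0f} A to the load")
print(f"two parallel: {i_two:.0f} A to the load")
```

    With these numbers one battery delivers 80A and the parallel pair 96A, which is the point being made: the second battery only buys you current when internal resistance is what's limiting it.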

    I have designed high reliability circuits that required 100% redundancy. That meant paralleling 5V regulators. To do this I used 6V regulators and added enough series resistance to the output of the regulators to bring the voltage down to 5V. That wasn't a problem because my current drain was constant.
     
  6. Apr 28, 2011 #5
    Batteries have a maximum amount of current that they can put out without destroying themselves. With batteries connected in parallel, the load current is evenly shared between them, so they can drive a higher load.

    Connecting them in parallel doesn't make the batteries supply more current to the load, it just increases the maximum that they can put out.
    Something's iffy about that setup, unless you left something else out.

    Under normal conditions, with both regs working, they would share the load equally (1/2 IL each). When one of the regs fails, the other reg has to pick up all the slack. With the full IL coming through it, the series resistor would drop 2V instead of just 1V, dropping the supply to 4V.
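    The objection works out like this (the load current below is an arbitrary illustrative value; the resistor is sized, as described, to drop 1V at half the load current):

```python
# Two 6V regulators with series dropper resistors feeding a common 5V rail.
V_REG = 6.0   # volts, regulator output
I_LOAD = 1.0  # amps, total load current (illustrative, assumed constant)
R_SERIES = 1.0 / (I_LOAD / 2)  # ohms: sized to drop 1V at IL/2

# Both regulators working: each carries half the load.
v_normal = V_REG - R_SERIES * (I_LOAD / 2)

# One regulator fails open: the survivor carries the full load,
# so its resistor drops twice the voltage.
v_failed = V_REG - R_SERIES * I_LOAD

print(f"both regs:  {v_normal:.1f} V")
print(f"one failed: {v_failed:.1f} V")
```

    That is the 5V-to-4V sag being objected to.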

    Doesn't seem like a very good redundancy.
     
  7. Apr 28, 2011 #6
    If you connect a 100 ohm load to two 1.5V batteries connected in parallel, you can sustain a current of 15mA twice as long as if you had only one 1.5V battery.
     
  8. Apr 28, 2011 #7
    Yes there is something I left out because I thought it would only confuse the issue. Before I tried this I called the application engineer from the regulator manufacturer and I followed his instructions. He suggested putting a diode in series with the resistor at the output of the regulator in case the regulator failed as a short and to avoid the problem you mentioned. So the diode dropped 0.7V and the resistor only 0.2V. Once he said I should use a diode I knew I had to use a higher voltage regulator. If one regulator were to fail the supply voltage would drop from about 5.1V to about 4.7V.
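    A simplified model of that diode-plus-resistor scheme, using the figures from the post (6V regulators, ~0.7V diode drop, resistor dropping 0.2V at half load) and treating the diode drop as constant. The load current is an arbitrary illustrative value:

```python
# Diode + resistor redundancy scheme, constant-drop diode model (idealized).
V_REG = 6.0    # volts, regulator output
V_DIODE = 0.7  # volts, assumed constant diode forward drop
I_LOAD = 1.0   # amps, total load current (illustrative)
R_SERIES = 0.2 / (I_LOAD / 2)  # ohms: sized to drop 0.2V at IL/2

# Both regulators working: each path carries half the load.
v_normal = V_REG - V_DIODE - R_SERIES * (I_LOAD / 2)

# One regulator fails (the diode isolates a shorted failure):
# the survivor carries the full load current.
v_failed = V_REG - V_DIODE - R_SERIES * I_LOAD

print(f"both regs:  {v_normal:.1f} V")
print(f"one failed: {v_failed:.1f} V")
```

    This idealized model gives about 5.1V normally and 4.9V after a failure; the post's ~4.7V figure will reflect real behavior, since diode and regulator drops both vary with current.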
     
  9. Apr 28, 2011 #8
    +1

    Simple and to the point.
     
  10. Apr 28, 2011 #9
    That makes more sense.
     
  11. Apr 29, 2011 #10
    Thank you Jiggy-Ninja and everyone else who commented. It all makes a bit more sense now!
     