Pulling current for charging and current limiting devices

  1. Sep 13, 2012 #1
    I had a few questions about circuits. I am a mechanical engineer so I don't have very much knowledge in electrical theory, just basic stuff, but I have been working on some basic circuit design and could use some help.

    I have three questions really.

    So I have a 3.7V battery in a device that normally charges from a 5V USB source (I think it draws around 1 amp to charge). There are of course 2 pins on the device to provide power for charging. There are also power-out pins for a 3.3V accessory; however, when I tried to draw current from these, it seemed to cap out at around 300mA. I was actually trying to draw a lot more, somewhere around 2 amps, but there must be some sort of current limiter. The voltage dropped to about 0.6V when I did this. Am I correct in assuming something is limiting the current? I am guessing that normally, if the current were too much for the circuit and wiring, the circuit would burn up, but is it true this particular circuit has some sort of device to limit the current so that it protects the circuit? In a simple circuit, wouldn't I normally be able to draw unlimited current until the circuit burns up?

    Anyway, my question is: how can the device put over 1 amp of current through the charging pins, yet not let me draw more than 300mA from the other pins? If the charging circuit can handle all of that current, why can't the power output give me the same? Maybe it can in theory, but they limit the current for other reasons, such as not draining the battery so quickly.

    Last question: can I draw power off the charging pins? Couldn't these pins just act as a power source, since I am charging with them? Normally, if I apply 5V to a 3.7V battery, I will be charging it, but what if I just attach an LED or something to those same charging pins? Will the LED light up? I'm kind of confused about how charging is accomplished and whether that makes any sense.

    Thanks,
    Nick
     
  3. Sep 14, 2012 #2

    CWatters

    Science Advisor
    Homework Helper

    Regarding the output: it's likely there is a current limiter circuit inside. Some safety standards require one to prevent battery explosions or fire if too much current is drawn (e.g. a short circuit). If the 3.3V is generated using a regulator circuit, almost all of these have some sort of current limit to protect themselves against damage. It can also be hard to prevent overheating of a small portable device with no room for heatsinks, fans, etc., so the current may be limited for that reason.
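    As a quick sanity check on those numbers, here is a short Python sketch of how a constant-current limit produces exactly the voltage collapse described in post #1. The set point, limit, and load values are assumptions inferred from the post, not measured specs.

    Code (Python):

        # Illustrative model of a current-limited 3.3 V output.
        # All values are assumptions based on the original post.
        V_SET = 3.3        # regulator set point (V)
        I_LIMIT = 0.3      # assumed current limit (A), ~300 mA as observed

        R_LOAD = 3.3 / 2.0     # a load sized to draw ~2 A at 3.3 V, i.e. 1.65 ohms

        # If the load demands more than I_LIMIT, the regulator holds the
        # current at I_LIMIT and the output voltage droops to I_LIMIT * R_LOAD.
        i_demand = V_SET / R_LOAD
        if i_demand > I_LIMIT:
            v_out = I_LIMIT * R_LOAD
        else:
            v_out = V_SET

        print(f"demanded current: {i_demand:.2f} A")   # 2.00 A
        print(f"voltage under limit: {v_out:.2f} V")   # 0.50 V, near the 0.6 V seen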

    NO! You should NOT normally apply a simple 5V voltage source directly to a 3.7V battery. For example, that would probably damage a lithium battery and would certainly be a safety/fire risk.



    4.2V is the typical maximum safe voltage for a 3.7V lithium cell, but even then you can't just connect it to a voltage source. Many, but not all, lithium batteries have protection circuits in them, but you should not rely on this. Normally there is a charge control/battery gauge circuit between the charging pins and the battery. This circuit both protects the battery and helps provide an accurate battery status meter.

    Unlikely to work, as the battery won't normally be connected directly to those pins. See above.
     
    Last edited by a moderator: Sep 25, 2014
  4. Sep 15, 2012 #3
    Thanks for the response.

    That video was really wild. Thanks for the safety tip. So is there a rule of thumb on what voltage is needed to charge a battery? For instance, my car usually gets up to 14.4 volts, but it's a 12V battery, so that is 2.4V over, whereas your example for the 3.7V battery is only 0.5V over.

    If it charges using 5V USB, does that mean the voltage is dropped to 4.2V by the time it hits the battery? How is this accomplished: by resistors dropping the voltage, or by a transformer or something? Sorry, my questions are probably way off and not intelligent.

    Thanks,
    Nick
     
  5. Sep 15, 2012 #4

    CWatters

    Science Advisor
    Homework Helper

    Different battery technologies behave very differently and need different charge strategies...

    Lead acid - Generally safe to charge using a constant voltage source (provided it's the right voltage). The battery voltage rises as it charges, and that reduces the current, so it's self-limiting. If the right voltage (called the float voltage) is used, the charger can be left connected more or less indefinitely. The float voltage is normally set slightly lower than the 100% charged voltage so that the battery doesn't bubble too much and lose electrolyte too fast. The off-charge voltage can be used to estimate the charge state.

    NiCad/NiMH - Should be charged from a constant current source, not a constant voltage source. Charging must be controlled. NiCad batteries are more tolerant of slight overcharge than NiMH. If seriously overcharged, the lifetime is reduced, or worse. Fast charging without some method of automatically switching off the charger is dangerous.

    While charging from a constant current source, the voltage rises slowly. As the battery passes 100% charged, the voltage actually falls slightly. Many fast chargers look for this voltage peak and switch off. Care is needed during charger design because the peak voltage changes with cell age and temperature, so chargers normally look for the voltage slope levelling off, or a negative slope, rather than an absolute voltage. Additional safety measures are also applied in case the peak-detect circuit fails; for example, the charge controller can limit the maximum charge time and battery temperature. There are other slight differences between NiCad and NiMH cells. The voltage vs charge curve is very flat, so voltage is not a good measure of the charge state; temperature and cell age can sometimes cause larger voltage changes than the charge state does. Fast chargers typically have a microprocessor in them to control all this.

    Lithium ion - These are typically charged in two (or even three) phases: first a constant current phase, then a constant voltage phase. Generally more complicated and dangerous than lead acid or NiCad/NiMH. More here:

    http://en.wikipedia.org/wiki/Lithium-ion_battery#Battery_charging_procedure

    See also safety requirements.

    http://en.wikipedia.org/wiki/Lithium-ion_battery#Safety
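    To make the lithium-ion scheme concrete, here is a minimal sketch of the two-phase (constant current, then constant voltage) logic described above. The threshold values are illustrative assumptions for a nominal 1Ah cell; a real charger implements this in a dedicated IC, not a software loop.

    Code (Python):

        # Hedged sketch of CC/CV lithium-ion charging (illustrative values).
        CHARGE_CURRENT = 0.5   # CC phase current (A); assumed 0.5C for a 1 Ah cell
        V_MAX = 4.2            # CV phase set point (V)
        I_TERMINATE = 0.05     # stop when the current tapers to ~C/20 (A)

        def charge_step(v_batt, i_batt):
            """Return the charger mode for one control step."""
            if v_batt < V_MAX:
                return "CC"    # hold current at CHARGE_CURRENT; voltage rises
            elif i_batt > I_TERMINATE:
                return "CV"    # hold voltage at V_MAX; current tapers off
            else:
                return "DONE"  # terminate; lithium cells must NOT be float-charged

        print(charge_step(4.1, 0.50))   # CC
        print(charge_step(4.2, 0.30))   # CV
        print(charge_step(4.2, 0.04))   # DONE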
     
  6. Sep 15, 2012 #5

    CWatters

    Science Advisor
    Homework Helper

    It won't be a simple resistor. It will be a microprocessor-based charge controller.

    Plenty of data sheets are on the web. For example:

    http://ww1.microchip.com/downloads/en/DeviceDoc/25049B.pdf
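    As I read that datasheet, the chip sets its fast-charge current with a single resistor on the PROG pin, roughly I_REG = 1000 V / R_PROG. Treat that formula as an assumption to verify against the datasheet before relying on it.

    Code (Python):

        # Assumed PROG-pin relation from the linked MCP73831 datasheet
        # (verify there before use): I_REG ~ 1000 V / R_PROG.
        def charge_current_A(r_prog_ohms):
            return 1000.0 / r_prog_ohms

        for r in (2000, 5000, 10000):
            print(f"R_PROG = {r} ohms -> I_REG ~ {charge_current_A(r) * 1000:.0f} mA")
        # 2000 -> 500 mA, 5000 -> 200 mA, 10000 -> 100 mA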
     
  7. Sep 17, 2012 #6
    Thanks Cwatters, you have been really helpful, so I will shoot a few more questions at you if you don't mind.

    I am still confused about one other major thing. Say I have a load connected to a powered circuit, let's just say a light bulb. This bulb just needs to be supplied the correct voltage, and it will then draw the proper amount of current, right? I mean, aren't loads designed to operate at a specific voltage, and then they draw the proper current after that?

    I say this because if I have a DC adapter at the voltage the device is designed for, it could be rated at any amperage above what the device needs to operate (I could have a 10A power adapter, but my device only needs 300mA, right?). The device will always draw the correct current, right? I think what I am saying is true, but please correct me if I am misinformed. Maybe this is true for an electronic device, but not a light bulb.

    Also, based on this I have a follow up question.

    If I give a light bulb half the voltage it was designed for, then based on V=IR, wouldn't it still light up? It would just draw twice the current, assuming the bulb could handle that much current. Wouldn't the light bulb explode if there was too much current for the element? And what if I give the light bulb double the voltage: will it draw less current and not light up? It seems backwards that a bulb would not light up at double voltage and would blow up at half voltage, but this is what I am pulling out of the equation. I know I am wrong; I'm just trying to understand why. I am assuming all this based on the idea that the resistance of the light bulb will not change.

    Ok, so how does a load know how much current to draw? It should be based solely on the resistance of the device or element or whatever, right? Is this why I can sometimes get away with hooking a 15V adapter to a 12V device? The resistance is still the same, but the current drawn will be slightly less, yet still enough to power the device. Now I am just confusing myself. Someone please help with my thought process here.

    Thanks,
    Nick
     
  8. Sep 17, 2012 #7

    NascentOxygen

    Staff: Mentor

    Yes, yes, and yes (to your first three questions).

    No. It would draw half the current (based on Ohm's law). At such a low current, it probably would barely glow a dim red.

    If it follows Ohm's law, then at 15V the device will draw 25% more current than when powered at 12V. A 12V light bulb would be expected to glow dazzlingly if supplied with 15V, though only for a very short while before failing. Many electronic gadgets behave nothing like Ohm's law, but are designed to be robust and to tolerate a wide range of voltages. (Some will even draw less current if supplied with a higher voltage; they have been designed to draw only the power they need. This is not only because wasted electricity is wasted money, but because inside electronic devices wasted electricity means heat, and excessive heat in a sealed space filled with electronics is not good for reliability or long life.)
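    A short sketch of the idealized arithmetic above, treating the bulb as a fixed (ohmic) resistance; the rated current is an assumption, and real filaments are non-ohmic (their resistance rises sharply as they heat), so these numbers are only the Ohm's-law limit.

    Code (Python):

        # Idealized 12 V bulb treated as a fixed resistor (assumption).
        V_RATED = 12.0          # design voltage (V)
        I_RATED = 1.0           # assumed design current (A)
        R = V_RATED / I_RATED   # 12 ohms, held constant by assumption

        for v in (6.0, 12.0, 15.0):
            i = v / R
            p = v * i
            print(f"{v:4.1f} V -> {i:.2f} A, {p:5.1f} W")
        # 6 V gives HALF the current (0.50 A) and a quarter of the power;
        # 15 V gives 25% more current (1.25 A), as stated above.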
     
    Last edited: Sep 17, 2012
  9. Sep 17, 2012 #8
    Ah crap, I just realized voltage and current are proportional after you said that. For some reason I was thinking inversely proportional (don't ask me why; not enough sleep last night). OK, then all of that makes perfect sense. My whole blurb about the light bulb and voltage being backwards is the exact opposite, so forget I said that.

    So in theory, if I have very little resistance, couldn't I draw a ridiculous amount of current off any voltage source, even a small potential? As you have said, a higher voltage typically drives a higher current due to Ohm's law, assuming the same resistance at each voltage, so a low voltage will usually mean a lower current, but...

    What if I take a small 3.7V battery and attach a load with almost no resistance (maybe a superconductor or something)? What is preventing this battery from releasing as much current as, say, a 240V source with some amount of resistance? Is the resistance in the battery itself, with the current limited by the chemistry of the anode/cathode reaction? I realize I can't get electrocuted by a small 3.7V battery because my skin's resistance is too high; however, shouldn't I be able to drain the battery of all its charge almost instantly if I have an extremely low-resistance conductor to an electrical sink?

    Hopefully this is my last question, lol. Thanks for being patient.
     
  10. Sep 17, 2012 #9

    CWatters

    Science Advisor
    Homework Helper

    Yes. Well at least you could try... See below...

    Correct. Most voltage sources have a small "internal resistance". This resistance limits how much current will flow even if you short-circuit the voltage source. For a 12V car battery the resistance is very, very low; a car battery is able to deliver hundreds of amps. The internal resistance of 12 volts' worth of AA batteries (e.g. zinc-carbon) is quite a bit larger, and this limits how much current can be drawn to a much lower level.

    For info, some power supplies have a current-limiting circuit built in. If the current goes above a set level, they shut down to protect themselves.
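    The same point as a sketch: short-circuit current is capped at roughly EMF divided by internal resistance. The resistance values below are rough illustrative guesses, not datasheet figures.

    Code (Python):

        # Short-circuit current limited only by internal resistance (illustrative).
        def current_A(emf, r_internal, r_load=0.0):
            return emf / (r_internal + r_load)

        print(f"car battery: {current_A(12.0, 0.01):>6.0f} A")     # ~1200 A (assume 10 milliohms)
        print(f"8x AA cells: {current_A(12.0, 8 * 0.3):>6.0f} A")  # ~5 A (assume 0.3 ohm/cell)
        print(f"3.7 V cell:  {current_A(3.7, 0.1):>6.0f} A")       # ~37 A (assume 0.1 ohm)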
     
  11. Sep 17, 2012 #10

    CWatters

    Science Advisor
    Homework Helper

    Last edited by a moderator: Sep 25, 2014
  12. Sep 17, 2012 #11
    Hah, wow, that is intense. Internal resistance in a battery makes perfect sense. Now I know not to mess with car batteries, lol. But someone did tell me I can grab the positive and negative terminals of a car battery and not get shocked, because of my skin's resistance, even though the battery can deliver very high currents to spark plugs and whatnot.

    Thanks for your help man, you have been awesome answering these questions. Now I am on the right track. I should have just paid attention in the electrical class I had to take in school.

    Cheers.
    Nick
     
  13. Sep 19, 2012 #12

    CWatters

    Science Advisor
    Homework Helper

    As I understand it, a current of about 50mA (0.050A) flowing through you can be enough to kill you. However, the resistance of the human body varies a lot. Google suggests a voltage above about 40V (AC) might be enough under the right (rare) circumstances.
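    A rough Ohm's-law check of those figures, using illustrative body resistances (real values vary enormously with skin condition and contact area):

    Code (Python):

        # Voltage needed to drive ~50 mA through assumed body resistances.
        I_DANGEROUS = 0.050   # A; the ~50 mA figure quoted above

        for label, r_body in (("dry skin", 100_000), ("wet skin", 1_000)):
            v_needed = I_DANGEROUS * r_body
            print(f"{label}: ~{v_needed:,.0f} V to drive 50 mA")
        # dry skin: ~5,000 V; wet skin: ~50 V, which is why ~40 V can be
        # dangerous only under the right (wet or broken-skin) conditions.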
     
  14. Sep 19, 2012 #13
    I have heard that too. It's really weird that a very low current can stop the heart, because most of our products have more current flowing through them than would kill us, if only that current could get through our skin's resistance. I wonder about applying a voltage to an open wound?

    Also, I assume that when your skin is wet a 12V battery might give you a shock (since I can feel a 9V battery when I touch it to my tongue), but I have heard something similar, that around 40V would be needed to kill you in a rare case. I have been shocked by 120V a couple of times, and it did not feel very good, but I am still here.
     
  15. Sep 25, 2012 #14
    If I have a power supply that I can adjust to 4.2 volts, can I charge a lithium battery out of the device by attaching the actual battery leads to my power supply? I just have to be careful I don't bring the battery over 4.2 volts, right?

    Also, does how fast I charge it matter? My DC supply is rated at 5 amps, but that doesn't mean the battery will draw only the amps it needs; I'm guessing I will need to put a resistor in series with my charging circuit to bring it down to a safe current. Otherwise it might try to pump 5 amps in if my circuit is low resistance. Any idea what this safe current might be? I read somewhere it should be the mAh rating of the battery divided by 10. The iPhone battery is 1900 mAh, so I should charge it at 190 mA. But I'm pretty sure my iPhone charger says 1.1 amps or something, so that is weird; maybe it's total BS, I don't know.
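    For what it's worth, the C-rate arithmetic in that last paragraph works out like this (the capacity and charger current are taken from the post; the actual safe charge rate is whatever the cell's datasheet specifies):

    Code (Python):

        # C-rate arithmetic for the 1900 mAh example above.
        capacity_mAh = 1900

        print(f"0.1C (the C/10 rule): {capacity_mAh / 10:.0f} mA")  # 190 mA
        print(f"1C:                   {capacity_mAh:.0f} mA")       # 1900 mA

        charger_mA = 1100                    # the ~1.1 A charger mentioned
        print(f"1.1 A charger = {charger_mA / capacity_mAh:.2f}C")  # ~0.58C
        # C/10 is a very conservative rate; many modern lithium cells are
        # specified for charging at 0.5C to 1C, which is consistent with
        # the ~1.1 A charger. Always check the specific cell's datasheet.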

    Thanks,
    Nick
     
    Last edited by a moderator: Sep 25, 2014