# Help me understand how a battery charger works

## Main Question or Discussion Point

I'm thinking about all types of charging (Li-ion, lead acid, etc.). I get the basics, but would really enjoy more info on the subject.

Also, one thing that confuses me is how you can "overcharge" a battery, and how you can supply too much current (amps) to it. Since it's generally said that current is DRAWN based on need, why doesn't the battery itself dictate the current requirement? My assumption has always been that the charger both supplies and draws current at the same time. Still, more info would be helpful.

I bought a charger for my phone. It supplies 4.75 V at a max of 2 A. I always wonder whether that means it IS pushing 9.5 watts into the battery, or whether the battery only draws that much IF it calls for it. I ask because the "stock" charger for this phone is 5 V and 1.0 A, thus 5 watts. Does this mean I'm actually pumping 9.5 watts in, or does it mean that the battery COULD draw that much but probably isn't? Can anyone help me get the process of charging a battery clear in my head?

Thanks!!


The battery simply obeys Ohm's law: the charging current is the difference between the charger voltage and the battery voltage, divided by the battery's (and charger's) internal resistance. As a battery charges, its voltage rises and its internal resistance drops, changing the amount of charging current. There are many ways of charging a battery: there are constant-voltage chargers and also constant-current chargers, among others.
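The Ohm's-law relationship above can be sketched in a few lines. All of the voltages and resistances below are made-up illustrative values, not specs for any real battery:

```python
def charging_current(v_charger, v_battery, r_internal):
    """Ohm's law for the charging loop: current is the voltage
    difference across the total series resistance.
    All parameter values used here are hypothetical."""
    return (v_charger - v_battery) / r_internal

# Early in the charge the battery voltage is low, so the voltage
# difference (and hence the current demand) is large...
i_early = charging_current(v_charger=5.0, v_battery=3.7, r_internal=0.1)

# ...and as the battery voltage rises toward the charger voltage,
# the current tapers off on its own.
i_late = charging_current(v_charger=5.0, v_battery=4.9, r_internal=0.1)

print(f"early: {i_early:.1f} A, near full: {i_late:.1f} A")
```

Note that the "early" demand can far exceed what the charger can safely supply, which is exactly why real chargers actively limit their output current rather than letting Ohm's law run unchecked.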

The charger at 4.75 V and 2 A may charge your battery faster, at least until it gets close to full charge, depending on how much charging current the battery can accept. Limiting the voltage to 4.75 V may also help prevent overcharging. No, those ratings do not mean you are pumping that much power into your battery.
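To make the "rating is a ceiling, not a constant push" point concrete, here is a minimal sketch. The battery voltage and current values are invented for illustration:

```python
def power_into_battery(v_battery, i_demanded, i_charger_max):
    """Actual power delivered is the battery's terminal voltage times
    whatever current actually flows: the smaller of what the battery
    draws and what the charger can supply. Values are illustrative."""
    i_actual = min(i_demanded, i_charger_max)
    return v_battery * i_actual

# A nearly full battery drawing only 0.4 A from the 2 A charger:
p = power_into_battery(v_battery=4.1, i_demanded=0.4, i_charger_max=2.0)
print(f"{p:.2f} W")  # well below the 9.5 W you'd get from rating x rating
```

The "9.5 watts" figure (4.75 V x 2 A) is only reached if the battery actually draws the full 2 A at the full output voltage; most of the time it draws less.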

I'm not an expert on batteries and it's been decades since I've done any research on it, but from what I remember, one of the things that happens when some batteries reach full charge is that oxygen bubbles form on the anode, which increases the pressure inside the battery. Since oxygen doesn't conduct, the bubbles reduce the surface area of the anode available for conduction, so the internal resistance goes up; if charging continues at the same current, the temperature of the battery rises substantially.

Both of these conditions are difficult for the charger to measure accurately, so it is hard to determine exactly when the battery is fully charged. One method I've heard about is to put a momentary load across the battery as it charges and measure the battery's voltage drop, which indicates its internal resistance. When the internal resistance rises to the specified value, charging is terminated.
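The momentary-load test can be sketched as follows. This is only one plausible reading of the technique: the resistance cutoff, the load current, and the voltages are all hypothetical numbers, and a real charger IC would do this with careful timing and calibration:

```python
def internal_resistance(v_open, v_loaded, i_load):
    """Estimate internal resistance from the voltage sag when a known
    momentary load is applied: R = (V_open - V_loaded) / I_load."""
    return (v_open - v_loaded) / i_load

def looks_fully_charged(v_open, v_loaded, i_load, r_cutoff):
    # Consistent with the bubble mechanism described above (internal
    # resistance rising near full charge), stop charging once the
    # estimated resistance crosses a battery-specific cutoff.
    return internal_resistance(v_open, v_loaded, i_load) >= r_cutoff

# Battery sags from 4.20 V to 4.00 V under a 1 A momentary load,
# suggesting roughly 0.2 ohm of internal resistance.
r = internal_resistance(v_open=4.2, v_loaded=4.0, i_load=1.0)
print(f"estimated R = {r:.2f} ohm")
```

The appeal of this approach is that the sag measurement needs nothing more than a switchable load and a voltmeter, both of which a charger already has in some form.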

I'm sure this explanation doesn't apply to all batteries, but it helps illustrate some of the problems in building a good battery charger.
