Confused about Resistance and Charging

AI Thread Summary
The discussion centers on the confusion surrounding the resistance and charging characteristics of different electrical appliances, particularly phone batteries. It highlights that charging a battery is distinct from running current through a resistor, as the charger regulates the current based on the battery's characteristics. Different battery types, such as NiCd, NiMH, and Li-Ion, have varying tolerances for charging currents and voltages, which affects how they interact with chargers. The conversation also emphasizes that mobile device chargers communicate with the device to adjust the charging current based on the charger's capabilities, leading to potential risks if mismatched. Understanding these dynamics is crucial for safe and effective battery charging.
Thattechyguy
I'm confused about the resistances of common electrical appliances. V = IR

So I was reading something about how fast iPhones charge with different chargers and it got me thinking about the resistance of things like phone batteries.
It said the phone could charge at 5V and 2 amps, but also at 5V and 1 amp.
So does the resistance change from charger to charger?
Resistance = Voltage/Current, right? So is the resistance 2.5 ohms for one and 5 ohms for the other?
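As a quick sanity check of the arithmetic in the question, here is the V/I calculation for both charger ratings (this just restates Ohm's law at each operating point; as the replies below explain, a battery on charge is not actually a fixed resistor):

```python
def effective_resistance(volts, amps):
    """R = V / I: the apparent load resistance at one operating point."""
    return volts / amps

# The two charger ratings from the question:
print(effective_resistance(5, 2))  # 2.5 ohms at 5 V / 2 A
print(effective_resistance(5, 1))  # 5.0 ohms at 5 V / 1 A
```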

If that's the case, what would happen if I tried to charge, say, an iPhone battery with something that charges a 2.5 ohm load at 1W? Would it depend on the charger connecting the two, or...
I'm confused ._.
 
Charging a battery is way different from running a current through a resistor. You can charge a battery with a nominal voltage of 5V using different approaches, but you must be aware of the characteristics of the battery. Some types can handle a lot of current, some cannot. Some types (NiCd or NiMH) degrade if they are charged when not completely empty, others (Li-Ion) cannot tolerate a full discharge.

The amount of charging current a battery can handle is also different between types, but anyhow - the charger regulates the amount of charging current by regulating the charging voltage.
 
Svein said:
Charging a battery is way different from running a current through a resistor. You can charge a battery with a nominal voltage of 5V using different approaches, but you must be aware of the characteristics of the battery. Some types can handle a lot of current, some cannot. Some types (NiCd or NiMH) degrade if they are charged when not completely empty, others (Li-Ion) cannot tolerate a full discharge.

The amount of charging current a battery can handle is also different between types, but anyhow - the charger regulates the amount of charging current by regulating the charging voltage.
Thanks, I understand it a lot better now.
I'm still confused about how the two connect though...
For example, could you explain how something that charges a 2.5 ohm load @ 1W would charge a Li-ion battery (e.g. a phone)? Would it still be 1W given how the iPhone charger reacted, or is there some other characteristic or resistance value involved (I couldn't find it when I Googled it)?
 
An alternative view on chargers.

Firstly, you are right to see that the charging current may be determined by the charger rather than the battery itself.

Batteries like NiCd and NiMH are often charged by current controlled chargers - i.e. the charger applies a limited current to the battery rather than a limited voltage.

Lead acid batteries such as car batteries are often charged by voltage limited chargers. They have a low internal resistance, of the order of an ohm, and can supply a large current, several amps, to a flat battery, but the current falls to a trickle as the voltage of the battery approaches that of the charger.
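The lead-acid behaviour described above can be sketched with I = (V_charger - V_battery) / R. The numbers below are invented for illustration (they are not from the post or any datasheet), but they show the current falling to a trickle as the battery voltage approaches the charger voltage:

```python
def charge_current(v_charger, v_battery, r_total):
    """Current from a voltage-limited charger into a battery,
    through the total series resistance (internal + wiring)."""
    return (v_charger - v_battery) / r_total

# Illustrative numbers: a flat 12 V battery (~11.8 V) on a 14.4 V
# charger with ~0.5 ohm total series resistance:
print(charge_current(14.4, 11.8, 0.5))  # several amps into a flat battery
# Nearly full (~14.2 V): the current has fallen to a trickle:
print(charge_current(14.4, 14.2, 0.5))  # a fraction of an amp
```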

You seem to understand voltage limited chargers, so how do we make a current limited charger?
One simple method is to use a high voltage supply and include a resistance in series. For example, to charge a NiCd cell with a nominal voltage of 1.2V, which can vary from 0V to about 1.5V, you could use, say, a 50V supply. If the required charging current were say 0.25A, you could use a 200Ω resistance, so the current would vary from 50/200 = 0.25A when the battery was dead flat, to (50-1.5)/200 ≈ 0.24A when fully charged. The resistance of the NiCd, which is very low (normally much less than 1Ω), will have negligible effect on the current. The cell voltage has only a small effect, as shown in the example.
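The series-resistor example above works out like this (same numbers as the post: 50V supply, 200Ω resistor, cell voltage from 0V to 1.5V):

```python
def series_limited_current(v_supply, v_cell, r_series):
    """Charging current through a series resistor: (V_supply - V_cell) / R.
    The cell's own resistance (well under 1 ohm) is neglected."""
    return (v_supply - v_cell) / r_series

print(round(series_limited_current(50, 0.0, 200), 4))  # 0.25 A, dead flat
print(round(series_limited_current(50, 1.5, 200), 4))  # 0.2425 A, fully charged
```

The point of the huge supply voltage is that the cell's 0-1.5V swing barely changes the current: it varies by only about 3% across the whole charge.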
Of course a simple circuit like that would be inefficient and there are circuits which can control the current in other ways, such as rapid on/off switching.

But the big problem with NiX cells is that they suffer if they are overcharged at too high a current. A charger which takes about 15 hours or more to fully charge a battery can normally be left on indefinitely. If the charging current is greater, then the battery will charge more quickly, but that current will damage the battery if connected after it is fully charged. So fast chargers need a timer or to be monitored. There is also a problem of damage due to overheating the cell if the current is too high.
A good NiX charger is intelligent, times the charge, monitors the temperature and applies a current according to the stage of charging - high to start with and dropping to a low level when fully charged or if the temperature is too high.
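The staged behaviour of an intelligent NiX charger, as described above, can be sketched as a simple decision rule. All the thresholds and currents below are invented for illustration; real chargers detect end-of-charge from measured cell behaviour (e.g. temperature rise or a small voltage drop), not fixed numbers like these:

```python
def nix_charge_current(hours_elapsed, temp_c, fully_charged):
    """Toy staged-current rule: fast charge, dropping to trickle
    when full, when too hot, or when a safety timer expires."""
    FAST_A, TRICKLE_A = 1.0, 0.05   # invented example currents
    MAX_TEMP_C = 45.0               # invented over-temperature cutoff
    TIMER_HOURS = 2.0               # invented safety timer
    if fully_charged or temp_c > MAX_TEMP_C or hours_elapsed > TIMER_HOURS:
        return TRICKLE_A
    return FAST_A

print(nix_charge_current(0.5, 25, False))  # fast stage
print(nix_charge_current(0.5, 50, False))  # over-temperature: trickle
print(nix_charge_current(3.0, 25, False))  # timer expired: trickle
```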

Lithium rechargeable batteries often have circuitry embedded which controls the charging current, because the dangers of overcharging are more severe.

Edit after new post: I don't know any details of lithium chargers, but would assume that they may be current limited. So one charger may supply a maximum of 1A and another a max of 2A. I think the actual current at any time would be controlled by the embedded circuit in the battery, but it would obviously be limited by what the charger can supply.
 
There seems to be a forum-wide misunderstanding of how mobile device wallwarts and internal chargers work together. I've posted this stuff a million times. I need to make a FAQ.

This explanation has nothing to do with how to correctly charge a battery.

1. A mobile device wallwart (I am going to call it that to differentiate between it and the circuits in the device itself) is simply a power supply that powers the charger in the phone. The wallwart is TOTALLY isolated from anything to do with the batteries and how they are charged (other than the power demand from the device). Let's assume we are talking wallwarts with a USB like connector (+5, GND, DP, DN)

2. Mobile device charger circuits (in the device) have many modes and controls that may be affected by what wallwart is connected to them. They may decide to draw 2A from a wallwart that signals it can supply 2A, and only 100 mA from a USB port that won't give them 500 mA. Or anything in between.

A classic example is the iPad charger. You can't plug an iPad into just any old USB compatible charger and expect it to charge. The iPad looks for specific voltages on the DP and DN pins that signal the capacity of the charger. Other phones have their own proprietary protocols, and the USB specification has its own protocol (based on subthreshold signalling on DP and DN).

Many phones will charge at whatever rate they like if they see DP and DN are shorted. This can cause low capacity wallwarts to burn out when connected to high demand devices.

Here is an old example of what the wallwarts present to the mobile device
http://datasheets.maximintegrated.com/en/ds/MAX14578AE-MAX14578E.pdf (this is an IC inside the mobile device that looks at charger signals and tells the device CPU what it sees so it can configure the charger properly)
Here is a table of some of the "protocols" (I'm using the word protocol pretty loosely, but in some cases there actually is a pretty complex protocol)
[Attached image: table of DP/DN signalling "protocols" for various wallwart types]


Notice that the APPLE CHARGER has different resistor values to indicate 1A and 0.5A. Notice the DEDICATED CHARGER has pins DP and DN shorted.
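The device-side detection described above (what a chip like the MAX14578E does with the DP/DN pins) might be sketched roughly like this. The categories follow the post; the exact voltage windows and the 100 mA default are my own simplified assumptions, not the real specification:

```python
def classify_wallwart(dp_dn_shorted, dp_volts=None, dn_volts=None):
    """Loose sketch of wallwart classification from the DP/DN pins.
    Voltage windows here are illustrative assumptions only."""
    if dp_dn_shorted:
        return "dedicated charger"  # DP and DN shorted together
    if dp_volts is not None and dn_volts is not None:
        # Apple-style chargers put fixed divider voltages on DP/DN;
        # different resistor values indicate different current limits.
        if abs(dp_volts - 2.7) < 0.2 and abs(dn_volts - 2.0) < 0.2:
            return "1 A capable"
        if abs(dp_volts - 2.0) < 0.2 and abs(dn_volts - 2.7) < 0.2:
            return "0.5 A capable"
    return "standard USB port (assume 100 mA until enumerated)"

print(classify_wallwart(True))               # dedicated charger
print(classify_wallwart(False, 2.7, 2.0))    # 1 A capable
```

The device's charger circuit would then cap its current draw at whatever this classification allows, which is the "communication" the summary at the end refers to.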

If you really want your head to spin, read the USB specification for battery powered devices.
https://en.wikipedia.org/wiki/USB#Charging_ports summarizes it nicely.

SO, in summary, wallwarts "communicate" with the mobile device (in a sometimes very rudimentary way) to indicate their capability and the device adjusts accordingly. But, there is no real standard and it is possible a device might try to draw more than a wallwart can handle, possibly damaging the wallwart and/or the device. That's why I ALWAYS try to use OEM chargers. Especially on larger devices.
 