Charging a 1.2V rechargeable battery

AI Thread Summary
The discussion revolves around the use of a 9V DC output charger for a cordless phone that operates on two 1.2V rechargeable batteries. It is clarified that the charger does not apply 9V directly to the batteries; instead, the internal circuitry steps down the voltage to the required 2.4V for charging. The higher voltage is likely used for cost-effectiveness in mass production, allowing the charger to power various devices efficiently. The device will draw only the necessary power, meaning it will utilize the appropriate voltage for charging despite the higher input. This design approach minimizes waste and enhances compatibility with different devices.
mendes
I have a simple question (I hope):

My cordless phone handset works with two 1.2V rechargeable batteries, but the DC converter that comes with it, which is supposed to recharge the batteries, has a 9V DC output! How come? Is it applying 9V to recharge two 1.2V rechargeable batteries?! Or perhaps the 9V goes only to the charging cradle (where the handset sits to be charged), and the cradle applies only the appropriate voltage to the batteries, around 2.4V. But then why so much voltage (9V) for the charger?!
 
To power the transceiver functions of the base
 
somecreepyold said:
To power the transceiver functions of the base

The base is a separate unit (from the handset charger) and has a separate power supply.
 
Then it's most likely because it's cheaper to produce a bazillion 9-volt chargers than it is to make separate ones for each unit.

The internal circuitry will step down the voltage to what is needed and draw only as much power as needed.

Stepping down a DC voltage is much easier (and less wasteful) than converting AC to DC, hence the preference for mass-producing one adapter that can properly power a wide range of devices.
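To make the "step down and draw only what's needed" idea concrete, here is a rough back-of-the-envelope sketch. It assumes the cradle trickle-charges the two NiMH cells through a simple series resistor from the 9V input; the current, cell voltage, and component values are illustrative guesses, not values from any actual phone.

```python
# Rough sketch: dropping a 9 V adapter output down to trickle-charge
# two 1.2 V NiMH cells in series. All numbers are illustrative guesses,
# not values taken from a real handset.

v_in = 9.0          # adapter output voltage (V)
v_batt = 2 * 1.45   # two NiMH cells sit near 1.4-1.5 V each while charging
i_charge = 0.050    # assumed trickle-charge current, 50 mA

# With the simplest possible charger (just a series resistor), the resistor
# must drop whatever voltage the battery pack does not take:
v_drop = v_in - v_batt                 # ~6.1 V across the resistor
r_series = v_drop / i_charge           # ~122 ohms
p_resistor = v_drop * i_charge         # ~0.3 W wasted as heat in the resistor
p_battery = v_batt * i_charge          # ~0.15 W actually going into the cells

print(f"Series resistor:           {r_series:.0f} ohm")
print(f"Power burned in resistor:  {p_resistor:.2f} W")
print(f"Power delivered to cells:  {p_battery:.2f} W")
```

The point is that the cells never "see" 9V; the charging circuit (a bare resistor here, a regulator or current source in a real phone) absorbs the difference, and the batteries are charged at roughly their own voltage.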
 
You mean the device will draw only 1.2 V even if we supply 9 V to it?
 
Well, a better way to put it would be: it's going to draw as many watts as it needs to charge.
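As a quick illustration of "draws only what it needs" (the adapter rating and charge current below are assumed, not taken from any specific phone):

```python
# The adapter's rating is a ceiling, not a demand. Assume a hypothetical
# 9 V, 300 mA wall adapter charging at the ~50 mA used in the sketch above.

v_adapter = 9.0
i_rated = 0.300     # maximum current the adapter could supply
i_actual = 0.050    # current the charging circuit actually pulls

p_available = v_adapter * i_rated    # 2.7 W the adapter could deliver
p_drawn = v_adapter * i_actual       # ~0.45 W actually drawn while charging

print(f"Adapter capability:         {p_available:.2f} W")
print(f"Actual draw while charging: {p_drawn:.2f} W")
```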
 