
Ohms and wattage of resistor

  Aug 21, 2009 #1
    I am trying to determine the ohms and wattage of R2 on the attached schematic.

    The resistor fried, so I can't tell anything about it.

    This circuit is a charger for a cordless Black and Decker drill.

    Thanks
     

    Attached Files:

  Aug 21, 2009 #2

    turin

    Homework Helper

    I would guess that it's a 1/2-watt, 10-Ω resistor. That is based on an estimated voltage drop of 2 V and a current of ~200 mA. You can also make a rough estimate of the voltage across R2 this way: I_LED ≈ 20 mA, V_LED ≈ 1 V (if red) ⇒ V_R2 ≈ 1 V + (20 mA)(100 Ω) = 3 V. Those numbers are just rough estimates for the on-state of the LED. Then a 10-Ω resistor will give you a rough current output a bit above 100 mA, which in turn gives a wattage around 1/2 watt. I might go with a 1-watt resistor, though; I don't think it's supposed to double as a fuse.
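    As a quick check, here is that arithmetic as a short Python sketch. Every input is a rough guess from the estimate above (1 V LED drop, 20 mA LED current, 2 V resistor drop, 200 mA charge current), not a measurement:

    ```python
    # Sanity check of the rough estimate above; all inputs are guesses, not measurements.
    V_LED = 1.0                      # assumed red-LED forward drop, volts
    I_LED = 0.020                    # assumed LED on-current, amps
    V_R2 = V_LED + I_LED * 100.0     # ~3 V across R2, via the LED branch

    I_R2 = 0.200                     # assumed charging current, amps
    R2 = 2.0 / I_R2                  # 2 V drop / 200 mA = 10 ohms
    P_R2 = I_R2 ** 2 * R2            # 0.4 W -> a 1/2 W part, or 1 W for margin
    print(V_R2, R2, P_R2)
    ```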
     
  Aug 21, 2009 #3

    berkeman


    Staff: Mentor

    The LED forward voltage drop is usually more like 2 V for a red LED, isn't it?
     
  Aug 21, 2009 #4

    negitron

    Science Advisor

    Yes, around 1.9 V is the typical nominal forward voltage for red LEDs. Also, 20 mA is the typical I_max, so the usual conservative design value is closer to 10 mA.
     
  Aug 22, 2009 #5

    vk6kro

    Science Advisor

    If you were designing this from the start, you would have to accept that someone could attach a completely flat battery to it.

    The power supply can deliver about 14.8 volts at 210 mA, so to limit the current into a flat battery to 210 mA you would have to put 70 ohms in series with it: 14.8 V / 70 Ω = 211 mA.

    Then, if you wanted the LED to be fully lit at this charge rate, it would need a resistor of about 650 ohms in series with it: (14.8 V − 1.8 V) / 0.02 A = 650 Ω. 680 ohms would be OK.

    The 20 mA of LED current no longer has to come through the main series resistor, so its value can now rise to about 77 ohms: 14.8 V / 77 Ω = 192 mA. This resistor would dissipate 2.84 watts, so it should be a 5 watt resistor, and 82 ohms is the nearest preferred value.

    So now it is a safe device, but it will only deliver about 34 mA to a 12 volt battery and 10 mA into a battery that has reached 14 volts. Still, it is a simple charger, and an automatic reduction in charging current is a good outcome.
    The LED would be almost completely dimmed when the battery was fully charged, with about 1 mA flowing in it.

    To summarise, I would put an 82 ohm 5 watt resistor in series with the battery, and a 680 ohm 0.5 watt resistor in series with an LED across the 82 ohm resistor.
    This would give a short circuit current of about 200 mA into a very flat or faulty battery.
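    The same sizing steps written out as a short Python sketch; all the values are the ones quoted in this post:

    ```python
    # Series-resistor sizing from this post, step by step (values from the post).
    V_SUPPLY = 14.8                          # volts the supply can deliver
    I_MAX = 0.210                            # amps, the supply's safe current limit

    R_series = V_SUPPLY / I_MAX              # ~70 ohm limits a dead-flat pack to ~210 mA
    R_led = (V_SUPPLY - 1.8) / 0.020         # 650 ohm; 680 ohm is the nearest preferred value
    R_series2 = V_SUPPLY / (I_MAX - 0.020)   # ~78 ohm once the LED path carries 20 mA
    P_worst = V_SUPPLY ** 2 / 82.0           # ~2.7 W worst case in the chosen 82 ohm part
    print(R_series, R_led, R_series2, P_worst)
    ```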
     
    Last edited: Aug 22, 2009
  Aug 23, 2009 #6
    The LED requires about 20 mA to operate. Therefore, with the 100 ohm resistor in series, the voltage across that branch when the LED is lit (while the battery is charging) is about 3.7 V. This means the voltage across your burned-out resistor, while it is supplying current to the battery, should be at most 3.7 V. The voltage across this resistor will change as the battery goes from discharged to charged, so for a fully discharged battery the maximum voltage across the resistor is 3.7 V. If we want a charging current of approximately 200 mA into a flat battery, then R = V/I = 3.7 V / 0.2 A = 18.5 ohms. An 18 ohm (1 W) resistor is the closest standard value.
    This doesn't protect the system from overcurrent, but the battery will have resistance even when it is flat. Also, T1 may go into saturation when the current demand is too high, limiting the current.
    It is not ideal, but trial and error will also help. Make sure you attach a discharged battery once you have fitted the resistor, to see if it burns out again.
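    That calculation as a short Python sketch; the 1.7 V LED drop implied above and the 200 mA target are this post's assumptions:

    ```python
    # The 18 ohm estimate from this post (worst case assumed: fully flat battery).
    V_BRANCH = 1.7 + 0.020 * 100.0   # ~3.7 V: LED drop plus the 100 ohm series resistor
    I_CHARGE = 0.200                 # target charge current into a flat pack, amps
    R2 = V_BRANCH / I_CHARGE         # 18.5 ohm -> 18 ohm nearest standard value
    P2 = V_BRANCH * I_CHARGE         # ~0.74 W, hence the 1 W suggestion
    print(R2, P2)
    ```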

    http://www.calibrepower.com
     
    Last edited by a moderator: May 4, 2017
  Aug 23, 2009 #7

    vk6kro

    Science Advisor

    "Make sure you attach a discharged battery once you have fitted the resistor to see if it burns out again."

    This drill will have NiCd batteries in it.
    These discharge spontaneously at about 25 % per month. So, in hobby use, they are very likely to run flat before you need to use them.
    So, this charger will have flat batteries to deal with quite often.

    Also, NiCd batteries that have failed often fail in a completely short circuited way. Zero volts and zero ohms.

    NiCd batteries are also very sensitive to overcharging. Give them 100 mA for 2 days and they will probably be destroyed. This charger has no timer or overcharging detector, so it is almost guaranteed to destroy its batteries. Anybody can just put the batteries on charge and forget about them until it is too late.

    A few resistors burning up is trivial because they are cheap, but a 13 volt bank of NiCd batteries in a special holder would probably cost more than the drill was worth.
    I try to avoid blowing up resistors when a bit of simple design can make the device immune to it.

    If it gets a flat battery with 18 ohms in series with it, it will send about 820 mA through the 18 ohm resistor (14.8 / 18 = 822 mA), which will then dissipate about 12 watts (14.8 × 0.822 = 12.16 W).

    This is probably what they had before and probably why it blew up. So, why just do it again?

    You can't depend on the transformer saturating. This transformer is rated at 10 watts.
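    The failure case in numbers, using the same arithmetic as above (assuming a dead-flat, effectively 0 V pack):

    ```python
    # A dead-flat (0 V) pack puts the whole supply voltage across R2.
    V_SUPPLY = 14.8
    R2 = 18.0
    I_fault = V_SUPPLY / R2          # ~0.82 A, four times the intended 200 mA
    P_fault = V_SUPPLY * I_fault     # ~12 W in a resistor rated for 1 W
    print(f"{I_fault * 1000:.0f} mA, {P_fault:.1f} W")
    ```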
     
  Aug 23, 2009 #8

    Redbelly98

    Staff Emeritus
    Science Advisor
    Homework Helper

    So, how many volts (per cell) is reasonably safe for NiCds? I ask because I have a couple of NiCd cordless tools, and it's possible to overcharge both. I have a simple modification in mind, using Zener diodes, to limit the charger voltage. I was thinking 1.4 V per cell, but am interested in what others have to say about that.
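    For concreteness, the clamp-voltage arithmetic behind that idea; only the 1.4 V/cell figure comes from the question, and the pack sizes are illustrative assumptions:

    ```python
    # Illustrative only: Zener clamp voltage for an N-cell NiCd pack.
    V_PER_CELL = 1.4                     # proposed per-cell charging limit, volts
    for n_cells in (10, 12, 15):         # e.g. 12 V, 14.4 V and 18 V nominal packs
        print(n_cells, "cells ->", round(n_cells * V_PER_CELL, 1), "V clamp")
    ```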
     
  Aug 23, 2009 #9

    vk6kro

    Science Advisor

    Greetings Redbelly!

    The accepted charging current for a NiCd cell is about 10% of its amp-hour rating.
    I haven't seen any justification for this, but it seems harmless enough. Generally, this puts you in the 100 to 200 mA range.
    The batteries should NEVER feel hot. If they do, you have probably already done some damage.

    It is most important that you control the current and let the batteries decide their own terminal voltage. You can control the current best if you use a current regulator, but if you have to use a resistor, you should have a supply voltage at least 50% higher than the fully charged voltage of the battery.

    And put a timer on it. If you don't, you will certainly forget and overcharge it.

    Or you could design it to taper off, as in my example earlier in this thread. It becomes a trickle charger as the voltage gets higher.

    The voltage will start from zero and rise to about 1.4 volts per cell. This will settle to about 1.3 volts after a short time off charge.

    I have an almost new cordless drill that I have to run via a short cable from a 12 volt battery because, 5 years ago, its charger managed to cook the batteries in it.

    I use NiMH batteries in digital cameras and I destroyed a few before I realized that the camera would stop working before the battery was fully discharged. So, if I gave the battery a full charge according to its amp-hour rating, I was overcharging it.

    So I have made a new charger using a Picaxe chip, which detects when the battery is fully charged and then removes the supply of current. The trick is to turn off the power every minute, wait one second, then measure the voltage on the battery. If I get 20 successive identical or lower readings, I consider the battery fully charged.

    The Picaxe 14 chip has a very stable 10-bit A-to-D converter, which allows me to get away with doing it this way.

    This doesn't mean the battery is any good, though. Poor batteries reach this state quite quickly, but discharge equally quickly.
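    In rough pseudocode, that full-charge detection looks like the sketch below. The real implementation is Picaxe BASIC; this is a Python sketch of the idea only, and read_voltage, charger_on and charger_off are hypothetical stand-ins for the chip's I/O:

    ```python
    import time

    def charge_until_full(read_voltage, charger_on, charger_off):
        """Stop charging once the battery's rested voltage stops rising.

        read_voltage, charger_on and charger_off are hypothetical callbacks
        standing in for the Picaxe I/O; the sampling scheme is the one
        described in the post above.
        """
        non_rising = 0        # successive identical-or-lower readings
        last_v = 0.0
        while non_rising < 20:
            charger_on()
            time.sleep(60)    # charge for one minute
            charger_off()
            time.sleep(1)     # let the terminal voltage settle before sampling
            v = read_voltage()
            non_rising = non_rising + 1 if v <= last_v else 0
            last_v = v
        # 20 flat-or-falling readings in a row: treat the battery as fully charged
    ```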
     
  Aug 24, 2009 #10
    I had made a few assumptions:
    1. that this was the actual charger supplied by Black and Decker;
    2. that the output voltage was 14.8 V.
    In fact, the output of the transformer is 15.8 V. Multiply this by 1.414 for full-wave rectification: about 22 V DC.
    Therefore the battery it is charging is probably an 18 V nominal battery, which means it has 15 cells. So at full voltage the charger can only charge the batteries at about 1.46 V per cell (22 V / 15 cells). It can never overcharge the batteries unless there is a problem with the transformer.
    Also, the cables and many other parts will have resistance and voltage drop.
    I do agree that a fully discharged battery fried the resistor, but I am sure trial and error will help you out. I still think 18 ohms is a good starting point.
    Let us know how you get on, please.
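    That per-cell arithmetic as a short sketch; the 15-cell count is this post's assumption:

    ```python
    # The per-cell arithmetic from this post.
    V_AC = 15.8                 # transformer secondary, volts RMS
    V_DC = V_AC * 1.414         # ~22 V DC after full-wave rectification (peak value)
    N_CELLS = 15                # an 18 V nominal pack = 15 NiCd cells (assumed)
    print(round(V_DC, 1), "V ->", round(V_DC / N_CELLS, 2), "V per cell")
    ```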

    http://www.calibrepower.com
     
    Last edited by a moderator: May 4, 2017
  May 2, 2011 #11
    Sorry for resurrecting an old thread.

    I have a similar problem to the first poster's.

    The power supply for my charger is broken beyond repair, as is the R2 resistor, so I did some research (which turned out to be wrong) and bought too powerful a power supply: 15 V, 1.5 A. I have repaired the battery pack by replacing the cells, so it is effectively new. Basically, I need to drop the current a lot to fit in the 100-200 mA range. Does anybody have an idea how to do it? I've calculated that I need an 11 ohm, 20 W resistor at R2. Is there any other way?
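    Using the same R = V/I sizing as the earlier posts, here is what a plain series resistor would need to be for a 15 V supply. This is a sketch only: the flat-pack (0 V) worst case and the target currents are assumptions, not measurements of this charger:

    ```python
    # Series-resistor sizing for the 15 V supply; worst case = dead-flat (0 V) pack.
    V_SUPPLY = 15.0
    for target_mA in (100, 150, 200):
        I = target_mA / 1000.0
        R = V_SUPPLY / I               # ohms needed to cap the current at I
        P = V_SUPPLY * I               # watts in the resistor at that worst case
        print(f"{target_mA} mA -> {R:.0f} ohm, {P:.2f} W")
    ```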
     