
Limiting amps and volts from a battery

  1. Mar 28, 2017 #1
    I have a project that involves a small motor, and the battery powering it would blow the motor in an instant without the proper use of resistors. The battery I'm using is rated at 9.6 volts and 1600 mA. The motor, however, is rated at 3 volts and 0.3 A. I'd like to stay just under those ratings to be safe, so I want to provide 2.5 volts and 0.25 A.

    I've found a formula based on a voltage divider circuit: Resistor 2 = [(Voltage out)(Resistor 1)] / (Voltage in - Voltage out). Using this formula (assuming my first resistor is 10k ohms) I got [(2.5)(10,000)]/(9.6 - 2.5) ≈ 3,500 ohms. That should provide the right amount of resistance to bring the voltage down to 2.5 volts.

    For the current, I used Resistance = Voltage/Current and found 9.6/1600 = 10. If the math is correct and I used the right formula, this should be 10 ohms, but that seems abysmal and I'm pretty sure it's not right.

    If anyone knows how to build this circuit more efficiently, or can double-check that I'm using the right formulas, it would be greatly appreciated. By the way, I know the easiest answer is to just use a smaller battery, but this is part of a larger circuit, so I'm trying to use the battery that's already in there. Thanks in advance.
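    Writing that divider formula out (the 10 kΩ is just the value I assumed for the first resistor):

    $$V_\text{out} = V_\text{in}\,\frac{R_2}{R_1 + R_2} \;\Rightarrow\; R_2 = \frac{V_\text{out}\,R_1}{V_\text{in} - V_\text{out}} = \frac{(2.5)(10{,}000)}{9.6 - 2.5} \approx 3{,}500\ \Omega$$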
     
  3. Mar 28, 2017 #2

    davenn

    Science Advisor
    Gold Member

    This idea would sort of work if the load (the motor) were purely resistive, but it isn't.
    It's an inductive load, and as such it requires a different approach.

    Getting the 3 V for the motor is the easy part ... a voltage regulator, e.g. an LM317 (adjustable reg), set for ~3 to 3.5 V.
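    For reference, the LM317's output is set by two feedback resistors (the values below are just one example pair, check the datasheet for your part):

    $$V_\text{out} \approx 1.25\,\text{V}\left(1 + \frac{R_2}{R_1}\right),\qquad \text{e.g. } R_1 = 240\ \Omega,\ R_2 = 390\ \Omega \;\Rightarrow\; V_\text{out} \approx 3.3\,\text{V}$$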

    The second bit is the current ... at normal running speed the motor may well draw only 300 mA (0.3 A), but at startup that could easily spike to more than 1 A, and under load it would still require more than 300 mA.

    I have seen people run into problems using linear voltage regulators (LM317 etc.) as motor drivers because they cannot supply that initial spike in required current. One way around that, which does work, is to put a reasonably sized electrolytic capacitor on the output of the regulator to supply the current spike.
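    As a very rough sizing check (the spike duration and allowed droop below are purely illustrative numbers), the cap has to cover the extra current for the length of the spike without letting the rail sag too far:

    $$C \approx \frac{I_\text{spike}\,\Delta t}{\Delta V},\qquad \text{e.g. } 1\,\text{A for } 2\,\text{ms with } 0.5\,\text{V of droop} \;\Rightarrow\; C \approx \frac{(1)(0.002)}{0.5} = 4{,}000\ \mu\text{F}$$

    so a few-thousand-µF electrolytic is the sort of size that helps.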
    The other way is to use a switching regulator, e.g. ......

    http://www.ebay.com.au/itm/LM2596-V...ter-Adjustable-Step-Down-Module-/161934554531



    cheers
    Dave
     
  4. Mar 28, 2017 #3

    lewando

    Gold Member

    What you really want is a DC/DC converter, if efficiency is of any concern.
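    Rough numbers on why efficiency matters here (using the motor's 0.3 A rating and a nominal 3.3 V output): a linear regulator burns the entire input-output difference as heat,

    $$P_\text{linear loss} \approx (V_\text{in} - V_\text{out})\,I = (9.6 - 3.3)\,\text{V} \times 0.3\,\text{A} \approx 1.9\,\text{W}$$

    whereas a buck (DC/DC) converter loses only a small fraction of the delivered power.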

    Your divider of 10 kΩ -- 3.5 kΩ will certainly give you the voltage that you want, but when you attach it to your motor it will not regulate the voltage well at all. For steady state, consider the effect of the motor as a resistor attached in parallel with the 3.5 kΩ. Also consider the inrush current at startup.
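    To put a rough number on that loading (treating the motor as roughly its rated V/I, which is only a crude steady-state stand-in):

    $$R_\text{motor} \approx \frac{3\,\text{V}}{0.3\,\text{A}} = 10\ \Omega,\qquad V_\text{out} \approx 9.6\,\text{V}\times\frac{10\ \Omega \parallel 3.5\,\text{k}\Omega}{10\,\text{k}\Omega + \left(10\ \Omega \parallel 3.5\,\text{k}\Omega\right)} \approx 0.01\,\text{V}$$

    -- nowhere near the 2.5 V the unloaded divider promises.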

    9.6 V / 1.6 A is actually 6 Ω.

    Whether a value is abysmal depends on the context. 100 MΩ is abysmal if it is a load. 0 Ω is a short circuit (not negligible at all -- it can start fires).

    Not really sure what you are trying to do with that result anyhow.

    [edit]: davenn did a better job of addressing startup/inrush effects.
     
  5. Mar 28, 2017 #4
    So, to get this straight: I can either build a circuit with the voltage regulator you mentioned plus an appropriate capacitor as a stabilizer, or I can just use the regulator module you linked, since it already has all of those components in it? If I were to use that module, it says it outputs anywhere from 1.25 to 35 volts, with an output current of 2-3 amps. To my understanding, volts are "pushed" through a circuit, while amps are drawn by the loads in the circuit. How would I control how many volts come out of the regulator, and would the motor draw a large amount of current initially and then drop to around its 0.3 A rating?
     
  6. Mar 28, 2017 #5

    davenn

    Science Advisor
    Gold Member

    Yes -- the LM317-plus-capacitor option is the one you would have to build up from scratch.

    The module I linked is the much easier way to go.

    As long as you set the regulator to output 3 - 3.5 V, the current drawn will be whatever the motor requires ... you don't need to do any current limiting.


    Dave
     
  7. Mar 28, 2017 #6

    lewando

    Gold Member

    The regulator in the link looks like it has a big blue potentiometer for setting the output to a specific value.
     
  8. Mar 28, 2017 #7
    Thank you very much for your help. I would never have known to use components other than resistors. Thanks again!
     
  9. Mar 28, 2017 #8

    davenn

    Science Advisor
    Gold Member

    no problems :smile:

    report back when you have it all running
    or if you have any other questions
     