Limiting amps and volts from a battery

AI Thread Summary
To safely power a motor rated at 3 volts and 0.3 amps from a 9.6-volt battery, a voltage regulator, such as the LM317, is recommended instead of using resistors. The motor's inductive nature requires a regulator that can handle inrush current, which may exceed 1 amp at startup. A capacitor can be added to stabilize the output and accommodate current spikes. Using a DC/DC converter is also suggested for better efficiency. Setting the regulator to output between 3 and 3.5 volts will allow the motor to draw the necessary current without additional limiting.
JoeSalerno
I have a project that involves a small motor, and the battery powering it would blow the motor in an instant without proper use of resistors. The battery I'm using is rated at 9.6 volts and 1600 mA. The motor, however, is rated at 3 volts and 0.3 amps. I would like to stay just under these ratings to be safe, so I'd like to provide 2.5 volts and 0.25 A. I've found a formula based on a voltage divider circuit: Resistor 2 = [(Voltage out)(Resistor 1)]/(Voltage in - Voltage out). Using this formula (assuming my first resistor was 10k ohms) I got
[(2.5)(10,000)]/(9.6 - 2.5) ≈ 3,500 ohms. This should provide the correct amount of resistance to get the voltage down to 2.5 volts. For calculating resistance for amperage, I used Resistance = Voltage/Current, and found 9.6/1600 = 10. If the math was correct, and I used the right formula, this should be 10 ohms, but that is abysmal and I'm pretty sure that's not right. If anyone knows how to build this circuit more efficiently, or can double-check that I'm using the right formulas, that would be greatly appreciated. By the way, I know the easiest answer is to just use a smaller battery, but this is part of a larger circuit, so I'm just trying to use the battery that's already in there. Thanks in advance.
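Both calculations above can be checked quickly in Python (just a sketch using the numbers quoted in the post; the second calculation is redone with consistent units, since 1600 mA has to become 1.6 A before dividing):

```python
# Sanity check of the two calculations above. Values are taken from
# the post, not from any datasheet.
V_in = 9.6     # battery voltage (V)
V_out = 2.5    # target motor voltage (V)
R1 = 10_000    # chosen top resistor (ohms)

# Voltage-divider formula: R2 = V_out * R1 / (V_in - V_out)
R2 = V_out * R1 / (V_in - V_out)
print(f"R2 = {R2:.0f} ohms")   # 3521 ohms, i.e. roughly 3.5 k

# Ohm's law with consistent units: 1600 mA = 1.6 A
R = V_in / 1.6
print(f"R = {R:.1f} ohms")     # 6.0 ohms, not 10
```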
 
JoeSalerno said:
I would like to stay just under these ratings to be safe, so I'd like to provide 2.5 volts and 0.25 A. I've found a formula based on a voltage divider circuit: Resistor 2 = [(Voltage out)(Resistor 1)]/(Voltage in - Voltage out). Using this formula (assuming my first resistor was 10k ohms) I got
[(2.5)(10,000)]/(9.6 - 2.5) ≈ 3,500 ohms. This should provide the correct amount of resistance to get the voltage down to 2.5 volts. For calculating resistance for amperage, I used Resistance = Voltage/Current, and found 9.6/1600 = 10.

This idea would be sort of OK if the load (the motor) were purely resistive, but it isn't.
It's an inductive load, and as such it requires a different approach.

Getting the 3 V for the motor is the easy part ... a voltage regulator, e.g. an LM317 (adjustable regulator) set for ~3 to 3.5 V.

The second bit, the current ... at normal running speed it may well draw only 300 mA (0.3 A), but at startup that could easily spike to more than 1 amp, and under load it would still require more than 300 mA.

I have seen people have issues using linear voltage regulators (LM317 etc.) as motor drivers because they cannot supply that initial spike in required current. One way around that that does work is to put a reasonably sized electrolytic capacitor on the output of the regulator to supply the current spike.
The other way is to use a switching regulator, e.g. ...

http://www.ebay.com.au/itm/LM2596-Voltage-Regulator-DC-DC-Buck-Converter-Adjustable-Step-Down-Module-/161934554531
cheers
Dave
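The electrolytic-capacitor workaround above can be roughed out with the usual C = I·Δt/ΔV estimate. Every number in this sketch (spike size, spike duration, acceptable droop) is an assumption for illustration, not a measurement of this motor:

```python
# Rough, illustrative sizing of the regulator output capacitor.
# All values below are assumptions, not datasheet or measured figures.
I_spike = 1.0    # assumed extra startup current the regulator can't supply (A)
dt = 0.01        # assumed duration of the startup spike (s)
dV = 0.5         # acceptable voltage droop at the motor (V)

# From Q = C * dV and Q = I * dt:
C = I_spike * dt / dV
print(f"C = {C * 1e6:.0f} uF")  # 20000 uF -- large electrolytics are normal here
```

Halving the assumed spike duration or doubling the acceptable droop halves the capacitor, so the answer is only ever order-of-magnitude.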
 
What you really want is a DC/DC converter, if efficiency is of any concern.

Your divider of 10 kΩ / 3.5 kΩ will certainly give you the voltage you want unloaded, but once you attach your motor it will not regulate the voltage well at all. For steady state, consider the effect of the motor as a resistor attached in parallel with the 3.5 kΩ. Also consider the inrush current at startup.
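That parallel-resistor effect is easy to see numerically. Treating the motor as a fixed ~10 Ω resistor (3 V / 0.3 A) is only a crude steady-state assumption, since a real motor is inductive and its effective resistance varies with load:

```python
# Sketch of why the unloaded divider fails once the motor is attached.
# The motor is crudely modeled as a resistor: 3 V / 0.3 A = 10 ohms.
R1 = 10_000          # top divider resistor (ohms)
R2 = 3_500           # bottom divider resistor (ohms)
R_motor = 3.0 / 0.3  # ~10 ohms equivalent at rated operation
V_in = 9.6

def parallel(a, b):
    return a * b / (a + b)

# Unloaded divider output
v_unloaded = V_in * R2 / (R1 + R2)
# Loaded: the motor sits in parallel with R2
r_bottom = parallel(R2, R_motor)
v_loaded = V_in * r_bottom / (R1 + r_bottom)

print(f"unloaded: {v_unloaded:.2f} V")  # ~2.49 V
print(f"loaded:   {v_loaded:.3f} V")    # ~0.010 V -- the motor never starts
```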

JoeSalerno said:
For calculating resistance for amperage, I used Resistance = Voltage/Current, and found 9.6/1600 = 10. If the math was correct, and I used the right formula, this should be 10 ohms, but that is abysmal and I'm pretty sure that's not right.

9.6 V / 1.6 A is actually 6 Ω.

It is only abysmal in the proper context. 100 MΩ is abysmal if it is a load. 0 Ω is a short circuit (not negligible at all; it can start fires).

Not really sure what you are trying to do with that result anyhow.

[edit]: davenn did a better job of addressing startup/inrush effects.
 
davenn said:
This idea would be sort of OK if the load (the motor) were purely resistive, but it isn't.
It's an inductive load, and as such it requires a different approach.

Getting the 3 V for the motor is the easy part ... a voltage regulator, e.g. an LM317 (adjustable regulator) set for ~3 to 3.5 V.

The second bit, the current ... at normal running speed it may well draw only 300 mA (0.3 A), but at startup that could easily spike to more than 1 amp, and under load it would still require more than 300 mA.

I have seen people have issues using linear voltage regulators (LM317 etc.) as motor drivers because they cannot supply that initial spike in required current. One way around that that does work is to put a reasonably sized electrolytic capacitor on the output of the regulator to supply the current spike.
The other way is to use a switching regulator, e.g. ...

http://www.ebay.com.au/itm/LM2596-Voltage-Regulator-DC-DC-Buck-Converter-Adjustable-Step-Down-Module-/161934554531
cheers
Dave
So, to get this straight: I can either build a circuit that contains the voltage regulator you described, plus an appropriate capacitor as a stabilizer, or I can just use the regulator module you linked, because it has all of those components already on it? If I were to use that module, it says it outputs anywhere from 1.25 to 35 volts, and its current output is 2-3 amps. To my understanding, volts are "pushed" through a circuit, while amps are drawn by the loads in the circuit. How would I control how many volts come out of the regulator, and would the motor draw a large amount of current initially and then drop to around its 0.3 A rating?
 
JoeSalerno said:
So, to get this straight: I can either build a circuit that contains the voltage regulator you described, plus an appropriate capacitor as a stabilizer

Yes, you would have to build that one up from scratch.

JoeSalerno said:
Or I can just use the regulator module you linked, because it has all of those components already on it? If I were to use that module, it says it outputs anywhere from 1.25 to 35 volts, and its current output is 2-3 amps.

It is the much easier way to go.

JoeSalerno said:
To my understanding, volts are "pushed" through a circuit, while amps are drawn by the loads in the circuit. How would I control how many volts come out of the regulator, and would the motor draw a large amount of current initially and then drop to around its 0.3 A rating?

As long as you set the regulator to output 3 - 3.5 V, the current drawn will be whatever is required ... you don't need to do any current limiting.

Dave
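For the from-scratch LM317 option mentioned earlier, the output voltage is set by two resistors via the datasheet equation V_out = 1.25 × (1 + R2/R1), ignoring the small adjust-pin current. The specific resistor values below are just one example that lands inside the 3-3.5 V window:

```python
# LM317 output voltage: V_out = V_ref * (1 + R2/R1), with V_ref = 1.25 V
# (the I_adj term is small and ignored here). R1 = 240 ohms is the
# conventional choice; R2 = 390 ohms is a standard value picked as an
# example to hit the 3-3.5 V target.
V_ref = 1.25
R1 = 240
R2 = 390

V_out = V_ref * (1 + R2 / R1)
print(f"V_out = {V_out:.2f} V")  # 3.28 V
```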
 
JoeSalerno said:
How would I control how many volts come out of voltage regulator
The regulator in the link looks like it has a big blue potentiometer for setting the output to a specific value.
 
davenn said:
Yes, you would have to build that one up from scratch.
It is the much easier way to go.
As long as you set the regulator to output 3 - 3.5 V, the current drawn will be whatever is required ... you don't need to do any current limiting.

Dave
Thank you very much for your help. I would never have known to use components other than resistors. Thanks again!
 
no problems :smile:

report back when you have it all running
or if you have any other questions
 