
Car battery amps?

  1. May 4, 2007 #1
    Hello. I know car battery amp ratings range from 40 amps on up, but I just can't understand them; the amps aren't printed on the batteries and I don't know why. I have a pretty good idea that my car battery may be 45 amps. What does this mean? I know the alternator produces 55 amps. If I hooked a 100 watt light bulb up to my 45 amp car battery, I know the light bulb uses 100 W / 12 V = 8.3 amps, so how do I figure how long the 45 amp battery will run an 8 amp bulb? :confused:
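    For a rough sketch: if "45 amps" really means a 45 amp-hour capacity, and if the bulb really drew a steady 8.3 amps (both assumptions, as the replies below show), the idealized run time would be

    [tex]t = \frac{Q}{I} = \frac{45\ \mathrm{A\,h}}{8.3\ \mathrm{A}} \approx 5.4\ \mathrm{hours}[/tex]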
     
  3. May 4, 2007 #2
    I was reading something on the web and was wondering if this could be right. Knowing the battery is 45 amps, and I am using around an 8 amp bulb, I would assume 8 amps is how much the bulb uses per hour??? If so, 45 / 8 = around 5.6 hours? Could this be right?
     
  4. May 4, 2007 #3
    I think you can do it this way too: take the 45 amps * 12 V to get 540 watts. We know the bulb is 100 watts, so I guess the bulb uses 100 watts per hour, which means, estimating in my head against the 540 watt battery, it would run a little over 5 hours? Does it sound like I'm on the right track?
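    Spelled out in energy terms (still assuming the bulb would actually dissipate its full rated 100 watts on 12 volts, which later posts question):

    [tex]E = 45\ \mathrm{A\,h} \times 12\ \mathrm{V} = 540\ \mathrm{W\,h}, \qquad t = \frac{540\ \mathrm{W\,h}}{100\ \mathrm{W}} = 5.4\ \mathrm{hours}[/tex]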
     
  5. May 4, 2007 #4

    Danger

    Gold Member

    Welcome to PF, Mountain. There must be a misunderstanding somewhere. If you have a 45 amp battery, you sure as hell aren't driving anywhere. The minimum cold cranking amps for a car battery is probably around 350. My Roadrunner draws 1,000 amps in sub-freezing temperatures.
     
  6. May 4, 2007 #5
    Yes, 45 amps sounds way too low. Automotive batteries supply much more current; the one I keep around the house is 650 CCA (550 otherwise), though it is slightly larger than what you would see in a car.

    Anyway, some batteries will give a rating of the full charge in Amp-hours, usually in the form of "y amps for x minutes" written somewhere on the identification label. The product of the current supplied and the discharge time is relatively constant, so if you know how many amps your lightbulb will pull then you can guess roughly how long it will take to discharge the battery.
     
    Last edited by a moderator: May 5, 2007
  7. May 5, 2007 #6
    Cold cranking amps are different from amp-hours, which is what you want. CCA is how many amps can be delivered short-term during cold weather; amp-hours is how many amps can be delivered continuously for one hour. So if you have 45 amp-hours, then you get 540 watts that can be delivered continuously for an hour. The CCA number will be higher, say 120 amps that can be delivered to the starter instantaneously in winter conditions.
     
  8. May 5, 2007 #7
    My friend on MSN is telling me that a car battery has way more than 45 amps, and I've seen that here too. I appreciate all the help, but would you not agree that a 100 watt household bulb on a 12 volt battery would be using around 8.3 amps? And how long could I expect this bulb to run? 5.4 hours? If so, would that prove the battery was 45 amps? I'm sorry, I am fairly new to this and do understand I am lacking some knowledge.
     
    Last edited: May 5, 2007
  9. May 5, 2007 #8
    You're confusing the relationship between maximum output current (rate of charge flow per unit time) and the amount of charge (Q) the battery holds. You can't get the maximum current (dQ/dt) the battery could ever output from how long it takes a light bulb that consumes so little power to discharge it. The light bulb will draw only so much current, period, and as long as the battery can deliver 100 W, the light bulb will shine.

    What you *could* do is attach five 100 W light bulbs in parallel, confirm that they all are bright, and observe upon the addition of a 6th or 7th that there is some diminishing brightness. That would confirm, more or less, that you have (5 to 6) * 8.3 A ~= 45 A of supply current.
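    That is, treating each bulb as a fixed 8.3 A load (an idealization the next reply disputes):

    [tex]I_{total} = n \times 8.3\ \mathrm{A}, \qquad 5 \times 8.3\ \mathrm{A} \approx 41.5\ \mathrm{A}[/tex]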
     
    Last edited by a moderator: May 5, 2007
  10. May 5, 2007 #9
    That wouldn't work; the battery would supply more current. The way you measure a car battery's output, for rough estimates, is by its voltage level. Hook up the 5 light bulbs and put a voltmeter in parallel, then watch as the voltage drops from 13.5 volts at no load, to 12.5~13 at rated amp-hour load, to 11~11.5. That should tell you more or less how much power the battery can supply over a given time; if you want more amps, increase the number of light bulbs. Car batteries charge to ~13.5 volts and are considered dead at 11.
     
  11. May 5, 2007 #10
    I'm no expert, but I think this is incorrect. The resistance of the bulb remains the same when it's hooked up to the 12 V battery, so the current it draws should be 0.083 amps (i.e. 1/10th of what it draws at 120 V).

    This turns the bulb into a 1 watt (12 V * 0.083 amps) bulb; i.e., its glow will be barely detectable.
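    The numbers behind that, taking the hot resistance implied by the bulb's rating as fixed:

    [tex]R = \frac{V^2}{P} = \frac{(120\ \mathrm{V})^2}{100\ \mathrm{W}} = 144\ \Omega, \qquad I = \frac{12\ \mathrm{V}}{144\ \Omega} \approx 0.083\ \mathrm{A}, \qquad P = 12\ \mathrm{V} \times 0.083\ \mathrm{A} \approx 1\ \mathrm{W}[/tex]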
     
    Last edited: May 5, 2007
  12. May 5, 2007 #11
    A 60 watt 120 V bulb draws 7.2 watts and is 20 ohms cold off of 12 V, so a 100 watt bulb will draw about 10.2 watts cold.
     
  13. May 5, 2007 #12
    MountainDew, the 45 amps you're talking about is an amp-hour rating, meaning that your battery could supply a 45 amp current for one hour before it dies out.
    About the second part of your question: when you take a 120 watt bulb and connect it to a 12 V car battery, you can simply calculate the current drawn by the bulb from the relation P = VI. Since it's a 120 watt bulb and the car battery voltage is 12 V, the current drawn will be 10 amps, and since your battery can supply 45 amps for one hour, it can supply 10 amps for 4.5 hours.
     
    Last edited: May 5, 2007
  14. May 5, 2007 #13

    Averagesupernova

    Science Advisor
    Gold Member

    Wow! Lots of misinformation here. An incandescent light bulb is not a linear device: the more current flows through the bulb, the higher its resistance gets. Also, why do you assume that no matter what voltage source you hook a bulb to, it will automatically dissipate its rated wattage? The 120 volt, 120 watt bulb will dissipate 120 watts AT THE VOLTAGE IT WAS DESIGNED TO OPERATE AT! It will NOT draw 10 amps, ever.
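    To put rough numbers on the nonlinearity: a 120 V, 100 W bulb has a hot resistance of [tex]R = \frac{V^2}{P} = \frac{(120\ \mathrm{V})^2}{100\ \mathrm{W}} = 144\ \Omega[/tex], while its cold filament resistance is only roughly a tenth of that (the bench measurement later in this thread finds 20 ohms cold for a 60 W bulb, versus 240 ohms implied hot). Neither the constant-power nor the constant-resistance assumption is exact.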
     
  15. May 5, 2007 #14
    Averagesupernova, I think most of us assumed ideal conditions; we looked at it as a HW problem rather than the practical case.
     
  16. May 5, 2007 #15

    russ_watters


    Staff: Mentor

    The "ideal" condition wouldn't be holding the power constant, it would be holding the resistance constant.
     
  17. May 6, 2007 #16
    I googled and learned to use the formula watts / volts = amps. How are you getting 10 watts from a 100 watt bulb, if you don't mind? From what I've read, I've learned to do it like this:

    A 100 watt bulb at 120 volts draws 100/120 = 0.8 amps
    A 100 watt bulb at 12 volts draws 100/12 = 8.3 amps

    Or you could work out the watts like this, for example:

    0.8 amps from 120 volts draws 0.8 * 120 = 96 watts per hour
    8 amps from 12 volts draws 8 * 12 = around 96 watts per hour

    The battery is labeled 650 CCA, or cold cranking amps; I would assume that means the battery could take a load of 6 100-watt bulbs for a very short period of time.

    Now, if the battery is 650 CCA, that would mean 650 * 12 = 7800 watt hours, which they are made to take loads like that for small amounts of time? If 45 amp hours was right, then 45 * 12 = 540 watt hours, or 45 amp hours, same thing in my book.

    So the question is: what's the amp-hour rating of the battery? I read on the internet the other day that a basic car battery holds about 45 amp hours, and I also read somewhere else last night that a basic car battery holds around 60 amp hours, or 60 * 12 V = 720 watt hours, meaning it could produce 720 watts for one hour and no more. I think marine batteries start at 100 amp hours and go up from there?

    I need to get an ohm or amp meter and learn to measure the ohms or amps to see more about this. By the way, I like this thread; I'm learning lots about amps, watts, and volts.

    If my 650 CCA battery in my truck were 650 amp hours, that would mean it was 7800 watt hours. If the battery produced that over time, then it would run a 100 watt light bulb for 7800 watts / 100 watts = 78 hours. Or: a 100 watt bulb on 12 volts uses 100/12 = 8.3 amps, so
    650 amps / 8.3 amps = 78 hours, the same answer as the formula above, though I don't think 650 CCA is the number you go by? Funny how that works out.
     
    Last edited: May 6, 2007
  18. May 6, 2007 #17
    I have a 12 V battery on the bench and I hooked a light bulb up to it to test :P. Cold, I got 20 ohms from a 60 watt light bulb: 60/100 = 0.6, and 0.6 * 20 = 12 ohms. Back to Ohm's law: 12 volts / 12 ohms = 1 amp, and 1 amp * 12 volts = 12 watts (you're right about me getting the wrong number the first time; must have pressed the wrong button lol :). So your 650 amp-hour battery would run a 100 watt light bulb for 650 hours. CCA is different than amp-hours; see thread.

    Edit: let me add that because the resistance of a 100 watt household light bulb won't change, the battery won't be dead for 650 hours, and you won't get that 100 watts, only 12.
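    In symbols, scaling the measured cold resistance from the 60 W bulb to a 100 W bulb (cold resistance scales inversely with rated power, an approximation):

    [tex]R \approx 20\ \Omega \times \frac{60}{100} = 12\ \Omega, \qquad I = \frac{12\ \mathrm{V}}{12\ \Omega} = 1\ \mathrm{A}, \qquad P = 12\ \mathrm{W}, \qquad t = \frac{650\ \mathrm{A\,h}}{1\ \mathrm{A}} = 650\ \mathrm{hours}[/tex]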
     
    Last edited: May 6, 2007
  19. May 6, 2007 #18

    russ_watters


    Staff: Mentor

    MountainDew, resistance is a physical property of a resistor, so you need to use V = IR to find the new amperage, and then plug that new amperage into P = VI to find the power at the lower voltage. A 100 watt bulb is labeled as such because the manufacturer assumes you wouldn't try to run it at any voltage other than what it was designed for, not because it would dissipate 100 watts at any voltage.

    Though as said above, the resistance will be temperature dependent.

    Also, watts is already a rate; there is no such thing as "watts per hour". You may mean watt-hours, which is watts times hours (not divided by, which is what "per" means).
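    As a quick sketch of that procedure (Python; the 45 amp-hour capacity is an assumed figure, and the constant hot resistance is the idealization discussed above, so real numbers will differ):

    [code]
    # Sketch of the procedure above: constant-resistance idealization.
    # The 45 Ah capacity is an assumed figure, not a measured one.

    RATED_POWER_W = 100.0     # bulb rating at its design voltage
    DESIGN_VOLTAGE_V = 120.0  # voltage the bulb was designed for
    BATTERY_VOLTAGE_V = 12.0
    CAPACITY_AH = 45.0        # hypothetical amp-hour rating

    # Resistance implied by the rating: P = V^2/R  =>  R = V^2/P
    resistance_ohm = DESIGN_VOLTAGE_V ** 2 / RATED_POWER_W  # 144 ohms

    # Ohm's law at the battery voltage: I = V/R
    current_a = BATTERY_VOLTAGE_V / resistance_ohm          # ~0.083 A

    # Power actually dissipated at 12 V: P = V*I
    power_w = BATTERY_VOLTAGE_V * current_a                 # ~1 W

    # Idealized run time: capacity / current
    runtime_h = CAPACITY_AH / current_a                     # ~540 h

    print(f"R = {resistance_ohm:.0f} ohms, I = {current_a:.3f} A")
    print(f"P = {power_w:.1f} W, run time ~ {runtime_h:.0f} h")
    [/code]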
     
  20. May 6, 2007 #19
    russ_watters, sorry, I didn't realize that it was a 120 V bulb; I thought it was a 12 V bulb. Now, if it is a 120 volt bulb, things are a lot different. First you would use [tex]\frac{P_1}{P_2} = \frac{V_1^2}{V_2^2}[/tex] to get the new power drawn when connected to the 12 V source; in this case it would be about 1.2 watts. As you properly realized, this is not enough power to light the bulb, but it WILL draw some current anyway. So, using the relation V = IR, you get the current, which will be about 0.1 amps, and since the battery gives 45 amps for one hour, it can give 0.1 amps for 450 hours.
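    With the numbers substituted for the 120 W bulb:

    [tex]P_2 = 120\ \mathrm{W} \times \left(\frac{12\ \mathrm{V}}{120\ \mathrm{V}}\right)^2 = 1.2\ \mathrm{W}, \qquad I = \frac{1.2\ \mathrm{W}}{12\ \mathrm{V}} = 0.1\ \mathrm{A}, \qquad t = \frac{45\ \mathrm{A\,h}}{0.1\ \mathrm{A}} = 450\ \mathrm{hours}[/tex]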
     
  21. May 7, 2007 #20
    I have an idea that your car battery is capable of producing a heck of a lot more than 45 A. When you first turn the key, it is probably producing thousands of amps for less than a second. The first thing you have to do is create a magnetic field in the starter. To electrical engineers this is called inrush, not to be confused with starting current. It lasts for less than a second but can be extremely high. Once the magnetic field is created, the normal starting current is maintained. This could be in the area of 500 A or higher, but I am just guessing.
     
  22. May 7, 2007 #21
    It's not the burst of watts for a short time I care about; it's the watts over a LONG period of time that the battery is capable of producing that I am curious about. So I have another question on this, if you guys don't mind. A 100 watt bulb on 120 V AC uses 0.8 amps an hour, so would this mean if I hooked it to a car battery it would be 0.8 * 10, since 1 AC watt = 10 DC watts? Or would I pretend and do it this way:

    100 watts / 12 V AC = 8.3
    8.3 AC amps * 10 = 83 DC amps?

    I'm still confused a little. I'm not converting AC to DC or DC to AC, and for that I am confused about why DC amps differ from AC amps? THANKS
     
  23. May 7, 2007 #22
    Can one person clear this up, since everyone has different numbers? :rofl:
     
  24. May 7, 2007 #23
    lmao, this is hilarious. I do thank everyone for their help, and I do use this thread for reference. But dang, this is complicated.
     
  25. May 7, 2007 #24

    Averagesupernova

    Science Advisor
    Gold Member

    I'll try to clear this up once and for all. There is no such thing as amps per hour. Amperes is already a rate: it is a given number of electrons passing one point in one second. The watt rating on a light bulb is the watts it dissipates AT THE VOLTAGE IT WAS DESIGNED TO BE USED AT. That's it. End of story. There is no more.
    -
    A normal resistor has an ohm rating. This is basically the ratio of voltage across the resistor to current passing through it. Double the voltage and you'll double the current; halve the voltage and you will halve the current. An incandescent light bulb is a bit different, though. As the temperature of the filament increases, so does the resistance, so the current does not rise linearly with the voltage as in the case of a normal resistor. Where you ever got the notion that dividing the voltage by 10 will multiply the current by 10 in order to keep the wattage the same, I have no idea. It is totally incorrect.
    -
    Now for the amp-hours issue. First of all, find a bulb or load or whatever that is designed to be used at the battery voltage. Next, decide what type of battery is best suited for your load. In your case it is apparent that you will have the load on for more than a minute or two, so go with a battery that has sufficient amp-hours. One amp-hour is 1 amp drawn out of a battery for one hour, or 2 amps for half an hour, or half an amp for 2 hours. You get the picture. It's not a perfect graph, though; a battery that is rated at 1 amp-hour will not likely be able to supply 100 amps for 1/100th of an hour. This is where cold cranking amps comes in. You really don't need to worry about it, since your load will be on for longer periods of time.
    -
    Now, if this doesn't clear it up for you, don't get discouraged. We're here to help. But please don't think that because you googled and found a couple of formulas, you know how to implement them properly.
     
  26. May 7, 2007 #25

    russ_watters


    Staff: Mentor

    To amplify a little...
    Just for clarity, do you happen to know how big a factor that is? I.e., what the resistance is cold vs. hot?
    They were [incorrectly, obviously] assuming that a 100 W light bulb would dissipate 100 W regardless of the voltage across it. Obviously not how a resistor works...
    I'm not sure why it hasn't come up, but there is also the issue of DC vs. AC voltage for the light bulb. Regardless of all that, though, the simplest way to deal with this issue would be to use an inverter and power the light bulb the way it wants to be powered: with 120 V AC. Then you can simply ratio the voltages to find the amperage drawn from the battery (not including losses in the inverter, of course...). So a 100 W light bulb on an inverter attached to a 12 V car battery will draw 0.83 amps from the inverter, which will draw 8.3 amps from the battery.
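    In numbers, ignoring inverter losses:

    [tex]I_{battery} = I_{bulb} \times \frac{120\ \mathrm{V}}{12\ \mathrm{V}} = 0.83\ \mathrm{A} \times 10 = 8.3\ \mathrm{A}[/tex]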
     
    Last edited: May 7, 2007