How long can a 35 mAh battery charge a device that requires a 70 mA current?
30 minutes, right?
Yes. I know there's a "but" lurking somewhere in the question. I'm bored, so I'll bite.
The voltage has to be considered as well.
Many batteries shouldn't be fully discharged. Lead acids are an example.
Batteries typically provide nominal voltage until they die. (It does drop slightly -- usually.)
Each battery type has its quirks.
It varies from technology to technology and vendor to vendor, and even among batteries from a single technology and vendor.
Generally batteries are rated at a discharge rate of 0.1C, where C is the Ah rating. For a 35 mAh battery that would be 3.5 mA. So your battery may deliver 35 mAh if discharged at 3.5 mA, but its capacity may be very different at higher currents. At 70 mA it could last only a few minutes, even. It depends on the battery (you gave us no information).
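The arithmetic above can be sketched as follows. This is an idealized estimate only, assuming the full rated capacity were available at any current, which (as noted) it is not at high discharge rates:

```python
# Idealized battery runtime arithmetic. Ignores the capacity derating
# that occurs at high discharge currents, discussed above.

def rated_discharge_ma(capacity_mah: float, c_rate: float = 0.1) -> float:
    """Current at which the capacity rating typically applies (0.1C)."""
    return capacity_mah * c_rate

def ideal_runtime_hours(capacity_mah: float, load_ma: float) -> float:
    """Naive runtime assuming the full rated capacity is delivered."""
    return capacity_mah / load_ma

print(rated_discharge_ma(35))       # 3.5 mA: the current the 35 mAh rating likely assumes
print(ideal_runtime_hours(35, 70))  # 0.5 h: the naive answer, optimistic at 70 mA
```

In practice the real answer at 70 mA must be read off the battery's discharge curves, not computed this way.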
But, that said, it will vary from battery to battery and technology to technology.
Here is an example taken from a CR2032 coin cell. Note that, assuming 2.0 V is the cutoff, you effectively get only about 75 mAh at 80 mA, but nearly 200 mAh at 10 mA.
(Note that the bottom axis is mAh supplied at the different currents, not time to discharge.)
The issue that causes this is chemical recombination. Essentially, while draining charge slowly, the battery chemistry sort of replenishes itself. If you drain rapidly then there is no time for the replenishment to occur.
If you just search for "battery discharge curves" for whatever chemistry you care about, you will see many examples.
Voltage is what makes the charge flow. You couldn't directly charge a battery of higher voltage with one of lower voltage, for example.
More to the point, it is the power (or energy) that needs to be compared, not the amp-hours, because if you step the voltage up or down, you change the current (and therefore the amp-hours).
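To illustrate why energy, not charge, is the right comparison: converting mAh to watt-hours requires the voltage. The cell voltages below are illustrative nominal values I've chosen, not figures from this thread:

```python
# Comparing batteries by energy (Wh) rather than charge (mAh).
# Voltages are typical nominal values, used here only for illustration.

def energy_wh(capacity_mah: float, nominal_v: float) -> float:
    """Energy in watt-hours: (mAh / 1000) * V."""
    return capacity_mah / 1000 * nominal_v

li_ion = energy_wh(1000, 3.7)  # 1000 mAh Li-ion cell at 3.7 V -> 3.7 Wh
nimh = energy_wh(2000, 1.2)    # 2000 mAh NiMH cell at 1.2 V  -> 2.4 Wh
print(li_ion, nimh)
```

The smaller-mAh cell stores more energy here, which is exactly why amp-hours alone can't be compared across different voltages.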
Thanks. Follow-up question: all other factors aside, a 1000 mAh battery would last half as long as a 2000 mAh battery, right? The device would still act the same way, not like voltage where it requires a certain amount, correct?
Forgetting that the batteries will behave differently, and forgetting that the device may not act the same way with different discharge curves: in other words, very roughly speaking, you are correct. But it pains me to say that, because batteries and devices don't really act that way. The device would act approximately the same way; the batteries would behave differently. And you may or may not actually get 1000 or 2000 mAh, depending on the currents at which the batteries were rated.
Hi, another simple battery question. Say you have something that requires 10 watt-hours to charge, and you have a source providing a constant 50 watts. It would take 0.2 hours to charge, right? I feel like I'm missing a quantity here.
I have no idea what you are asking. A source can't provide a constant power; there is no such thing as a "constant 50 watt source" in the sense you mean (that's not to say such a thing can't be built). The load is what draws a constant power. And why watt-hours (energy)?
How long does it take a 50 watt load to consume 10 watt-hours of energy? 0.2 hours. If you want to twist that around as an answer to your question, then that's your choice.
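The arithmetic in that answer is just time = energy / power, under the idealized assumption that the load really does draw a constant 50 W:

```python
# time = energy / power, assuming a constant-power load (an idealization;
# as noted above, real sources and loads don't behave this simply).

def hours_to_consume(energy_wh: float, power_w: float) -> float:
    """Hours for a constant load of power_w watts to use energy_wh watt-hours."""
    return energy_wh / power_w

print(hours_to_consume(10, 50))  # 0.2
```

Real charging is messier: charge current (and therefore power) usually tapers as the battery fills, so 0.2 hours is a lower bound at best.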