Suppose you have a 9 V battery that you connect to a load with a very low resistance (e.g. 0.1 ohm). From Ohm's law, the current would be I = V/R = 90 amps, which seems impossible to obtain from such a battery. If we suppose that the load will not burn out, which of these options is the correct one?

1) The battery has a maximum power it can provide. For example, if this power is P = 100 W, then since P = RI^2 the current will be I = (P/R)^0.5 = 31.6 amps and the voltage V = RI = 3.16 V.

2) The battery has a maximum current it can provide. For example, if this current is I = 5 A, then V = RI = 0.5 V.

I am aware the values given for P and I might not be realistic; I am just interested in the general behaviour of a battery (is current, power, or voltage the fixed value?). Thanks for your help.

Bakshi
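The arithmetic behind the two hypotheses can be checked in a few lines. This is just a sketch using the question's own example numbers (P = 100 W and I_max = 5 A are the illustrative values from the post, not real battery data):

```python
# Quick check of the two hypotheses in the question, using the
# question's own illustrative numbers (not real battery specs).
import math

R = 0.1  # load resistance in ohms

# Hypothesis 1: power-limited battery.  P = R * I^2  =>  I = sqrt(P / R)
P = 100.0
i1 = math.sqrt(P / R)
v1 = R * i1
print(f"power-limited:   I = {i1:.1f} A, V = {v1:.2f} V")

# Hypothesis 2: current-limited battery.  V = R * I
i2 = 5.0
v2 = R * i2
print(f"current-limited: I = {i2:.1f} A, V = {v2:.2f} V")
```

Running it reproduces the numbers in the question: 31.6 A / 3.16 V for the power-limited case and 5 A / 0.5 V for the current-limited case.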
Just to be clear about this: as the output current of the battery increases, the voltage at the terminals drops. Although the curve of terminal voltage vs. output current for a particular physical battery would not be perfectly linear (which it would be if the battery were simply an ideal source with internal resistance), for small enough output currents the curve does look like a straight line.

If, then, you model the battery as an ideal voltage source in series with a simple internal resistance, it is pretty easy to show that the load that extracts the most power from the battery is one whose resistance equals the internal resistance. In that case, the terminal voltage is half of its maximum (open-circuit) voltage and the output current is half of its maximum (short-circuit) current.
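The model described above (ideal EMF in series with an internal resistance) can be sketched in a few lines. The 9 V EMF and r = 1.5 ohm below are assumed illustrative values for a small 9 V battery, not measured data:

```python
# Sketch of the simple battery model from the post: an ideal EMF in
# series with an internal resistance r.  Values are illustrative.

def battery(emf, r_internal, r_load):
    """Return (current, terminal voltage, load power) for the model."""
    i = emf / (r_internal + r_load)   # Ohm's law around the whole loop
    v_term = i * r_load               # terminal voltage = voltage across load
    p_load = i * i * r_load           # power delivered to the load
    return i, v_term, p_load

EMF, R_INT = 9.0, 1.5  # assumed values for a small 9 V battery

# Sweep the load resistance and find where delivered power peaks.
loads = [0.1 * k for k in range(1, 101)]  # 0.1 .. 10.0 ohm
best = max(loads, key=lambda rl: battery(EMF, R_INT, rl)[2])
print(f"power peaks near R_load = {best:.1f} ohm (internal r = {R_INT} ohm)")

# At that peak, terminal voltage is half the open-circuit voltage and
# current is half the short-circuit current, as stated above.
i, v, p = battery(EMF, R_INT, R_INT)
print(f"V_term = {v:.2f} V (half of {EMF} V), I = {i:.2f} A")
```

The sweep confirms the maximum-power-transfer claim: delivered power peaks when the load resistance matches the internal resistance, at which point the terminal voltage is 4.5 V, half the 9 V open-circuit value.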
Batteries have not only a voltage rating, but also a current ("amp") capacity. Dry cells, such as common flashlight batteries (or the standard 9-volt battery you are referring to), can supply only small currents. Wet cells, such as a 12-volt lead-acid car battery, can supply very large currents.