Let's say that we have two capacitive sources of power:

1) Ten parallel 450 V, 1800 µF capacitors, each with 138.8 mΩ ESR
2) One 450 V, 18000 µF capacitor with 13.88 mΩ ESR

Both sources store 1,823 J of energy. The ten-capacitor source is advantageous for my purposes because the capacitors can be discharged at set intervals, creating a longer, lower-current pulse than the single large capacitor can. My main priority is overall system efficiency.

The disadvantage of lower-capacitance capacitors is that each one has a much higher equivalent series resistance (ESR). Since resistance in a circuit dissipates power, wasting energy, I would expect, with my mediocre electrical knowledge, that the source containing ten high-ESR capacitors would waste more energy than the other, leading to lower overall system efficiency.

However, my source of confusion is this: if all ten capacitors from the first source were placed in parallel and discharged simultaneously, the total resistance would be given by the equation below, which dictates the total resistance of resistors in parallel, in this case the ESR of the ten capacitors:

    1/R_total = 1/R_1 + 1/R_2 + ... + 1/R_10

By this equation, the total ESR of all ten capacitors connected in parallel would be 13.88 mΩ, the same ESR as the one large capacitor.

Is it true that if all ten capacitors from source one were discharged not simultaneously, but at very close intervals, the total energy lost to ohmic losses would be the same as in source two, the one large capacitor?

As always, thanks in advance for your help -- it's much appreciated.
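For reference, the numbers above can be checked with a short sketch. This uses only the values stated in the question (E = ½CV² for stored energy, R/n for n equal resistances in parallel); the variable names are mine:

```python
# Sanity-check of the figures quoted in the question.
V = 450.0             # volts
C_small = 1800e-6     # farads, each of the ten capacitors
C_big = 18000e-6      # farads, the single large capacitor
ESR_small = 138.8e-3  # ohms, ESR of each small capacitor
n = 10

# Stored energy: E = 1/2 * C * V^2
E_bank = n * 0.5 * C_small * V**2   # ten small caps combined
E_big = 0.5 * C_big * V**2          # one large cap

# ESR of n equal resistances in parallel: R / n
ESR_parallel = ESR_small / n

print(f"Energy, ten-cap bank: {E_bank:.1f} J")           # 1822.5 J
print(f"Energy, large cap:    {E_big:.1f} J")            # 1822.5 J
print(f"Parallel ESR:         {ESR_parallel*1e3:.2f} mOhm")  # 13.88 mOhm
```

Both sources come out to about 1,823 J, and the paralleled bank's ESR does equal the large capacitor's 13.88 mΩ, which is exactly what prompts the question.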