1. The problem statement, all variables and given/known data

Prove that the total energy initially stored in the electric field inside the capacitor is equal to the total electrical energy eventually dissipated by the bulb. A 5 V DC power source charged the capacitor. The capacitor was then connected in series with a light bulb of resistance 20 ohms and an ideal ammeter. When connected, the ammeter read a current of 0.25 A (consistent with V = IR for 5 V across 20 ohms). Twenty seconds later, the ammeter read 0.05 A. The capacitance was calculated to be 1.25 F.

2. Relevant equations

V = IR
C = Q/V
P = IΔV = I²R
U = (1/2)CV²

3. The attempt at a solution

Using U = (1/2)CV² and plugging in the values above, the total energy initially stored in the electric field inside the capacitor came out to 15.6 J. The power dissipated as heat by the bulb came out to P = I²R = (0.25 A)²(20 ohms) = 1.25 W. Converting power into energy, I took 1.25 W × 25 s = 31.25 J; I checked the units and they came out right. The 25 s is RC = (20 ohms)(1.25 F), which I calculated to be the length of time for the capacitor to completely discharge and the light bulb to go out. So this is only an ideal case; I don't need any information on non-ideal cases, as that would just confuse me. I just need help checking whether my values came out right. The energy dissipated as heat by the bulb (31.25 J) came out to exactly twice the energy provided by the capacitor (15.6 J). I don't see how that is possible.
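As a quick numerical sanity check of the numbers above (this is only a sketch, and it assumes the ideal exponential RC discharge I(t) = I₀·e^(−t/RC), which is not stated in the problem itself), one can sum I²R over small time steps and compare the total heat with (1/2)CV²:

```python
import math

R = 20.0     # bulb resistance, ohms
C = 1.25     # capacitance from the problem, farads
V0 = 5.0     # initial capacitor voltage, volts
I0 = V0 / R  # initial current, 0.25 A

# Energy initially stored in the capacitor: U = (1/2) C V0^2
U_stored = 0.5 * C * V0**2

# Total heat in the bulb under the assumed exponential discharge
# I(t) = I0 * exp(-t / (R*C)): numerically integrate P = I^2 * R.
dt = 0.001
t = 0.0
U_heat = 0.0
while t < 20 * R * C:  # integrate well past many time constants
    I = I0 * math.exp(-t / (R * C))
    U_heat += I**2 * R * dt
    t += dt

print(U_stored)  # 15.625 J
print(U_heat)    # approaches 15.625 J, not 31.25 J
```

The integral converges to the stored energy rather than to 1.25 W × 25 s, because the current (and hence the power) is not constant over the discharge.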