#1 cosmogrl
In lab, we built a circuit with a 4.7 µF capacitor hooked up to a DC source. In parallel with the capacitor was a voltmeter with a 10 MΩ input resistance. We used the voltmeter to measure the voltage drop across that resistance and find the RC time constant (treating C as unknown). We would turn on the battery, charge up the capacitor, then turn off the battery and watch and record the readings on the voltmeter.
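For reference, with the battery disconnected the capacitor should ideally discharge only through the voltmeter's input resistance, following V(t) = V0·e^(−t/RC). Here is a minimal sketch (assuming those ideal conditions and the component values above) of the smooth decay I expected to see:

```python
import numpy as np

# Ideal discharge curve, assuming the only discharge path is the
# voltmeter's 10 Mohm input resistance (values from the lab setup).
C = 4.7e-6        # capacitance in farads
R = 10e6          # voltmeter input resistance in ohms
tau = R * C       # RC time constant = 47 seconds
V0 = 1.0          # starting voltage in volts

# V(t) = V0 * exp(-t / RC): sample the curve out to 3 time constants
for t in np.linspace(0, 3 * tau, 7):
    print(f"t = {t:6.1f} s   V = {V0 * np.exp(-t / tau):.3f} V")
```

With these values the voltage should take roughly 47 s to fall to about 37% of its starting value, with no bump back upward anywhere along the way.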
My question has to do with the following: say we started at 1 V and read measurements of 0.9, 0.8, 0.7 V, etc., as the voltage decreased. Well, when the capacitor was charged to 1 V and we turned the battery off, the voltage would decrease, then climb back up toward 1 V, and only then begin to decrease steadily. I had not expected that increase in the voltage; I thought it would decay smoothly from 1 to 0.9 and so on. Nobody could explain the temporary increase to me. Can somebody help me understand why the voltage increased and then decreased? Thanks.