What prevents abrupt voltage change in a capacitor?

  1. Let's say we have a simple circuit consisting of a power supply and a resistor, and currently the input voltage is 0V. We now apply a voltage of 5V to the circuit (like a step increase - instantaneously). The voltage across the resistor changes instantaneously to 5V.

    If a capacitor is introduced into this circuit, it will gradually charge until the voltage across it is also 5V, and the current in the circuit will become zero.

    My question:
    What is now preventing us from suddenly changing the voltage from 5V to let's say 10V (again like a step increase - instantaneously)? We could do it before the capacitor was introduced, but why not now?

    The answer I have thus far always gotten is that for that to happen, the current flowing in the circuit must be infinite, and since that cannot happen, the voltage cannot be changed instantaneously.

    The problem is that we ARE changing the voltage of the power source instantaneously, just like before the capacitor was introduced. The introduction of the capacitor has not somehow taken away our ability to change the voltage of the power source, has it???

    If we continue according to my understanding (i.e. that we can change the voltage instantaneously), that will mean that the current in the circuit will shoot up as if going to infinity, but that clearly cannot ever happen in ANY circuit, so doesn't that violate i=C*(dv/dt)?

    Don't we change the voltage first, and then whatever happens, happens? How can anything restrict an 'external variable'?
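
    To put the same puzzle in symbols (this is just the ideal model restated, nothing new): if the source voltage really steps, v(t) = \Delta V \, u(t), then

        i(t) = C \frac{dv}{dt} = C \, \Delta V \, \delta(t)

    a Dirac impulse of current --- exactly the 'infinite current' in question --- which nevertheless delivers only the finite charge Q = C \, \Delta V.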
     
  2. jcsd
  3. I had already read that whole thread before posting here. It doesn't answer my question.
     
  4. NascentOxygen

    Staff: Mentor

    Nothing about the voltage source has changed. We can set the voltage source to give a step change in its output for reasonable levels of load current.

    The introduction of the capacitor has changed nothing. My statement above remains as true as it was before.
     
  5. You are mixing up real world and idealized world ideas.

    In an idealized world, the ideal power supply could provide infinite current to make the capacitor voltage step.

    In the real world, power supply current cannot be infinite, therefore the capacitor voltage and the power supply voltage cannot jump instantaneously. In the real world there are also stray resistances, capacitances and inductances in the wiring and power supply to make things messier.
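
    In symbols (a standard textbook result; here R stands for whatever total series resistance the real circuit has, and \Delta V for the size of the step):

        i(t) = \frac{\Delta V}{R} \, e^{-t/RC}, \qquad v_C(t) = \Delta V \left( 1 - e^{-t/RC} \right)

    The peak current is the finite value \Delta V / R, and v_C cannot jump. Only in the ideal limit R -> 0 does the current become the impulse C \, \Delta V \, \delta(t).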
     
  6. Then if we can still change the output voltage abruptly, what will happen to the current? Won't it try to go to infinity?

    But what's stopping us from just inputting a step change to the power supply voltage? Whatever happens to the current and the capacitor voltage is just an outcome, right? How can you say "therefore...power supply voltage can not jump instantaneously" - when we can still input a step change to the power supply voltage?
     
    Last edited: Aug 31, 2014
  7. The answers have been provided to you very clearly multiple times. It is time for you to try to understand what you have been told rather than posting more questions.
     
  8. NascentOxygen

    Staff: Mentor

    Infinite current? Capacitor current has to come from the voltage source---it's a series circuit. What I typed in bold... did it not show up on your screen?
     
  9. I am not posting more questions. I am asking for clarifications in the answers. What's wrong with that?

    I did not ask about stepping up the capacitor voltage. I was clearly referring to changing the input voltage instantaneously.

    Again, I was referring to the input voltage ONLY. What's stopping us from stepping up the input voltage instantaneously? (I know it can't be done in the real world. I'm talking about it being theoretically instantaneous)
     
  10. How is that possible? An instantaneous step input change in voltage CANNOT produce 'reasonable levels of load current'. According to i=C*(dv/dt), it will produce infinite current!

    So since you said 'We can set the voltage source to give a step change in its output' what will happen to the current? By definition, it must go to infinity.
     
  11. NascentOxygen

    Staff: Mentor

    Consequently, a voltage source is not suitable for directly driving a pure capacitance.
     
  12. Then what is?
     
  13. NascentOxygen

    Staff: Mentor

    A current source is good. Or use a voltage source but with some resistance.

    Neither of these can cause a step change of voltage across an ideal capacitance.
     
  14. Please try to ANSWER my original question. And by answer I mean in a comprehensive way so that I can understand. If you need clarification in the question, ask, instead of making useless assumptions about it.

    My question was simple enough:
    "What is now preventing us from suddenly changing the voltage from 5V to let's say 10V (again like a step increase - instantaneously)? We could do it before the capacitor was introduced, but why not now?"
    "If we continue according to my understanding (i.e. that we can change the voltage instantaneously), that will mean that the current in the circuit will shoot up as if going to infinity, but that clearly cannot ever happen in ANY circuit, so doesn't that violate i=C*(dv/dt)?"

    Please don't answer if you don't want to answer completely.
     
  15. jim hardy

    Science Advisor
    Gold Member
    2014 Award

    You already know the answer.

    okay, so you understand the capacitor equation, i = C*(dv/dt).

    As others have pointed out,

    current flows in proportion to dv/dt.
    Were your power source capable of delivering the requisite current, it could cause an instantaneous change of voltage across the capacitor.
    So, calculate how much current that takes.

    ...... done yet ?

    That's a lot of current, eh?
    That's why you get a spark when you connect a discharged capacitor to a power supply.

    So get real and assume your voltage source is capable of only finite amps. Then calculate how much dv/dt it could impose on a capacitor by supplying that current.

    It's that simple.

    You already knew it, just doubted yourself.
    Much of learning is finding out what we already know.
     
    Last edited: Aug 31, 2014
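
    Running jim hardy's suggested calculation as a quick Python sketch (the step size, step time, capacitance and current limit are all assumed example values, not figures from the thread):

        # How much current does a 5 V step across a 1 uF capacitor demand?
        C = 1e-6     # assumed capacitance, farads (1 uF)
        dv = 5.0     # assumed size of the voltage step, volts
        dt = 1e-6    # assumed time allowed for the step, seconds (1 us)

        i_required = C * dv / dt                  # i = C*(dv/dt)
        print(f"Current for a 5 V step in 1 us: {i_required:.1f} A")    # 5.0 A

        # Turned around: a supply limited to an assumed 1 A can only impose
        i_max = 1.0
        dvdt = i_max / C                          # dv/dt = i/C
        print(f"Max dv/dt with 1 A available: {dvdt:.0f} V/s")          # 1000000 V/s

    Shrink dt toward zero and the required current grows without bound --- which is the whole point: a truly instantaneous step would demand infinite current.
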
  16. sophiecentaur

    Science Advisor
    Gold Member

    The series resistance of the (real) source will limit the rate at which the voltage can change. It imposes a time constant RC on the way the capacitor volts follow the source EMF.
    The circuit you should be considering is an ideal voltage source in series with a resistance and a capacitor. The voltage drop across the series resistor is proportional to the current through it, so the capacitor volts rise exponentially as the resistor volts fall. Google 'RC time constant' - there is plenty out there, so I do not need to draw a picture.
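
    A minimal numerical sketch of that series R-C step response (R = 1 kOhm, C = 1 uF and a 5 V step are assumed example values, giving a time constant of 1 ms):

        import math

        V = 5.0        # assumed source step, volts
        R = 1000.0     # assumed series resistance, ohms
        C = 1e-6       # assumed capacitance, farads
        tau = R * C    # time constant, 1 ms here

        for t in (0.0, tau, 2 * tau, 4 * tau):
            v_cap = V * (1 - math.exp(-t / tau))   # capacitor volts rise exponentially
            i = (V - v_cap) / R                    # current is the resistor drop over R
            print(f"t = {t * 1e3:4.1f} ms: v_cap = {v_cap:5.3f} V, i = {i * 1e3:5.3f} mA")

    At t = 0 the current is V/R = 5 mA --- finite, not infinite --- and by t = 4RC it has decayed to about 1.8% of that (e^-4 is roughly 0.018), matching the 'negligible after a long time' remark in the next post.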
     
    It is only a theoretical problem since, practically, there is no circuit without any resistance - source resistance, cable, contacts and so on. Indeed, if i = Imax*exp(-t/(RC)), then for the first very short time the current will be Imax = ΔV/R, which may be close to the source short-circuit current if R is negligible. After a very long time (t > 4RC) the current will be negligible.
     