What prevents abrupt voltage change in a capacitor? 
#1
Aug 30, 2014, 04:10 PM

P: 20

Let's say we have a simple circuit consisting of a power supply and a resistor, and currently the input voltage is 0 V. We now apply a voltage of 5 V to the circuit (a step increase, i.e. instantaneously). The voltage across the resistor changes instantaneously to 5 V.
If a capacitor is introduced into this circuit, it will gradually charge until the voltage across it is also 5 V, and the current in the circuit will become zero.

My question: what is now preventing us from suddenly changing the voltage from 5 V to, let's say, 10 V (again as an instantaneous step)? We could do it before the capacitor was introduced, but why not now? The answer I have always gotten so far is that for that to happen, the current flowing in the circuit would have to be infinite, and since that cannot happen, the voltage cannot be changed instantaneously.

The problem is that we ARE changing the voltage of the power source instantaneously, just like before the capacitor was introduced. The introduction of the capacitor has not somehow taken away our ability to change the voltage of the power source, has it? If we continue according to my understanding (i.e. that we can change the voltage instantaneously), the current in the circuit will shoot up as if going to infinity, but that clearly can never happen in ANY circuit, so doesn't that violate i = C*(dv/dt)? Don't we change the voltage first, and then whatever happens, happens? How can anything restrict an 'external variable'?


#2
Aug 30, 2014, 04:26 PM

P: 86

This question has been asked previously and some good answers were provided. Feel free to check it out :)
http://forum.allaboutcircuits.com/th...voltage.27121/ NOTE: I'm getting my forums mixed up, but regardless, there is good information to be found in the thread I linked!


#3
Aug 31, 2014, 04:42 AM

P: 20

I had already read that whole thread before posting here. It doesn't answer my question.



#4
Aug 31, 2014, 06:03 AM

HW Helper
Thanks
P: 5,496

What prevents abrupt voltage change in a capacitor?
The introduction of the capacitor has changed nothing. The quoted statement remains as true as it was before.


#5
Aug 31, 2014, 06:36 AM

P: 217

You are mixing up real world and idealized world ideas.
In an idealized world, the ideal power supply could provide infinite current to make the capacitor voltage step. In the real world, power supply current cannot be infinite, therefore the capacitor voltage and the power supply voltage cannot jump instantaneously. In the real world there are also stray resistances, capacitances and inductances in the wiring and power supply to make things messier.
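To make the finite-current point concrete, here is a quick back-of-the-envelope sketch in Python. The numbers (a 1 A supply limit and a 100 µF capacitor) are assumed for illustration only, not taken from the thread:

```python
# Assumed example values (not from the thread):
I_MAX = 1.0   # amps: the supply's finite current limit
C = 100e-6    # farads

# From i = C * dv/dt, the fastest the capacitor voltage can change is:
dv_dt_max = I_MAX / C           # volts per second
t_min = 5.0 / dv_dt_max         # time needed for a 5 V change

print(f"max slew rate: {dv_dt_max:.0f} V/s")              # 10000 V/s
print(f"5 V change takes at least {t_min * 1e6:.0f} us")  # 500 us
```

So nothing forbids you from commanding the step; with bounded current the capacitor voltage simply takes a nonzero time to follow.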


#7
Aug 31, 2014, 07:02 AM

P: 217

The answers have been provided to you very clearly multiple times. It is time for you to try to understand what you have been told rather than posting more questions.



#10
Aug 31, 2014, 07:45 AM

P: 20

So since you said 'We can set the voltage source to give a step change in its output' what will happen to the current? By definition, it must go to infinity. 


#12
Aug 31, 2014, 08:25 AM

P: 20

Then what is?



#13
Aug 31, 2014, 08:51 AM

HW Helper
Thanks
P: 5,496

Neither of these can cause a step change of voltage across an ideal capacitance. 


#14
Aug 31, 2014, 09:07 AM

P: 20

My question was simple enough: "What is now preventing us from suddenly changing the voltage from 5 V to, let's say, 10 V (again as an instantaneous step)? We could do it before the capacitor was introduced, but why not now?" "If we continue according to my understanding (i.e. that we can change the voltage instantaneously), the current in the circuit will shoot up as if going to infinity, but that clearly can never happen in ANY circuit, so doesn't that violate i = C*(dv/dt)?" Please don't answer if you don't want to answer completely.


#15
Aug 31, 2014, 09:26 AM

Sci Advisor
PF Gold
P: 3,759

You already know the answer.
As others have pointed out, current flows in proportion to dv/dt. Were So , Calculate how much current that takes. ...... done yet ? That's a lot of current, eh? That's why you get a spark when you connect a discharged capacitor to a power supply. So get real and assume your voltage source is capable of only finite amps. Then calculate how much dv/dt it could impose on a capacitor by supplying that current . It's that simple. Look at this: Much of learning is finding out what we already know. 


#16
Aug 31, 2014, 04:10 PM

Sci Advisor
Thanks
PF Gold
P: 12,269

The circuit you should be considering is an ideal voltage source in series with a resistance and a capacitor. The voltage drop across the series resistor is proportional to the current through it, so the capacitor voltage rises exponentially while the resistor voltage falls. Google 'RC time constant'; there is plenty out there, so I do not need to draw a picture.
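For reference, the standard RC charging law can be tabulated in a few lines of Python; the component values here (5 V step, 1 kΩ, 1 µF) are assumed for illustration:

```python
import math

V_step = 5.0   # volts (assumed)
R = 1e3        # ohms (assumed)
C = 1e-6       # farads (assumed)
tau = R * C    # time constant: 1 ms

def v_cap(t):
    """Capacitor voltage t seconds after the step (standard RC charging law)."""
    return V_step * (1.0 - math.exp(-t / tau))

for n in range(6):
    print(f"t = {n}*tau: v = {v_cap(n * tau):.3f} V")
```

After about five time constants the capacitor voltage is within 1% of the source voltage.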


#17
Sep 3, 2014, 07:12 AM

P: 60

It is only a theoretical problem since, practically, there is no circuit without any resistance: source resistance, cable, contacts and the like. Indeed, if i = Imax*exp(-t/(R*C)), then for the first very short time the current will be Imax = ΔV/R, which may be close to the source short-circuit current if R is negligible. After a long time (t > 4*R*C) the current will be negligible.
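The decaying inrush current can be sketched the same way; the stray resistance and the other component values below are assumed for illustration:

```python
import math

dV = 5.0     # volts: the attempted step (assumed)
R = 0.1      # ohms: small stray/source resistance (assumed)
C = 100e-6   # farads (assumed)

I_max = dV / R    # initial inrush current
tau = R * C       # time constant

def i(t):
    """Inrush current t seconds after the step: i = I_max * exp(-t/(R*C))."""
    return I_max * math.exp(-t / tau)

print(f"I_max = {I_max:.0f} A")
print(f"after 4*tau the current is {100 * i(4 * tau) / I_max:.1f}% of I_max")
```

Even a tenth of an ohm caps the inrush at a large but finite value, and the current dies away within a few time constants.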



#18
Sep 4, 2014, 08:26 AM

P: 1,043

http://www.physicsforums.com/showthread.php?t=254776
I posted last in the above closed thread. I will elaborate if needed. Claude 

