Capacitor behavior over frequency

AI Thread Summary
Capacitors behave as shorts at high frequencies and as open circuits at DC because of the time it takes to move charge onto and off their plates. At low frequencies, the charge can fully transfer before the voltage polarity switches, allowing the capacitor to reach a voltage close to the input. Conversely, at high frequencies, the rapid switching prevents sufficient charge movement, resulting in a lower voltage across the capacitor and more voltage across the series resistor. The relationship between resistance, capacitance, and frequency also affects this behavior: increasing resistance or capacitance lowers the frequency at which noticeable attenuation occurs. The phase of an ideal capacitor is fixed at -90 degrees (its voltage lags its current by 90 degrees), reflecting the inherent time-dependent relationship between current and voltage in AC circuits.
PMASwork
Howdy-

Consider a low pass filter subjected to an AC source (i.e., the "output" is the capacitor voltage).

I understand mathematically how to assess the frequency response of such a circuit.

What I am after is a conceptual description of why. Why do capacitors act as shorts to high frequencies and open circuits at DC? I know that the Xc = 1/(jwC) representation shows this mathematically, but I am wondering what is really physically happening behind the equation.
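
For reference, here is what the magnitude works out to at a few frequencies if I plug an arbitrary C = 1 uF into |Xc| = 1/(2*pi*f*C); the numbers go from "nearly open" to "nearly short", but they don't tell me why:

```python
import math

# For reference: magnitude |Xc| = 1/(2*pi*f*C) at a few frequencies,
# with an arbitrary example value C = 1 uF.
C = 1e-6  # farads

for f in (1, 100, 10_000, 1_000_000):  # hertz
    Xc = 1.0 / (2 * math.pi * f * C)
    print(f"f = {f:>9} Hz  ->  |Xc| = {Xc:,.2f} ohms")
```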

Applying an AC voltage to a capacitor moves charges from one "plate" of the capacitor to the other, then back. Somehow, if I increase the rate at which I am doing this, the peak-to-peak voltage across the capacitor begins to diminish (and the balance is applied across the series resistor). Why? :confused:

Thanks! :biggrin:
 
Well, a capacitor works because the electrons in the positive plate are drawn toward the positive terminal of the power supply, and the negative terminal of the power supply gives up some of its electrons to the negative plate. This happens because the power supply tries to equalize its charge.

Now, when those electrons arrive at the negative plate, the plate becomes negative. Like charges repel, so some of the electrons in the positive plate are pushed away from that plate: they run down the wire, through any resistor, and away from the negative charge. That is a current. It's like holding a charged rod near a metal rod: the negative charge of the rod repels the electrons to the far side of the metal rod, and they stay there until the charge is removed.

In AC, the sine wave goes back to zero volts. That is like taking the charged rod away: there is no more negative charge repelling the electrons in the positive plate, so the electrons move back to their atoms. Then the current reverses and the same thing happens to the other plate. The current doesn't actually travel through the capacitor.

DC doesn't fluctuate, so the charge never dissipates, and the electrons in the positive plate stay repelled by the negative plate's charge.
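
To put rough numbers on that picture, here is a small sketch (the C = 1 uF and the drive values are arbitrary assumptions) using the defining relation i = C*dv/dt: a DC voltage that is simply held drives no sustained current, while an AC voltage keeps charge shuttling back and forth through the external circuit.

```python
import numpy as np

# Rough sketch of the idea above: the current into a capacitor is i = C * dv/dt,
# so a steady (DC) voltage that is simply held drives no sustained current,
# while an AC voltage keeps shuttling charge back and forth.
# C and the drive values below are arbitrary example numbers.
C = 1e-6                                  # farads
t = np.linspace(0, 0.1, 10_001)           # 100 ms of time, in seconds

v_dc = np.full_like(t, 5.0)               # 5 V applied and held (after the initial charge-up)
v_ac = 5.0 * np.sin(2 * np.pi * 50 * t)   # 5 V amplitude, 50 Hz sine

i_dc = C * np.gradient(v_dc, t)           # ~0 A everywhere: no sustained DC current
i_ac = C * np.gradient(v_ac, t)           # keeps oscillating: charge moves back and forth

print("peak |i| with DC held  :", np.max(np.abs(i_dc)), "A")
print("peak |i| with 50 Hz AC :", np.max(np.abs(i_ac)), "A")
```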
 
PMASwork said:
Howdy-

Consider a low pass filter subjected to an AC source (i.e., the "output" is the capacitor voltage).

I understand mathematically how to assess the frequency response of such a circuit.

What I am after is a conceptual description of why. Why do capacitors act as shorts to high frequencies and open circuits at DC? I know that the Xc = 1/(jwC) representation shows this mathematically, but I am wondering what is really physically happening behind the equation.

Applying an AC voltage to a capacitor moves charges from one "plate" of the capacitor to the other, then back. Somehow, if I increase the rate at which I am doing this, the peak-to-peak voltage across the capacitor begins to diminish (and the balance is applied across the series resistor). Why? :confused:

Thanks! :biggrin:

I hate "hand waving" answers but here I go:

In a capacitor you have a voltage because there are minus charges on one plate and plus charges on the other. With AC, you keep switching the polarity, so the minus charges pile up on one plate, then on the other, over and over again (moving through the external circuit).

If you make the switching very fast, then the minus charges can't "settle down" on either plate, so they can't create a significant voltage drop across the capacitor.

No voltage drop across the capacitor means all the voltage appears across the resistor.
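
As a rough numerical sketch of that picture (component values assumed: R = 1 kohm, C = 1 uF, so fc is about 159 Hz), you can step the series-RC equation dVc/dt = (Vin - Vc)/(R*C) for a slow and a fast sine:

```python
import math

# Minimal sketch of the point above: drive the same series RC low-pass with a
# slow and a fast sine and compare how much voltage ends up across the capacitor.
# R, C and the two frequencies are assumed example values (fc = 1/(2*pi*R*C) ~ 159 Hz).
R, C = 1e3, 1e-6   # 1 kohm, 1 uF

def peak_cap_voltage(f_hz, cycles=50, steps_per_cycle=2000):
    """Step dVc/dt = (Vin - Vc)/(R*C) with forward Euler; return the late-time peak of Vc."""
    dt = 1.0 / (f_hz * steps_per_cycle)
    vc, peak = 0.0, 0.0
    for n in range(cycles * steps_per_cycle):
        vin = math.sin(2 * math.pi * f_hz * n * dt)    # 1 V amplitude source
        vc += dt * (vin - vc) / (R * C)                # forward-Euler integration step
        if n > (cycles - 2) * steps_per_cycle:         # only look at the last two cycles
            peak = max(peak, abs(vc))
    return peak

for f in (10, 10_000):   # well below and well above fc
    print(f"{f:>6} Hz drive: peak Vc ~ {peak_cap_voltage(f):.3f} V out of a 1 V input")
```

Well below fc the capacitor has time to follow the source almost completely; well above fc only a small fraction of the input ever appears across it, and the rest sits across the resistor.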
 
Think of the duration of the waveform of the AC signal. Let's use a simple sine wave, one at 10 Hz, so the period of the waveform is 100 ms. If we start at t=0, then at t=25 ms we have our peak voltage.

Change to 1000 Hz and now the period is 1 ms. If we start at t=0, then at t=250 us we have our peak voltage.

Just examining that first part of the waveform against the charging curve of the capacitor with respect to time, does that make the difference obvious? The short duration prevents the capacitor from charging up much at all, whereas the long duration (low frequency) gives the capacitor time to charge up to match the input voltage.
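
Putting rough numbers on that comparison (a sketch that treats each quarter-cycle as a simple charging window against an assumed time constant R*C = 5 ms, even though the real drive is a sine rather than a held step):

```python
import math

# The comparison above in numbers: how far an RC circuit can charge toward a held
# input during one quarter-period (the time from zero up to the sine's peak).
# The time constant R*C = 5 ms is an assumed example value.
tau = 5e-3   # seconds

for f in (10, 1000):   # hertz
    quarter_period = 1.0 / (4 * f)
    fraction = 1 - math.exp(-quarter_period / tau)   # step-response charging fraction
    print(f"{f:>5} Hz: quarter period = {quarter_period * 1e3:.3g} ms, "
          f"charges to ~{fraction * 100:.1f}% of the input")
```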

Cliff
 
Thanks for the inputs!

So, my guesses based on what you guys have posted plus some imagination:

The charges move from one plate to another, and this takes a nonzero time t1. For low frequencies, t1 is much smaller than the switching period, so virtually all the charge that can be displaced for a given Vin and C has already moved before Vin switches polarity. Vc = q/C; enough charge has moved in this case that Vc = Vin_peak.

However, if the switching frequency increases enough, a significant portion of the maximum possible charges to be displaced may not have made it to the other plate before Vin switches polarity. Vc=q/C, but not all the charge has yet moved to the plate, so Vc < Vin_peak. Thus, gain decreases with frequency.

If I increase the resistance, I decrease the current (i.e. slow the rate at which the charges can move), thus lowering the "corner frequency" at which the circuit will show noticeable magnitude attenuation. This makes sense, as fc = 1/(2*pi*R*C).

...and, if I increase C, I am increasing the total charge needed to be moved to allow Vc = Vin. More charge to be moved takes more time; this also lowers fc. Again, this makes sense, as fc=1/(2*pi*R*C).
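
A quick sketch of those two observations, using arbitrary example component values:

```python
import math

# Sketch of the two observations above: raising R or raising C lowers the corner
# frequency fc = 1/(2*pi*R*C). The component values are arbitrary examples.
def corner_frequency(R, C):
    return 1.0 / (2 * math.pi * R * C)

R0, C0 = 1e3, 100e-9   # 1 kohm, 100 nF  ->  fc of roughly 1.6 kHz
print(f"baseline   : fc = {corner_frequency(R0, C0):8.1f} Hz")
print(f"R doubled  : fc = {corner_frequency(2 * R0, C0):8.1f} Hz")
print(f"C doubled  : fc = {corner_frequency(R0, 2 * C0):8.1f} Hz")
```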

In addition, I'm thinking that this "nonzero charge move time effect" is also what causes the phase shift in a capacitor as frequency increases. But then why would the phase mismatch stop at -90 degrees (at approximately 10*fc), and not just keep decreasing like the gain does?

What do you think? Am I getting close to the truth?

Thanks much!
 
caps

The phase of a capacitor does not change; it is 90 degrees (the voltage lagging the current), presuming it is an ideal cap.
It's hard to see what a cap does without recognizing that there exists an electric field between the plates and, in the case of AC current, a magnetic field as well.
In Maxwell's approach there is an AC current between the plates, and the current is said to travel in circular fashion; he called this 'displacement current', which for a physical medium can be envisaged as a motion of polar molecules, but in vacuo is electromagnetic energy.
Some of this energy escapes a capacitor as radiation, which gives it a loss factor that in practice is non-zero even if no heating effects are present. It was this (in his theory) which led to the concept of light and radio waves and the invariant velocity of light: ALL FROM A HUMBLE CAP.
This is entirely different from the static case, DC, in which only the static electric field between the plates matters, but you have to remember that a charged capacitor got that way by the movement of electrons, so even there a magnetic field cannot be neglected.
It's simply an observed fact that capacitors and inductors have voltage-current relationships which involve time, i.e.
I = C(dv/dt) and V = L(di/dt)
When applied to sinusoidal stimuli this yields the phase relations, which are fixed; however, the 'impedance' or magnitude relation between current and voltage is not fixed unless the frequency of the stimulus is also defined, so
Z(c) = 1/(jwC) and Z(L) = jwL, where w = 2*pi*f (f being the frequency) and j = sqrt(-1) defines the phase.
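
As a small numerical restatement (with an assumed C = 1 uF): the phase of Z(c) stays pinned at -90 degrees at every frequency, while only its magnitude changes.

```python
import cmath, math

# Numerical restatement of the fixed-phase, frequency-dependent-magnitude point:
# Z(c) = 1/(j*w*C) always has a phase of -90 degrees, while its magnitude falls
# as 1/w. C = 1 uF is an assumed example value.
C = 1e-6

for f in (10, 1_000, 100_000):   # hertz
    w = 2 * math.pi * f
    Zc = 1 / (1j * w * C)
    print(f"f = {f:>7} Hz: |Zc| = {abs(Zc):10.2f} ohm, "
          f"phase = {math.degrees(cmath.phase(Zc)):.1f} deg")
```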
Hope this helps. Ray.
 
PMASwork said:
In addition, I'm thinking that this "nonzero charge move time effect" is also what causes the phase shift in a capacitor as frequency increases. But then why would the phase mismatch stop at -90 degrees (at approximately 10*fc), and not just keep decreasing like the gain does?


When I said "...causes the phase shift in the capacitor as frequency increases..." I should have said "...causes the increasing phase lag in the output voltage (as compared to the input voltage) as frequency increases."

Sorry if that caused any confusion!

I still am unsure why this stops at -90 degrees. :confused: Thanks again!
 
For the filter you quoted, as f → infinity the input impedance → R, so the input current (≈ V/R) has zero phase with respect to the input voltage. But that current flows through Z(c), which gives a capacitor voltage 90 degrees behind the current (and hence behind the input voltage). It cannot go further than that: the 90 degrees represents only what the cap is doing with respect to the current.
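
A quick check of that statement (with assumed values R = 1 kohm, C = 1 uF) using the divider ratio Vout/Vin = 1/(1 + jwRC):

```python
import cmath, math

# Sketch of the point above: for the RC low-pass, Vout/Vin = 1/(1 + j*w*R*C).
# Its phase heads toward -90 degrees as frequency rises but can never pass it,
# because the capacitor voltage is always exactly 90 degrees behind the current.
# R = 1 kohm and C = 1 uF are assumed example values (fc ~ 159 Hz).
R, C = 1e3, 1e-6
fc = 1 / (2 * math.pi * R * C)

for multiple in (0.1, 1, 10, 100, 1000):
    f = multiple * fc
    H = 1 / (1 + 1j * 2 * math.pi * f * R * C)
    print(f"f = {multiple:>6g} * fc: phase(Vout/Vin) = {math.degrees(cmath.phase(H)):7.2f} deg")
```

The gain keeps falling indefinitely, but the phase only approaches -90 degrees asymptotically, because the capacitor voltage can never lag the circuit current by more than 90 degrees.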
 