# Help - Amplifier Frequency Voltage relationship

1. Feb 7, 2005

### Simon M

Can someone help me by describing why the output voltage drops away in an amplifier when the frequency is above or below the bandwidth?

2. Feb 8, 2005

### chroot

Staff Emeritus
The easiest explanation is that the components of the amplifier have capacitance, and that capacitance takes time to charge and discharge. Essentially, the amplifier is not able to swing its output fast enough to keep up with a high-frequency input.

- Warren
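To put a rough number on "fast enough": the charge/discharge time Warren describes is set by the RC time constant, and the corresponding -3 dB corner frequency is f_c = 1/(2*pi*R*C). A minimal sketch, using hypothetical component values (the 1 kOhm and 1 nF below are illustrative, not from any particular amplifier):

```python
import math

# Hypothetical values for illustration: an output node driving its
# stray capacitance C through a source resistance R.
R = 1_000  # ohms
C = 1e-9   # farads (1 nF)

def cutoff_frequency(r, c):
    """-3 dB corner of a first-order RC stage: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r * c)

f_c = cutoff_frequency(R, C)
print(f"RC time constant: {R * C:.2e} s")        # 1.00e-06 s
print(f"-3 dB corner:     {f_c / 1e3:.1f} kHz")  # about 159.2 kHz
```

Above that corner, the capacitance simply cannot charge and discharge within one signal period, so the output swing shrinks.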

3. Feb 8, 2005

### cyeokpeng

The in-depth explanation is that if you do an impedance analysis of the amplifier and get its frequency response plot, you have the answer to your question. In amplifiers there is always a capacitance effect, and capacitive impedance drops as the frequency of the input signal increases. (NB: Z=1/jwC, where w=angular frequency of the signal.) Stray capacitance to ground therefore shunts away more of the signal at high frequency, so the gain in the frequency response plot falls toward zero as w increases. At the low end it is the opposite: series coupling capacitors present a rising impedance as w decreases, which is why the gain also drops below the bandwidth. The physical explanation of why the capacitive impedance drops as frequency increases has been given by chroot.
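The two relationships above can be checked numerically. A small sketch (the 1 uF capacitance and 10 kHz corner are hypothetical example values): |Z| = 1/(2*pi*f*C) for the capacitive impedance, and |H| = 1/sqrt(1 + (f/f_c)^2) for the gain magnitude of a first-order low-pass response:

```python
import math

def cap_impedance(f, c):
    """Magnitude of capacitive impedance: |Z| = 1 / (2*pi*f*C)."""
    return 1.0 / (2.0 * math.pi * f * c)

def lowpass_gain(f, f_c):
    """First-order low-pass gain magnitude: |H| = 1 / sqrt(1 + (f/f_c)^2)."""
    return 1.0 / math.sqrt(1.0 + (f / f_c) ** 2)

C = 1e-6  # hypothetical 1 uF capacitance
for f in (100, 1_000, 10_000, 100_000):
    print(f"f = {f:>7} Hz   |Z| = {cap_impedance(f, C):9.2f} ohm")

f_c = 10_000  # hypothetical 10 kHz corner frequency
for f in (1_000, 10_000, 100_000):
    print(f"f = {f:>7} Hz   gain = {lowpass_gain(f, f_c):.3f}")
```

Running it shows |Z| shrinking by a factor of ten per decade of frequency, and the gain sitting near 1 well below the corner, 1/sqrt(2) (about 0.707) at the corner, and rolling off toward zero above it.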

Last edited: Feb 8, 2005
4. Feb 8, 2005

### Averagesupernova

Don't forget inductance. It plays a large part in some amplifiers.

5. Feb 8, 2005

### Simon M

Hey...Thanks.

It was clear that impedance is a main contributing factor that affects the output voltage, but I was unsure about the relationship between frequency and impedance. Got it now!