
Help - Amplifier Frequency Voltage relationship

  1. Feb 7, 2005 #1
    Can someone help me by explaining why the output voltage of an amplifier drops off when the frequency is above or below its bandwidth?
  3. Feb 8, 2005 #2

    The easiest explanation is that the components of the amplifier have capacitance, and that capacitance takes time to charge and discharge. Essentially, the amplifier cannot swing its output fast enough to keep up with a high-frequency input.

    - Warren
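    The "can't swing fast enough" idea can be sketched numerically: if an amplifier's output can change at most SR volts per second (its slew rate), the largest sine amplitude it can reproduce cleanly at frequency f is SR / (2πf). The slew-rate value below is purely illustrative, not from the thread.

    ```python
    import math

    SR = 0.5e6  # illustrative slew rate: 0.5 V/us

    def max_amplitude(f, slew_rate=SR):
        """Largest sine amplitude reproducible at frequency f for a given slew rate.

        A sine V*sin(2*pi*f*t) has a maximum slope of V*2*pi*f, so the
        amplitude is limited to slew_rate / (2*pi*f).
        """
        return slew_rate / (2 * math.pi * f)

    for f in [1e3, 10e3, 100e3, 1e6]:
        print(f"{f:>9.0f} Hz  max clean swing = {max_amplitude(f):.3f} V")
    ```

    The available output swing falls in inverse proportion to frequency, which is one way capacitive charging limits show up at the top of the band.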
  4. Feb 8, 2005 #3
    The in-depth explanation is that if you do an impedance analysis of the amplifier and obtain its frequency response plot, you have the answer to your question. Amplifiers always have some capacitance, and capacitive impedance drops as the frequency of the input signal increases. (NB: Z = 1/(jwC), where w = angular frequency of the signal.) So the gain shown in the frequency response plot falls toward zero as w increases. The physical explanation of why the capacitive impedance drops as frequency increases has been given by chroot.
    Last edited: Feb 8, 2005
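    The falling gain described above can be sketched for the simplest case, a single-pole RC low-pass, where |H(f)| = 1/sqrt(1 + (2πfRC)²). The component values here are illustrative, chosen so the cutoff lands near 1 kHz.

    ```python
    import math

    def lowpass_gain(f, r, c):
        """Magnitude of a single-pole RC low-pass: |H| = 1/sqrt(1 + (w*R*C)^2)."""
        w = 2 * math.pi * f
        return 1.0 / math.sqrt(1.0 + (w * r * c) ** 2)

    R = 1e3      # 1 kohm (illustrative)
    C = 159e-9   # ~159 nF, so f_c = 1/(2*pi*R*C) is about 1 kHz
    fc = 1.0 / (2 * math.pi * R * C)

    for f in [10, 100, 1_000, 10_000, 100_000]:
        print(f"{f:>7} Hz  gain = {lowpass_gain(f, R, C):.3f}")
    ```

    At the cutoff frequency the gain is 1/√2 (the −3 dB point), and a decade above it the gain has dropped by roughly a factor of ten, exactly the roll-off the frequency response plot shows.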
  5. Feb 8, 2005 #4
    Don't forget inductance. It plays a large part in some amplifiers.
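    Combining both effects answers the "below the bandwidth" half of the original question: a series coupling capacitor acts as a high-pass (blocking low frequencies), while stray capacitance at the output acts as a low-pass, giving a band-pass response overall. The cutoff frequencies below are illustrative assumptions, not values from the thread.

    ```python
    import math

    def bandpass_gain(f, f_lo=20.0, f_hi=20_000.0):
        """Magnitude of a first-order band-pass built from a high-pass
        (coupling capacitor, cutoff f_lo) cascaded with a low-pass
        (stray capacitance, cutoff f_hi)."""
        hp = (f / f_lo) / math.sqrt(1 + (f / f_lo) ** 2)
        lp = 1 / math.sqrt(1 + (f / f_hi) ** 2)
        return hp * lp

    for f in [2, 20, 1_000, 20_000, 200_000]:
        print(f"{f:>7} Hz  gain = {bandpass_gain(f):.3f}")
    ```

    Gain is close to 1 in the middle of the band and falls off a decade outside either cutoff, which is the behaviour the original poster asked about.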
  6. Feb 8, 2005 #5

    It was clear that impedance is the main contributing factor affecting the output voltage, but I was unsure about the relationship between frequency and impedance. Got it now!