
Wave dispersion and the bandwidth theorem

  1. Apr 14, 2015 #1
    1. The problem statement, all variables and given/known data
    Consider a propagating wave packet with initial length L0.
    Use the bandwidth theorem to show that the minimum range of angular frequencies present in the wave packet is approximately:
    \begin{equation}
    \Delta \omega = \frac{v_{g}}{L_{0}}
    \end{equation}
    where [itex]v_{g}[/itex] is the group velocity.
    2. Relevant equations
    The dispersion relationship for the wave is:
    \begin{equation}
    \omega ^{2} = gk
    \end{equation}

    3. The attempt at a solution
    My attempt is attached as a photo, along with the original problem sheet. For some reason I get the answer as:
    \begin{equation}
    \Delta \omega = 2\pi \frac{v_{g}}{L_{0}}
    \end{equation}
    See the method in the attachment.
     

    Attached Files:

    Last edited: Apr 14, 2015
  3. Apr 15, 2015 #2

    Delta²

    Gold Member

    Yes, I agree with your result. But how exactly were you taught the bandwidth theorem? I only know it as [itex]\Delta k\,\Delta x\approx 2\pi[/itex], where [itex]\Delta k[/itex] and [itex]\Delta x[/itex] are defined properly.
     
    Last edited: Apr 15, 2015
  4. Apr 15, 2015 #3
    Well, yeah, we defined it as \begin{equation} \Delta k \Delta x = 2\pi \end{equation} and then the rest can be derived from there.
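    For anyone following along, here is a sketch of the derivation under that convention (this is not the OP's attached working, just the standard steps, showing where the factor of ##2\pi## comes from):

    ```latex
    % Bandwidth theorem with the convention \Delta k \, \Delta x \approx 2\pi.
    % Take the packet's spatial extent as \Delta x = L_0:
    \Delta k \approx \frac{2\pi}{L_0}
    % The group velocity v_g = d\omega/dk relates the spread in k
    % to the spread in angular frequency:
    \Delta \omega \approx \frac{d\omega}{dk}\,\Delta k
                 = v_g \,\Delta k
                 \approx \frac{2\pi v_g}{L_0}
    % With the alternative order-of-magnitude convention
    % \Delta k \, \Delta x \approx 1, the same steps instead give
    % the form quoted in the problem statement:
    \Delta \omega \approx \frac{v_g}{L_0}
    ```

    So the OP's extra ##2\pi## is just a matter of which convention for the bandwidth theorem the problem setter had in mind; the physics is the same either way.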
     