
Lattice wave dispersion relation

  1. Feb 11, 2015 #1
    Hi. A very quick question. Why is it impossible for a wave to travel on a linear one-atomic chain if its wavelength equals the lattice constant? I.e. the lattice points vibrate with a wavelength equal to the distance between them? Here's what I mean:
    http://www.lcst-cn.org/Solid%20State%20Physics/Ch42.files/image020.gif [Broken]
    http://www.lcst-cn.org/Solid%20State%20Physics/Ch42.html [Broken]

    The dispersion relation says that the "wave" will have zero frequency if the wavelength equals the lattice constant.

    I can see why it must be so mathematically, but I can't understand intuitively why this must happen.
    Last edited by a moderator: May 7, 2017
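One way to see it numerically (a minimal sketch, assuming the standard monatomic-chain dispersion ##\omega(k) = 2\sqrt{C/m}\,|\sin(ka/2)|## and plane-wave displacements ##u_n \propto \cos(kna)##; the spring constant ##C## and mass ##m## are set to arbitrary units here):

```python
import math

a = 1.0                      # lattice constant (arbitrary units)
k = 2 * math.pi / a          # wavevector for wavelength lambda = a

# Dispersion of the 1D monatomic chain: omega(k) = 2*sqrt(C/m)*|sin(k*a/2)|
C_over_m = 1.0               # spring constant over mass (arbitrary units)
omega = 2 * math.sqrt(C_over_m) * abs(math.sin(k * a / 2))
print(f"omega at lambda = a: {omega:.3e}")   # essentially zero

# Displacement pattern u_n = cos(k*n*a): every atom has the same phase,
# so the "wave" is just a rigid translation of the whole chain and the
# springs are never stretched -- no restoring force, hence zero frequency.
displacements = [math.cos(k * n * a) for n in range(5)]
print(displacements)         # all equal to 1.0 (up to rounding)
```

This is the intuition behind the math: a wavelength of exactly one lattice constant displaces every atom identically, which is indistinguishable from the uniform-translation mode at k = 0.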
  3. Feb 12, 2015 #2


    Science Advisor

    The waves will fulfill the Bragg equation for reflection. Hence you get a superposition of left- and right-travelling waves (two superpositions, to be precise): sin(kx) and cos(kx). One of the two has its maximum (of the squared function) at the ionic cores, the other between the cores, so the first one is energetically lower than the second. That's the band gap. It also means that there are no energy eigenstates corresponding to travelling solutions ##(\cos(kx)\pm i \sin(kx))\exp(i\omega t)##.
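A quick numerical check of that claim (a sketch, assuming the zone-boundary wavevector k = π/a and ionic cores sitting at x = na): cos²(kx) peaks at the cores while sin²(kx) peaks midway between them, so the two standing waves feel different potential energies.

```python
import math

a = 1.0
k = math.pi / a              # zone-boundary wavevector

# Probability densities of the two standing waves, sampled at an ionic
# core (x = 0) and at the midpoint between cores (x = a/2).
for label, x in [("at core", 0.0), ("midpoint", 0.5 * a)]:
    print(label,
          "cos^2 =", round(math.cos(k * x) ** 2, 6),
          "sin^2 =", round(math.sin(k * x) ** 2, 6))
# cos^2 is maximal at the cores, sin^2 is maximal between them,
# so the two standing waves differ in energy: that splitting is the gap.
```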
  4. Feb 12, 2015 #3


    Science Advisor

    That is, no travelling-wave eigenstates exist there, as the cos and sin components are not degenerate.
