
Why does sound wave frequency not decrease over distance?

  1. Feb 25, 2004 #1
    I've been reading about the physics of sound. Perhaps I'm just reading an overly simplified version of the truth, but it would seem to me that as the particles of the medium sound is travelling through compress and expand, inevitably bumping into each other, they would lose some kinetic energy in the form of heat and, over distance, eventually become less rapid. Obviously, sound doesn't act like this (or does to a small degree, not easily detectable by humans), but it seems to me that it should. The way I view it, a wave should spread out, becoming less focused and quieter over time, as well as dropping in frequency, because the particles give up kinetic energy by colliding with each other.
     
  3. Feb 25, 2004 #2

    selfAdjoint

    Staff Emeritus
    Gold Member
    Dearly Missed

    The realistic loss of energy you note results in the sound waves becoming less intense, but it doesn't change their frequency. Sound waves, unlike quantum waves, don't have frequency proportional to energy.
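    As a rough illustration of this point, here is a minimal sketch of a damped plane wave (the frequency, sound speed, and attenuation coefficient below are assumed example values, not from the thread): the amplitude factor shrinks with distance, while the frequency in the oscillating term never changes.

    ```python
    import numpy as np

    # Minimal sketch of a damped plane wave; f, c, and alpha are assumed values.
    f = 440.0      # frequency, Hz -- stays fixed as the wave propagates
    c = 343.0      # speed of sound in air, m/s
    alpha = 0.01   # assumed attenuation coefficient, 1/m

    def pressure(x, t, p0=1.0):
        """Acoustic pressure at distance x (m) and time t (s)."""
        # Only the amplitude factor depends on distance; the sine still
        # oscillates f times per second no matter how far the wave travels.
        return p0 * np.exp(-alpha * x) * np.sin(2 * np.pi * f * (t - x / c))

    print(pressure(100.0, 0.25))  # amplitude down by exp(-1); frequency unchanged
    ```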
     
  4. Feb 25, 2004 #3
    The frequency of sound doesn't decrease with distance itself, but it does decrease while the distance to the source is increasing; that is, it undergoes a Doppler shift, the same as light. This is why the sound of a car driving by at high speed drops in pitch as the car passes you.
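    For concreteness, a minimal sketch of the moving-source Doppler formula f' = f c / (c ∓ v); the emitted frequency and car speed below are assumed example values:

    ```python
    # Doppler shift for a source moving past a stationary listener.
    c = 343.0    # speed of sound in air, m/s
    f = 500.0    # assumed emitted frequency, Hz
    v = 30.0     # assumed source speed, m/s (about 108 km/h)

    f_approach = f * c / (c - v)   # source moving toward the listener
    f_recede   = f * c / (c + v)   # source moving away from the listener

    print(f_approach)  # ~547.9 Hz while the car approaches
    print(f_recede)    # ~459.8 Hz after it passes -- the familiar pitch drop
    ```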
     
  5. Feb 25, 2004 #4
    A sound wave traveling through a static medium will decrease in both amplitude and frequency over any given distance from its origin.
    Most often, the amplitude of the sound wave is degraded far more dramatically than its frequency, yet both effects occur.
    A sound wave can be viewed as a physical phenomenon of propagated cyclic compression and expansion (even if just one cycle) in the medium through which it travels.
    As such, the medium's internal resistance to fluctuations from stasis accounts for the decrease in amplitude.
    Similarly, frequency, which can be defined here as the rate in time of the cyclic compression/expansion, is also affected, though often not as severely.
     
  6. Feb 25, 2004 #5
    Whoa, so I was actually right in assuming that frequency decreases over distance. Awesome.
     
  7. Feb 25, 2004 #6

    LURCH

    Science Advisor

    Nice goin', Waste. I think the inaudibility of the change may also be addressed in your original post: if the decrease in frequency is a result of energy being lost to heat generation, then the drop must be minuscule indeed. After all, how much heat does a sound wave usually generate, anyway?
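    To put a rough number on that rhetorical question: acoustic intensities are tiny in absolute terms, so the heat available from absorbing a sound wave is tiny too. A back-of-the-envelope sketch using the standard 10^-12 W/m^2 reference intensity (the 100 dB level is an assumed example):

    ```python
    # Back-of-the-envelope: convert a sound level in dB to intensity in W/m^2.
    I0 = 1e-12          # reference intensity, W/m^2 (roughly the hearing threshold)
    level_db = 100.0    # assumed example: a very loud sound

    intensity = I0 * 10 ** (level_db / 10)
    print(intensity)    # 0.01 W/m^2 -- only ~10 mW crossing each square metre
    ```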
     
  8. Feb 25, 2004 #7
    Thanks, but I'm sure there must be other factors making the frequency drop that I don't even know exist. After all, I'm only a high school student who hasn't even taken general physics yet.
     
  9. Feb 26, 2004 #8

    russ_watters


    Staff: Mentor

    As others have said, though, the change in frequency with distance is pretty much insignificant - especially when compared with the change in amplitude with distance - and I can't think of any case where it isn't ignored. If it were significant, think of the effect it would have on a concert. In marching band, we had to follow the director's hands and ignore what other people were playing, because the time delay could make the music fall apart. Imagine if the pitch of a vertical row of trumpets varied with distance from the audience: people are capable of hearing extremely small differences in frequency between two notes, down to just a couple of hertz.
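    A quick sketch of why a couple of hertz is audible: two notes that close together produce beats at their difference frequency, which the ear picks up easily (the 440/442 Hz pair below is an assumed example):

    ```python
    import numpy as np

    # Two nearly identical notes produce beats at the difference frequency.
    f1, f2 = 440.0, 442.0             # assumed example: two trumpets 2 Hz apart
    t = np.linspace(0.0, 2.0, 88200)  # two seconds sampled at 44.1 kHz
    mix = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    # The envelope of `mix` pulses at |f2 - f1| = 2 Hz -- clearly audible,
    # even though 2 Hz is under 0.5% of the pitch itself.
    print(abs(f2 - f1))
    ```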
     
    Last edited: Feb 26, 2004
  10. Feb 26, 2004 #9

    turin

    Homework Helper

    The spectrum of any sound wave will have a nontrivial width. The intensity of the higher frequencies will drop off before the intensity of the lower frequencies (in a lossy medium without resonance). This results in an overall power decrease, as well as a shift of the peak frequency to a lower value. For instance, the eruptions of Pacific volcanoes have been "heard" (at under 20 Hz) thousands of miles away.
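    A sketch of that mechanism, assuming an attenuation coefficient that grows as the square of frequency (a common model for absorption in fluids; the source spectrum and loss constant below are made-up values): the high-frequency side of a broadband spectrum dies out first, so its peak slides downward with distance.

    ```python
    import numpy as np

    # Frequency-dependent attenuation: alpha ~ f^2 (assumed model), so high
    # frequencies decay faster and the spectral peak shifts toward low f.
    freqs = np.linspace(1.0, 1000.0, 1000)               # Hz
    spectrum0 = np.exp(-((freqs - 300.0) / 150.0) ** 2)  # assumed broadband source
    k = 1e-8                                             # assumed loss constant

    def spectrum_at(x):
        """Spectrum after the wave has travelled a distance x (metres)."""
        return spectrum0 * np.exp(-k * freqs ** 2 * x)

    for x in (0.0, 1e4, 1e5):
        # The peak frequency drops as x grows; only infrasound survives far out.
        print(x, freqs[np.argmax(spectrum_at(x))])
    ```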
     
  11. Jan 13, 2009 #10
    OK, this might sound a little weird, and maybe out of date, but if we consider frequency loss over distance, our whole picture of the universe might change. Even Edwin Hubble warned astronomers that "the possibility that the red-shift may be due to some other cause, connected with the long time or distance involved in the passage of light from the nebula to observer, should not be prematurely neglected". I won't go into details, since that would take at least a few pages, so let's keep it simple this time and talk about basic wave characteristics.
    It is a fact that "the light-waves from distant nebulae seem to grow longer in proportion to the distance they have traveled" ... "it seems likely that red-shifts may not be due to an expanding Universe, and much of the speculation on the structure of the universe may require re-examination" (Hubble lectured in 1947).
    The evidence provided by the Pioneer 10 Doppler data indicates that this effect is not linear but exponential.
    So it just might happen that we need a "new" or extended physics to explain these phenomena.
    My final conclusion is that the effect may be insignificant on a smaller scale (i.e., sound waves), but on a larger scale, for electromagnetic waves, it may be possible to prove, though it will require a new approach (with a lot of math, possibly :) ).
    Anyway, it was a good question...
     
  12. Jan 14, 2009 #11

    LURCH

    Science Advisor

    Wow, this is really blowing the dust off an oldie. Pnrbert, there are several theories that explore this possibility; try doing a search for the "tired light" theory. These theories are not generally accepted in the mainstream, and are even considered "crackpot" by some.
     
  13. Oct 16, 2009 #12
    Hi guys.

    As I am no expert in physics, I have nothing to back up my theory except a hunch.

    Would it not be fair to say that frequency is time-tied, and that if you want a drop in frequency, you would need a drop in time (time going slower)? Interestingly enough, if you move towards or away from the source of a sound (or the source is moving), you will hear the pitch rise and drop, like a police siren passing by.

    This is called the Doppler effect. :smile:
     
  14. Oct 18, 2009 #13
    Yeah, frequency does decrease over distance.
     