
Wavelength and penetration.

  1. Oct 30, 2009 #1



    Why does penetration increase with decrease in wavelength?
  3. Oct 30, 2009 #2
    For a wavelength of EM radiation to be absorbed, it must correspond to a transition between energy levels in an atom. Consider the energy levels of the hydrogen atom: http://en.wikipedia.org/wiki/Hydrogen_atom#Energy_levels

    At longer wavelengths, there is no corresponding energy level for the EM radiation, thus the wave passes through the body.

    There is a nice chart at www.hyperphysics.com
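    To make the energy-level argument concrete, here is a small sketch (my own illustration, not from the posts above) using the Bohr formula E_n = -13.6 eV / n², which computes the wavelength a hydrogen atom can absorb for a given transition:

    ```python
    # Sketch: hydrogen transition wavelengths from the Bohr formula
    # E_n = -13.6 eV / n^2. A photon is absorbed only if its energy
    # matches the difference between two levels.

    H_RYDBERG_EV = 13.6    # hydrogen ground-state binding energy, eV
    HC_EV_NM = 1239.84     # h*c in eV*nm, so lambda = hc / E

    def level_energy_ev(n):
        """Energy of hydrogen level n (negative, bound state)."""
        return -H_RYDBERG_EV / n**2

    def transition_wavelength_nm(n_low, n_high):
        """Wavelength of the photon absorbed in the n_low -> n_high transition."""
        d_e = level_energy_ev(n_high) - level_energy_ev(n_low)
        return HC_EV_NM / d_e

    # Lyman-alpha (1 -> 2): about 121.5 nm, ultraviolet.
    print(transition_wavelength_nm(1, 2))
    # A 1000 nm (infrared) photon carries only ~1.24 eV: no hydrogen
    # transition from the ground state matches it, so it passes through.
    ```

    The same logic explains the post's point: a long-wavelength (low-energy) photon has no matching level difference, so the atom cannot absorb it.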

    Last edited by a moderator: Apr 24, 2017
  4. Nov 1, 2009 #3
    I presume you mean photons? In the x-ray region, x-ray penetration increases with decreasing wavelength due to the energy dependence of the photoelectric cross section for ejecting electrons from deeply bound atomic states, e.g., the K-shell.

    cross section ≈ const × Z⁴/(hν)³

    The minimum in the total cross section occurs very roughly between 1 and 2 MeV.
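    The Z⁴/(hν)³ scaling above can be sketched numerically (constants omitted, so only ratios are meaningful; the Z and energy values below are my own illustrative choices):

    ```python
    # Sketch of the photoelectric scaling sigma ~ Z^4 / (h*nu)^3.
    # Constants are dropped, so only relative values are meaningful.

    def relative_photo_cross_section(z, photon_energy_kev):
        """Relative photoelectric cross section (arbitrary units)."""
        return z**4 / photon_energy_kev**3

    # Doubling the photon energy cuts the cross section by a factor of 8,
    # so penetration increases sharply as the wavelength decreases.
    ratio = (relative_photo_cross_section(29, 50)
             / relative_photo_cross_section(29, 100))
    print(ratio)  # 8.0
    ```

    This is why halving the x-ray wavelength (doubling hν) makes a material roughly eight times more transparent in the photoelectric-dominated regime.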

    Bob S
    Last edited: Nov 1, 2009
  5. Nov 1, 2009 #4



    This is not strictly correct. If the absorption of radiation were strictly due to atomic energy levels, then absorption would occur only in very narrow bands, which is contrary to common experience (the black-body radiator, for example, or microwave heating). The phonons in a material absorb radiation over a much larger bandwidth than the atoms alone.

    As Bob S stated, the penetration of high-energy waves does increase as the wavelength shortens. But the absorption properties of a material vary greatly over the frequency range. In general, I would only say that high-energy waves pass through most objects largely unimpeded. Below x-rays, the absorption becomes widely varying and material dependent. For example, water is transparent in the visible region (for most purposes, let's say) but is much more highly absorptive in the infrared and microwave regions. So for water, over that bandwidth, the absorption decreases as the wavelength decreases.

    However, let's say we have a material with a constant conductivity over a given bandwidth. In that case, the absorption of the radiation increases as the wavelength decreases. This is because the loss in the material depends exponentially on the penetration depth measured in wavelengths. So, keeping the material's thickness fixed, the electromagnetic wave sees a depth spanning more and more wavelengths as its own wavelength decreases, and we would expect the attenuation to increase with frequency.

    But again, most materials do not have a consistent effective conductivity across the spectrum, so we can only characterize small regions of the spectrum in this manner. There are also other effects that cause deviations, like the plasma-like response of metals. A good conductor will not allow radiation to pass through, but above a high enough frequency (the plasma frequency), the conduction electrons can no longer oscillate fast enough to cancel out the incident wave, and the radiation passes through.
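    The plasma-frequency argument can be put in numbers with ω_p = √(ne²/(ε₀m)) (the electron density below is an assumed, roughly aluminum-like value):

    ```python
    import math

    # Sketch: above the plasma frequency omega_p = sqrt(n e^2 / (eps0 m))
    # the conduction electrons cannot screen the wave and the metal
    # becomes transparent.

    E_CHARGE = 1.602e-19   # electron charge, C
    E_MASS = 9.109e-31     # electron mass, kg
    EPS0 = 8.854e-12       # vacuum permittivity, F/m

    def plasma_freq_hz(n_electrons_per_m3):
        """Plasma frequency (in Hz) for a given free-electron density."""
        omega_p = math.sqrt(n_electrons_per_m3 * E_CHARGE**2
                            / (EPS0 * E_MASS))
        return omega_p / (2 * math.pi)

    n_metal = 1.8e29   # free electrons per m^3 (assumed, roughly aluminum)
    f_p = plasma_freq_hz(n_metal)
    print(f_p)   # ~4e15 Hz, in the ultraviolet
    # Visible light (~5e14 Hz) sits below f_p and is reflected;
    # far-UV and x-rays above f_p can pass through.
    ```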
    Last edited by a moderator: Apr 24, 2017