How Does Self-Absorption Affect Intensity in a Spherical Gamma-Ray Source?

physfan01
I encountered the following bonus question in a homework assignment, and neither I nor my classmates have a clue how to approach it. If someone has run across this or a similar problem in a text or online, please point me in that direction.
Any help is greatly appreciated.

Given: A gamma-ray source in the shape of a homogeneous sphere of radius R, throughout which there is a homogeneous temperature distribution.

a. Calculate the factor by which the intensity is reduced through self-absorption. Express this as a function of:

1) source radius &
2) source temperature

b. What is the magnitude of these reductions in the case of a ¹⁹⁸Au source (μ = 2.9 cm⁻¹) of radius R = 0.15 cm when the temperature is changed by ΔT = 1000 °C (linear coefficient of expansion α = 17 × 10⁻⁶ per °C)?
 
I'd suggest the ring method, but IIRC the integral gets complicated. I think I did a similar problem 25+ years ago.
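To make the ring method concrete, here is a minimal numerical sketch (Python; the function name and grid sizes are just illustrative, not from the thread). It averages exp(−μs) over emission points (weight r²) and isotropic directions, where s is the distance from the emission point to the sphere's surface. The last line checks it against what I believe is the standard closed-form escape probability for a uniform sphere, P(x) = (3/(4x³))[x² − 1/2 + (x + 1/2)e^(−2x)] with x = μR.

```python
import numpy as np

def escape_probability(mu, R, n=1000):
    """Fraction of gammas that leave a uniformly emitting sphere of
    radius R (attenuation coefficient mu) without being absorbed.
    Midpoint-rule average of exp(-mu*s) over emission points and
    isotropic directions; s is the path length to the surface."""
    r = (np.arange(n) + 0.5) * R / n              # emission radii
    c = -1.0 + (2.0 * np.arange(n) + 1.0) / n     # cos(angle to outward radial)
    rr, cc = np.meshgrid(r, c, indexing="ij")
    # distance from the emission point to the surface along the ray
    s = -rr * cc + np.sqrt(R**2 - rr**2 * (1.0 - cc**2))
    dir_avg = np.exp(-mu * s).mean(axis=1)        # (1/2) integral over d(cos)
    return 3.0 / R**3 * np.sum(r**2 * dir_avg) * (R / n)

x = 2.9 * 0.15                                    # mu*R for the Au-198 source
print(escape_probability(2.9, 0.15))              # ~0.74
# closed-form check: P(x) = 3/(4 x^3) * (x^2 - 1/2 + (x + 1/2) e^(-2x))
print(3 / (4 * x**3) * (x**2 - 0.5 + (x + 0.5) * np.exp(-2 * x)))
```

For x = μR = 2.9 × 0.15 ≈ 0.44, both lines print ≈ 0.74, i.e. roughly a quarter of the emitted photons are absorbed inside the source.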

The intensity is simply a function of the distance d a photon travels through the source and the radius r, which together determine the attenuation.

The thermal expansion will simply change the electron density, though not by much, I think.


ΔT = 1000 °C with a linear expansion coefficient α = 17 × 10⁻⁶ per °C means a linear expansion of about 1.7%, since 1000 × 17 × 10⁻⁶ = 0.017.
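To put a number on that, here is a short sketch under two assumptions (mine, not stated in the thread): the total activity is unchanged, and μ scales with the density. Heating by ΔT stretches R by k = 1 + αΔT and dilutes the density by k³, so μR shrinks by a factor k², about 3.3% here. Plugging into the closed form from the sketch above:

```python
import numpy as np

def P(x):
    # closed-form escape probability for a uniform sphere, x = mu*R
    return 3 / (4 * x**3) * (x**2 - 0.5 + (x + 0.5) * np.exp(-2 * x))

alpha, dT, mu0, R0 = 17e-6, 1000.0, 2.9, 0.15
k = 1 + alpha * dT             # linear expansion factor, ~1.017
mu1, R1 = mu0 / k**3, R0 * k   # mu scales with density (1/k^3), R with k
print(P(mu1 * R1) / P(mu0 * R0) - 1)   # relative change in escaping fraction
```

This prints about +0.01, i.e. the escaping fraction rises by roughly 1% on heating, since the source becomes slightly more transparent to its own gammas.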
 