Hello,
I'm currently trying to compare theoretical results with an MCNP simulation. I'm using two discrete data sets, intensity (probability) and linear attenuation coefficient, both as functions of energy, to produce an attenuated energy spectrum after X-rays have passed through a thin layer of lead. I've been running through the calculations, and I'm getting a higher average energy after attenuation (~74 keV) than the initial average energy (~33 keV). My guess is I'm doing something wrong somewhere...
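For reference, here is a minimal sketch of how I understand the calculation, applying Beer-Lambert attenuation bin by bin and then taking the intensity-weighted mean energy. The energies, intensities, attenuation coefficients, and thickness below are placeholder values, not real tabulated data, so only the structure of the calculation matters:

```python
import numpy as np

# Placeholder discrete data (NOT real values): photon energies (keV),
# relative intensities (probabilities), and linear attenuation
# coefficients mu(E) of the absorber (1/cm) at those energies.
energies_keV = np.array([20.0, 30.0, 40.0, 60.0, 80.0, 100.0])
intensity    = np.array([0.10, 0.30, 0.25, 0.20, 0.10, 0.05])
mu_per_cm    = np.array([80.0, 30.0, 14.0,  5.0,  2.5,  6.0])

thickness_cm = 0.01  # assumed thin lead layer (0.1 mm)

# Beer-Lambert attenuation applied to each energy bin:
# I_out(E) = I_in(E) * exp(-mu(E) * x)
attenuated = intensity * np.exp(-mu_per_cm * thickness_cm)

# Intensity-weighted mean energy before and after the absorber.
mean_in  = np.sum(energies_keV * intensity)  / np.sum(intensity)
mean_out = np.sum(energies_keV * attenuated) / np.sum(attenuated)

print(f"Mean energy before: {mean_in:.1f} keV")
print(f"Mean energy after:  {mean_out:.1f} keV")
```

Note that the attenuated mean is normalised by the sum of the *attenuated* intensities, not the original ones; that's the step I'd most like someone to check against how MCNP tallies the average energy.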