X-ray entrance surface dose and effective dose

  1. Nov 18, 2015 #1
    I have recently had a series of lectures on X-ray physics. I have been quite confused by the concept of effective dose and entrance surface dose.

    I have been told that entrance surface dose varies proportionally to kV squared. I have also been told that as kV increases, effective dose decreases.

    My confusion arises because if surface dose increases, it would seem that effective dose decreases. This is very counter intuitive so I was wondering if someone could let me know if I have misunderstood something?
  3. Nov 18, 2015 #2


    Science Advisor
    Education Advisor

    First let's clarify terms to make sure that we're on the same page. Entrance surface dose doesn't always have a concrete definition, but typically it means the physically absorbed dose to the first ~ 1-2 mm of the patient or a water phantom for a single x-ray field. (A term like "skin dose" has a more precise definition as the dose to the first 70 microns down from the surface.)

    Effective dose is a term commonly used in radiation protection. It translates a physical absorbed dose (in Gy) from radiation of a given quality or type to a given tissue or set of tissues into a relative "whole body dose," for the purpose of estimating the consequences of an exposure, such as the increase in cancer risk. It's calculated by multiplying the absorbed dose D by a radiation weighting factor (w_R), which accounts for the relative biological effectiveness of a given radiation, and then by a tissue weighting factor (w_T), which accounts for the relative sensitivity of the irradiated tissue. For x-rays w_R = 1, regardless of energy.
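    As a rough illustration of that bookkeeping, here is a minimal Python sketch. The function name and the tissue doses are made up for the example; the w_T values quoted in the comments are the ICRP 103 ones, but check the publication for the full table:

```python
# Sketch of the ICRP effective-dose calculation described above.
# The exposure scenario (doses per tissue) is illustrative, not a real exam.
W_R_XRAY = 1.0  # radiation weighting factor; 1 for photons at any energy

def effective_dose(absorbed_doses_gy, tissue_weights):
    """Effective dose in Sv: sum over tissues of w_T * w_R * D_T."""
    return sum(tissue_weights[t] * W_R_XRAY * d
               for t, d in absorbed_doses_gy.items())

# e.g. 2 mGy to lung (w_T = 0.12 in ICRP 103) and 1 mGy to skin (w_T = 0.01)
print(effective_dose({"lung": 2e-3, "skin": 1e-3},
                     {"lung": 0.12, "skin": 0.01}))  # ≈ 2.5e-4 Sv, i.e. 0.25 mSv
```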

    To say that "entrance dose varies proportionally to kV squared" is a little confusing to me, because one needs to define this relative to something. X-ray dose is usually measured in proportion to the dose rate at the surface, and the surface dose is then just a factor of how long you leave the beam on. Perhaps this is to generate an image with an equivalent contrast-to-noise ratio? Higher kV would lead to lower contrast, so more surface dose would be required to reduce the noise. It doesn't strike me as an unreasonable statement, just that a little more information is needed.

    The second statement, "as kV increases, effective dose decreases," is technically incorrect. As pointed out above, w_R = 1 for x-rays, regardless of energy. Hence, "effective dose" would be calculated as the same, regardless of the beam spectrum. What the statement may refer to is something called "biologically effective dose" or BED, which is a little more subtle. There is some evidence that as the mean energy of the secondary electrons produced by the x-rays decreases, quantities such as lineal energy (the energy imparted into a microscopic volume divided by the mean chord length through the volume) increase, leading to increased levels of DNA damage per given macroscopic absorbed dose. This is generally a small effect (~10% or less, depending on the specific end points and kVs used). In radiation protection terms, it's considered a wash.
  4. Nov 19, 2015 #3
    Thanks for your reply. I have two additional questions, if that is ok?

    So I have learned that if kV increases, more x-rays will penetrate through the body, and thus, to achieve the same receptor dose, the entrance surface dose can be decreased. Would I be correct in thinking, therefore, that receptor dose and contrast are not the same?

    Also, the full equation I have been told is: surface dose = (kV^2 * current * time) / (distance from receptor)^2
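    That relation can be sketched numerically. The function and exposure values below are made up for illustration, and this captures the proportionality only, not an absolute dose:

```python
# Illustrative only: relative entrance surface dose from the stated
# proportionality ESD ∝ kV^2 * mA * s / d^2 (no absolute calibration).
def relative_surface_dose(kv, ma, time_s, distance_m):
    return (kv ** 2) * ma * time_s / distance_m ** 2

base = relative_surface_dose(60, 200, 0.1, 1.0)    # hypothetical technique
bumped = relative_surface_dose(80, 200, 0.1, 1.0)  # same mAs, higher kV
print(bumped / base)  # ≈ (80/60)^2 ≈ 1.78, i.e. ~78% more surface dose
```

Note this is the dose for a fixed mAs; in practice the mAs is reduced at higher kV, which is how the overall surface dose can come down.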

    I see. I thought that higher-kV x-rays were more penetrating and so less likely to be absorbed, hence effective dose decreases. I didn't think it had anything to do with secondary electrons. Is this definitely incorrect? My lecturer made this point a few times.

  5. Nov 20, 2015 #4



    Okay, so you're looking at surface dose in relation to a detector or receptor beyond the body. In that case, yes, it makes sense that the required surface dose is going to decrease with kV: higher-energy x-rays are attenuated less, so less surface dose is needed for a given dose level beyond the patient. Receptor dose and contrast are not the same. Contrast is defined as a difference in signal intensity due to the presence of an object with different radiological properties than the medium around it.

    Seems reasonable to me. "kV" can be a little tricky, though, so that part is likely an approximation or a rule of thumb. Remember that kV is really a kilovoltage spectrum defined by a peak voltage (kVp) and a number of other factors such as filtration and anode properties. Often a spectrum is characterized by its half-value layer (HVL). What that means is that I can have two beams that might both be called 60 kVp spectra, but they could have somewhat different penetrating properties if one of them has a 1 mm Al filter and the other doesn't.
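    The half-value-layer idea is easy to sketch: each HVL of material halves the beam intensity, so the transmitted fraction falls off as a power of one half. The thicknesses below are made-up numbers, not real filter specs:

```python
# Transmitted fraction of a narrow beam, characterized by its half-value
# layer. Strictly valid for a fixed spectrum; real beams harden as they
# filter, so treat this as a first approximation.
def transmitted_fraction(thickness_mm, hvl_mm):
    return 0.5 ** (thickness_mm / hvl_mm)

print(transmitted_fraction(2.0, 2.0))  # 0.5  (one HVL halves the beam)
print(transmitted_fraction(4.0, 2.0))  # 0.25 (two HVLs)
```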

    The first part is correct. The second part ("dose decreases" - I'll return to the "effective" part in a moment) needs to specify a decrease in relation to what.

    You should ask your instructor about the ICRP (see ICRP Publication 103 [2007]) definition of "effective dose." My guess is that he or she is using the term in a non-radiation protection context, but anyone who has worked with x-rays should be familiar with this concept.
  6. Nov 20, 2015 #5
    Thanks :)