X-ray entrance surface dose and effective dose


Discussion Overview

The discussion revolves around the concepts of entrance surface dose and effective dose in the context of X-ray physics. Participants explore the relationships between these doses, particularly how they vary with kilovoltage (kV) settings, and the implications for radiation exposure and imaging quality.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants express confusion about the relationship between entrance surface dose and effective dose, particularly regarding how an increase in entrance surface dose could correlate with a decrease in effective dose.
  • One participant clarifies that entrance surface dose typically refers to the absorbed dose at the surface of the patient or a water phantom, while effective dose is a measure used in radiation protection to estimate cancer risk based on absorbed doses in various tissues.
  • There is a suggestion that the statement "entrance dose varies proportionally to kV squared" needs further context, as X-ray dose is often measured relative to the dose rate on the surface.
  • Another participant mentions that as kV increases, more X-rays penetrate the body, potentially allowing for a decrease in entrance surface dose to achieve the same receptor dose, raising questions about the relationship between receptor dose and image contrast.
  • Some participants discuss the equation for surface dose, noting that it may be an approximation and that kV is influenced by various factors such as filtration and anode properties.
  • There is a debate about the accuracy of the statement that effective dose decreases with increasing kV, with some arguing that this is incorrect and may relate to the concept of biologically effective dose (BED) instead.
  • One participant questions the relevance of secondary electrons in this context, indicating a need for clarification on how they relate to effective dose calculations.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the relationships between entrance surface dose, effective dose, and kV. Multiple competing views and interpretations remain, particularly regarding the definitions and implications of these terms in radiation protection and imaging.

Contextual Notes

Limitations include the need for clearer definitions of terms like entrance surface dose and effective dose, as well as the potential variability in X-ray beam characteristics based on factors such as filtration and anode properties. The discussion also highlights the complexity of interpreting dose relationships in the context of radiation exposure and imaging quality.

BobP
I have recently had a series of lectures on X-ray physics. I have been quite confused by the concept of effective dose and entrance surface dose.

I have been told that entrance surface dose varies proportionally to kV squared. I have also been told that as kV increases, effective dose decreases.

My confusion arises because these two statements together seem to say that as kV increases, entrance surface dose increases while effective dose decreases. This is very counterintuitive, so I was wondering if someone could let me know whether I have misunderstood something?
Thanks
 
First let's clarify terms to make sure that we're on the same page. Entrance surface dose doesn't always have a concrete definition, but typically it means the physically absorbed dose to the first ~ 1-2 mm of the patient or a water phantom for a single x-ray field. (A term like "skin dose" has a more precise definition as the dose to the first 70 microns down from the surface.)

Effective dose is a term that's commonly used in radiation protection and is used to translate a physical absorbed dose from a radiation of a given quality or type to a given tissue or set of tissues in Gy, to a relative "whole body dose" for purposes of estimating the consequences of a given exposure for things such as the increase in cancer risk. It's calculated by multiplying the absorbed dose D by a radiation weighting factor (w_R) which accounts for the relative biological effectiveness of a given radiation, and then by a tissue weighting factor (w_T) which accounts for the relative sensitivity of the irradiated tissue. For x-rays w_R = 1, regardless of energy.
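To make the bookkeeping concrete, here is a minimal sketch of that weighted sum. The tissue weighting factors below are from ICRP Publication 103, but the absorbed doses are made-up numbers purely for illustration, not real exam data:

```python
# Tissue weighting factors (a subset, from ICRP Publication 103)
W_T = {"lung": 0.12, "stomach": 0.12, "liver": 0.04,
       "thyroid": 0.04, "skin": 0.01}

def effective_dose(absorbed_dose_gy, w_r=1.0):
    """Effective dose in Sv: E = sum over tissues of w_T * w_R * D_T.
    For x-rays w_R = 1, so the equivalent dose equals the absorbed dose."""
    return sum(W_T[t] * w_r * d for t, d in absorbed_dose_gy.items())

# Hypothetical absorbed doses (Gy) for a single x-ray field:
doses = {"lung": 0.10e-3, "stomach": 0.02e-3, "liver": 0.03e-3,
         "thyroid": 0.01e-3, "skin": 0.50e-3}
print(f"{effective_dose(doses) * 1e3:.4f} mSv")  # 0.0210 mSv
```

Note that because w_R = 1 for x-rays at any energy, changing the beam spectrum doesn't change this calculation for the same set of absorbed tissue doses.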

To say that "entrance dose varies proportionally to kV squared" is a little confusing to me because one needs to define this relative to something. X-ray dose is usually measured in proportion to the dose rate on the surface, and then the surface dose is just a factor of how long you leave your beam on for. Perhaps this is to generate an image with equivalent contrast-to-noise ratio? Higher kV would lead to lower contrast, so more surface dose would be required to reduce the noise. It doesn't strike me as an unreasonable statement, just that a little more information is needed.

The second statement, "as kV increases, effective dose decreases" is technically incorrect. As pointed out above, w_R = 1 for x-rays, regardless of energy. Hence, "effective dose" would be calculated as the same, regardless of the beam spectrum. What the statement may refer to is something called "biologically effective dose" or BED, which is a little more subtle. There is some evidence that as the mean energy of the secondary electrons produced by the x-rays decreases, quantities such as lineal energy (the amount of energy imparted into a microscopic volume over the mean chord length through the volume) increase, leading to increased levels of damage to DNA per given macroscopic absorbed dose. This is generally a small effect (~10% or less depending on specific end points and kVs used). In radiation protection terms, it's considered a wash.
 
Thanks for your reply. I have two additional questions, if that is ok?

Choppy said:
To say that "entrance dose varies proportionally to kV squared" is a little confusing to me because one needs to define this relative to something. X-ray dose is usually measured in proportion to the dose rate on the surface, and then the surface dose is just a factor of how long you leave your beam on for. Perhaps this is to generate an image with equivalent contrast-to-noise ratio? Higher kV would lead to lower contrast, so more surface dose would be required to reduce the noise. It doesn't strike me as an unreasonable statement, just that a little more information is needed.
So I have learned that if kV increases, more X-rays will penetrate through the body, and thus, to achieve the same receptor dose, entrance surface dose can be decreased. Would I be correct in thinking therefore that receptor dose and contrast are not the same?

Also the full equation I have been told is surface dose = (kV^2 * current * time) / (distance from receptor)^2
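Treating that equation as a relative rule of thumb (not an absolute calibration), it can be sketched as follows; the numbers here are arbitrary and only the ratios are meaningful:

```python
def relative_surface_dose(kv, ma, time_s, distance_m):
    """Rule-of-thumb entrance surface dose, proportional to
    kV^2 * mAs / d^2. Relative units only: a real beam also
    depends on filtration, anode material, etc."""
    return (kv ** 2) * ma * time_s / (distance_m ** 2)

# Doubling kV at fixed mAs and distance quadruples the relative dose:
base = relative_surface_dose(60, 100, 0.1, 1.0)
doubled = relative_surface_dose(120, 100, 0.1, 1.0)
print(doubled / base)  # 4.0
```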

Choppy said:
The second statement, "as kV increases, effective dose decreases" is technically incorrect. As pointed out above, w_R = 1 for x-rays, regardless of energy. Hence, "effective dose" would be calculated as the same, regardless of the beam spectrum. What the statement may refer to is something called "biologically effective dose" or BED, which is a little more subtle. There is some evidence that as the mean energy of the secondary electrons produced by the x-rays decreases, quantities such as lineal energy (the amount of energy imparted into a microscopic volume over the mean chord length through the volume) increase, leading to increased levels of damage to DNA per given macroscopic absorbed dose. This is generally a small effect (~10% or less depending on specific end points and kVs used). In radiation protection terms, it's considered a wash.
I see. I thought that higher kV X-rays were more penetrating and so less likely to be absorbed, hence effective dose decreases. I didn't think it had anything to do with secondary electrons? Is this definitely incorrect? My lecturer made this point a few times.

Thanks
 
BobP said:
So I have learned that if kV increases, more X-rays will penetrate through the body, and thus, to achieve the same receptor dose, entrance surface dose can be decreased. Would I be correct in thinking therefore that receptor dose and contrast are not the same?
Okay, so you're looking at surface dose in relation to a detector or receptor that's beyond the body. In that case, yes, it makes sense that surface dose is going to decrease with kV. Higher energy x-rays will be attenuated less and therefore less surface dose will be required for a given dose level beyond the patient. Receptor dose and contrast are not the same. Contrast is defined as a difference in signal intensity due to the presence of an object with different radiological properties than the medium around it.
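As a toy illustration of that definition (the detector signal values are entirely hypothetical):

```python
def contrast(signal_object, signal_background):
    """Simple relative contrast: fractional signal difference between
    an object and the surrounding background."""
    return abs(signal_background - signal_object) / signal_background

# Hypothetical detector signals: at higher kV the attenuation difference
# between object and background narrows, so contrast drops even when the
# receptor dose is matched.
print(contrast(80.0, 100.0))  # 0.2  (lower kV, larger difference)
print(contrast(90.0, 100.0))  # 0.1  (higher kV, smaller difference)
```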

Also the full equation I have been told is surface dose = (kV^2 * current * time) / (distance from receptor)^2
Seems reasonable to me. "kV" can be a little tricky though, so that part is likely an approximation or a rule of thumb. Remember that kV is really a kilovoltage spectrum defined by a peak voltage (kVp) and a number of other factors such as filtration and anode properties. Often a spectrum is characterized by its half-value layer. What that means is that I can have two beams that might be called 60 kVp spectra, but they could have somewhat different penetrating properties if one of them has a 1 mm Al filter and the other doesn't.

I see. I thought that higher kV X-rays were more penetrating and so less likely to be absorbed, hence effective dose decreases. I didn't think it had anything to do with secondary electrons?
The first part is correct. The second part ("dose decreases" - I'll return to the "effective" part in a moment) has to specify in relation to what.

Is this definitely incorrect? My lecturer made this point a few times.
You should ask your instructor about the ICRP (see ICRP Publication 103 [2007]) definition of "effective dose." My guess is that he or she is using the term in a non-radiation protection context, but anyone who has worked with x-rays should be familiar with this concept.
 

Thanks :)
 
