Low Energy X-Rays for Mammograms: Why It's Necessary

  • Thread starter: Saxby
  • Tags: Cancer, Energy
Discussion Overview

The discussion centers on the necessity of using relatively low energy X-rays in mammograms, exploring the implications for cancer risk and image quality. Participants examine the biological effectiveness of these X-rays, the importance of contrast in detecting microcalcifications, and the potential risks associated with different energy levels.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • One participant notes that low energy X-rays used in mammograms have a low risk of inducing cancer, suggesting that increasing energy levels would raise this risk.
  • Another participant references a study indicating that low energy X-rays may be significantly more effective in causing mutational damage compared to higher energy X-rays, potentially underestimating the risks of radiation-induced breast cancers.
  • A different viewpoint emphasizes the importance of contrast in mammography, explaining that low energy X-rays enhance the detection of microcalcifications due to differences in photoelectric cross sections.
  • One participant challenges the assertion that lower energy increases cancer risk, asking for clarification on why this might be the case.
  • A subsequent reply explains that as energy decreases, the linear energy transfer (LET) increases, leading to more localized energy deposition and a higher likelihood of damaging DNA, which could result in cancer.

Areas of Agreement / Disagreement

Participants express differing views on the relationship between X-ray energy levels and cancer risk, with some asserting that lower energy increases risk while others question this assertion. The discussion remains unresolved regarding the implications of these findings for mammography practices.

Contextual Notes

Participants highlight the complexity of radiation risk assessments and the need for caution, particularly for women under 50 or with a family history of breast cancer. The discussion also touches on the limitations of current risk estimates based on data from atomic bomb survivors.

Saxby
Question:
Explain why it is necessary to use relatively low energy X-rays to produce an image of the breast.

My answer:
Currently, the x-ray energies used in mammograms carry roughly a one-in-a-million chance of inducing cancer; this means that for approximately every two hundred cancers found, one is caused by the screening itself. This risk level is generally seen as acceptable. Increasing the energy of the x-rays used would increase the risk of inducing cancer, so low energy levels are used to avoid a threat to the patient.
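
To make the arithmetic explicit, here is a minimal sketch checking that the two figures quoted above are mutually consistent. The numbers (one-in-a-million induction risk per exam, one induced cancer per two hundred detected) are the poster's figures, not established values.

```python
# Sanity check on the risk figures quoted in the answer above.
# Both numbers are the poster's assumptions, not established values.
induction_risk = 1e-6            # assumed cancer-induction risk per mammogram
induced_per_detected = 1 / 200   # assumed ratio of induced to detected cancers

# Detection rate implied by combining the two figures:
detection_rate = induction_risk / induced_per_detected
print(detection_rate)            # 2e-4, i.e. about 1 cancer found per 5000 exams
```

So the two figures together imply a detection rate of about one cancer per five thousand exams, which is the hidden assumption behind the "one in two hundred" ratio.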

Does this answer seem OK to everyone? If you think there's anything I have missed or should add, let me know :)
 
I don't know what the point of your thread is, but here is a recent study,

Controversy exists regarding the biological effectiveness of low energy x-rays used for mammography breast screening. Recent radiobiology studies have provided compelling evidence that these low energy x-rays may be 4.42 +/- 2.02 times more effective in causing mutational damage than higher energy x-rays. These data include a study involving in vitro irradiation of a human cell line using a mammography x-ray source and a high energy source which matches the spectrum of radiation observed in survivors from the Hiroshima atomic bomb. Current radiation risk estimates rely heavily on data from the atomic bomb survivors, and a direct comparison between the diagnostic energies used in the UK breast screening programme and those used for risk estimates can now be made. Evidence highlighting the increase in relative biological effectiveness (RBE) of mammography x-rays to a range of x-ray energies implies that the risks of radiation-induced breast cancers for mammography x-rays are potentially underestimated by a factor of four. A pooled analysis of three measurements gives a maximal RBE (for malignant transformation of human cells in vitro) of 4.02 +/- 0.72 for 29 kVp (peak accelerating voltage) x-rays compared to high energy electrons and higher energy x-rays. For the majority of women in the UK NHS breast screening programme, it is shown that the benefit safely exceeds the risk of possible cancer induction even when this higher biological effectiveness factor is applied. The risk/benefit analysis, however, implies the need for caution for women screened under the age of 50, and particularly for those with a family history (and therefore a likely genetic susceptibility) of breast cancer. In vitro radiobiological data are generally acquired at high doses, and there are different extrapolation mechanisms to the low doses seen clinically. Recent low dose in vitro data have indicated a potential suppressive effect at very low dose rates and doses. 
Whilst mammography is a low dose exposure, it is not a low dose rate examination, and protraction of dose should not be confused with fractionation. Although there is potential for a suppressive effect at low doses, recent epidemiological data, and several international radiation risk assessments, continue to promote the linear no-threshold (LNT) model. Finally, recent studies have shown that magnetic resonance imaging (MRI) is more sensitive than mammography in detecting invasive breast cancer in women with a genetic sensitivity. Since an increase in the risk associated with mammographic screening would blur the justification of exposure for this high risk subgroup, the use of other (non-ionising) screening modalities is preferable.

http://www.ncbi.nlm.nih.gov/pubmed/19454801
 
I think you're missing the point of the question.

In mammography contrast is key. You're trying to detect microcalcifications which are indicators of cancer, and these are very small in dimension.

Contrast arises from differences in linear attenuation coefficients between materials, which in turn arises from differences in the interaction cross sections. The photoelectric cross section varies with the inverse cube of photon energy (approximately), and directly with the cube of the material's effective atomic number. This is the dominant process at most x-ray imaging energies.

By using a relatively low energy for mammography the very small microcalcifications can be detected because they have differences in their photoelectric cross sections. As energy increases, these differences diminish and you get less contrast.

You can't go too low in energy, of course, because everything would be attenuated too much and you would lose too much signal relative to the noise.
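
The contrast argument above can be sketched numerically. This toy model (my construction, not from the thread) treats the linear attenuation coefficient as a photoelectric term scaling as Z³/E³, as quoted above, plus a Compton term that is roughly insensitive to energy and atomic number, set to 1 in arbitrary units. The effective atomic numbers are illustrative round figures.

```python
def mu_relative(Z, E_keV, compton=1.0):
    """Relative attenuation: photoelectric term (~Z^3/E^3) plus a flat
    Compton term, in arbitrary units. A toy model, not real physics data."""
    return Z**3 / E_keV**3 + compton

Z_CALC = 20.0    # calcium-like microcalcification (assumed effective Z)
Z_TISSUE = 7.4   # soft tissue (commonly quoted effective Z)

# Contrast = difference in attenuation between the two materials.
contrast_low = mu_relative(Z_CALC, 20.0) - mu_relative(Z_TISSUE, 20.0)
contrast_high = mu_relative(Z_CALC, 60.0) - mu_relative(Z_TISSUE, 60.0)
print(f"relative contrast at 20 keV: {contrast_low:.3f}")
print(f"relative contrast at 60 keV: {contrast_high:.3f}")
```

Because only the photoelectric term depends on Z, the contrast between calcification and tissue falls off roughly as 1/E³, dropping by more than an order of magnitude between 20 keV and 60 keV in this sketch.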

Cancer induction is indeed a concern for screening programs. The RBE of ~4 cited in the study above is actually rather high in my experience, although this depends strongly on its definition and the end-point used (I've done some research in this area), but the fact of the matter is that the probability of inducing cancer for a given dose will increase as energy goes down. So I think you're off track with your current answer.
 
Choppy said:
...but the fact of the matter is that the probability of inducing cancer for a given dose will increase as energy goes down.
That is surprising. Can you explain why that is?
 
russ_watters said:
That is surprising. Can you explain why that is?

Short version: LET increases with decreasing energy.

Longer version:
We're comparing equivalent doses - so the same amount of energy per unit mass is deposited. What changes then are patterns of energy deposition in relation to a target - usually assumed to be the cell's DNA.

As electrons slow down in media they tend to deposit the majority of their energy towards the end of the track - leading to the "Bragg peak." (Note that I'm talking in terms of track length - electrons scatter all over the place so you don't see a Bragg peak in a depth-dose curve from an electron beam as you might from other heavier ion radiation.) LET - linear energy transfer - refers to the energy that's deposited locally in the immediate vicinity (about one micron in water) of the electron's track. As an electron slows, the energy deposited locally increases. So you get ionization events that are closer together.

DNA is a "double helix" molecule. Cells are generally pretty good at repairing single strand breaks in the DNA, because when the other side is intact it serves as a template for the broken one. But when both strands are broken, the repair is a lot more difficult. In that case the repair process can lead to deletions or other errors in the chain. Those can then go on to cause cancer down the road.

So the idea is that the double strand breaks that lead to cancer are more likely when you have an increased density of ionization events along a single track of radiation.
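
The trend described above can be illustrated with a toy, non-relativistic Bethe-like model of electron stopping power, S(E) ~ ln(E/I)/E in arbitrary units. This is my simplification, not the thread's own calculation; only the trend, not the scale, is meaningful. I ≈ 75 eV is the commonly quoted mean excitation energy of water.

```python
import math

I_WATER_EV = 75.0  # commonly quoted mean excitation energy of water

def relative_let(E_eV):
    """Toy non-relativistic Bethe-like stopping power, ~ln(E/I)/E,
    in arbitrary units. Shows the trend only; the scale is meaningless."""
    return math.log(E_eV / I_WATER_EV) / E_eV

let_low = relative_let(20e3)    # 20 keV electron
let_high = relative_let(100e3)  # 100 keV electron
print(f"20 keV vs 100 keV LET ratio: {let_low / let_high:.1f}")
```

Even this crude model shows a roughly fourfold increase in local energy deposition as the electron energy drops from 100 keV to 20 keV, which is the mechanism behind the denser ionization tracks described above.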
 
