Low Energy X-Rays for Mammograms: Why It's Necessary

In summary, the radiobiological data indicate that the relatively low energy x-rays used for mammography may be more effective at causing mutations in cells than higher energy x-rays. Low energies are nevertheless required because they provide the contrast needed to detect microcalcifications in breast tissue, and for most women that diagnostic benefit outweighs the increased radiation risk.
  • #1
Saxby
Question:
Explain why it is necessary to use relatively low energy X-rays to produce an image of the breast.

My answer:
The x-ray energy levels currently used during mammograms carry approximately a one in a million chance of inducing cancer, which means that for roughly every two hundred cancers found, one is caused by the screening itself. This risk level is generally seen as acceptable. Increasing the energy of the x-rays used would increase the risk of inducing cancer, so low energy levels are used to avoid posing a threat to the patient.
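As a quick sanity check, the two figures quoted in this answer can be combined with simple arithmetic. Note that the 1-in-a-million and 1-in-200 rates are the poster's assumptions, not verified data:

```python
# Illustrative arithmetic using the figures quoted in the answer above
# (both rates are assumptions from the post, not verified data).

p_induce = 1e-6              # assumed chance one mammogram induces a cancer
induced_per_found = 1 / 200  # assumed: one induced cancer per 200 detected

# Implied detection rate per screening exam under these assumptions:
p_detect = p_induce / induced_per_found
print(p_detect)  # roughly 2e-4, i.e. about 1 cancer found per 5000 exams
```

If either assumed rate changes, the implied detection rate scales accordingly, which is why the answer's risk/benefit framing depends entirely on those two inputs.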

Does this answer seem OK to everyone? If you think there's anything I have missed or should add, let me know :)
 
  • #2
I don't know what the point of your thread is, but here is a recent study:

Controversy exists regarding the biological effectiveness of low energy x-rays used for mammography breast screening. Recent radiobiology studies have provided compelling evidence that these low energy x-rays may be 4.42 +/- 2.02 times more effective in causing mutational damage than higher energy x-rays. These data include a study involving in vitro irradiation of a human cell line using a mammography x-ray source and a high energy source which matches the spectrum of radiation observed in survivors from the Hiroshima atomic bomb. Current radiation risk estimates rely heavily on data from the atomic bomb survivors, and a direct comparison between the diagnostic energies used in the UK breast screening programme and those used for risk estimates can now be made. Evidence highlighting the increase in relative biological effectiveness (RBE) of mammography x-rays to a range of x-ray energies implies that the risks of radiation-induced breast cancers for mammography x-rays are potentially underestimated by a factor of four. A pooled analysis of three measurements gives a maximal RBE (for malignant transformation of human cells in vitro) of 4.02 +/- 0.72 for 29 kVp (peak accelerating voltage) x-rays compared to high energy electrons and higher energy x-rays. For the majority of women in the UK NHS breast screening programme, it is shown that the benefit safely exceeds the risk of possible cancer induction even when this higher biological effectiveness factor is applied. The risk/benefit analysis, however, implies the need for caution for women screened under the age of 50, and particularly for those with a family history (and therefore a likely genetic susceptibility) of breast cancer. In vitro radiobiological data are generally acquired at high doses, and there are different extrapolation mechanisms to the low doses seen clinically. Recent low dose in vitro data have indicated a potential suppressive effect at very low dose rates and doses. 
Whilst mammography is a low dose exposure, it is not a low dose rate examination, and protraction of dose should not be confused with fractionation. Although there is potential for a suppressive effect at low doses, recent epidemiological data, and several international radiation risk assessments, continue to promote the linear no-threshold (LNT) model. Finally, recent studies have shown that magnetic resonance imaging (MRI) is more sensitive than mammography in detecting invasive breast cancer in women with a genetic sensitivity. Since an increase in the risk associated with mammographic screening would blur the justification of exposure for this high risk subgroup, the use of other (non-ionising) screening modalities is preferable.

http://www.ncbi.nlm.nih.gov/pubmed/19454801
 
  • #3
I think you're missing the point of the question.

In mammography contrast is key. You're trying to detect microcalcifications which are indicators of cancer, and these are very small in dimension.

Contrast arises from differences in linear attenuation coefficients between materials, which in turn arises from differences in the interaction cross sections. The photoelectric cross section varies with the inverse cube of photon energy (approximately), and directly with the cube of the material's effective atomic number. This is the dominant process at most x-ray imaging energies.

By using a relatively low energy for mammography, the very small microcalcifications can be detected because their photoelectric cross section (calcium has a higher effective atomic number than soft tissue) differs strongly from that of the surrounding tissue. As energy increases, these differences diminish and you get less contrast.

You can't go too low in energy, of course, because everything would be attenuated too much and you would lose too much signal relative to the noise.
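The Z³/E³ scaling described above can be illustrated with a toy calculation. The effective atomic numbers and the bare power law below are rough rule-of-thumb assumptions, not real attenuation data, but they show why the contrast between a calcification and soft tissue collapses as photon energy rises:

```python
# Toy illustration (not real attenuation data): the photoelectric cross
# section scales roughly as Z^3 / E^3, so the *difference* between a
# microcalcification (calcium-like, Z_eff ~ 20) and soft tissue
# (Z_eff ~ 7.4) shrinks rapidly as photon energy increases.

def photoelectric(z_eff, energy_kev, k=1.0):
    """Relative photoelectric cross section under the Z^3/E^3 rule of thumb."""
    return k * z_eff**3 / energy_kev**3

for e in (20, 50, 100):  # photon energies in keV
    calc = photoelectric(20.0, e)   # microcalcification
    tissue = photoelectric(7.4, e)  # surrounding soft tissue
    # The difference (the contrast signal) falls off as 1/E^3.
    print(f"{e:>3} keV: contrast signal = {calc - tissue:.2e}")
```

The 1/E³ falloff of the difference is the whole story: halving the photon energy multiplies the photoelectric contrast by roughly eight, which is why mammography sits at the low end of the diagnostic energy range.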

Cancer induction is indeed a concern for screening programs. The RBE of ~4 cited in the study above is actually rather high in my experience, although this depends strongly on its definition and the end-point used (I've done some research in this area), but the fact of the matter is that the probability of inducing cancer for a given dose will increase as energy goes down. So I think you're off track with your current answer.
 
  • #4
Choppy said:
...but the fact of the matter is that the probability of inducing cancer for a given dose will increase as energy goes down.
That is surprising. Can you explain why that is?
 
  • #5
russ_watters said:
That is surprising. Can you explain why that is?

Short version: LET increases with decreasing energy.

Longer version:
We're comparing equivalent doses - so the same amount of energy per unit mass is deposited. What changes then are patterns of energy deposition in relation to a target - usually assumed to be the cell's DNA.

As electrons slow down in media they tend to deposit the majority of their energy towards the end of the track - leading to the "Bragg peak." (Note that I'm talking in terms of track length - electrons scatter all over the place, so you don't see a Bragg peak in a depth-dose curve from an electron beam as you would from heavier charged particles such as protons.) LET - linear energy transfer - refers to the energy that's deposited locally, in the immediate vicinity (about one micron in water) of the electron's track. As an electron slows, the energy deposited locally increases, so you get ionization events that are closer together.

DNA is a "double helix" molecule. Cells are generally pretty good at repairing single strand breaks in the DNA, because when the other side is intact it serves as a template for the broken one. But when both strands are broken, the repair is a lot more difficult. In that case the repair process can lead to deletions or other errors in the chain. Those can then go on to cause cancer down the road.

So the idea is that the double strand breaks that lead to cancer are more likely when you have an increased density of ionization events along a single track of radiation.
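The trend described in this post can be sketched numerically. The sketch below assumes a crude Bethe-like form for electron collision stopping power, LET ∝ ln(E/I)/E, with a mean excitation energy I ≈ 0.075 keV for water; the constants are arbitrary and only the direction of the trend is meaningful:

```python
import math

# Toy sketch of the trend above: for non-relativistic electrons, collision
# stopping power (and hence LET) scales roughly as 1/E up to a slowly
# varying logarithmic factor (a crude Bethe-like form).  I ~ 0.075 keV is
# the approximate mean excitation energy of water; units are arbitrary.

def relative_let(energy_kev, i_kev=0.075):
    """Relative LET under the toy ln(E/I)/E scaling (arbitrary units)."""
    return math.log(energy_kev / i_kev) / energy_kev

for e in (100, 50, 20, 10):  # electron kinetic energies in keV
    # LET rises as the electron slows down, so ionizations cluster
    # more densely near the end of the track.
    print(f"{e:>3} keV electron: relative LET ~ {relative_let(e):.3f}")
```

The 1/E factor dominates over the logarithm in this energy range, which is the quantitative version of "ionization events get closer together as the electron slows," and hence why the double strand break probability climbs at low energies.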
 

1. Why are low-energy X-rays necessary for mammograms?

Low-energy X-rays are necessary for mammograms because they provide much better contrast between soft tissue and the small microcalcifications that can indicate cancer. At low photon energies the photoelectric effect dominates, and its strong dependence on atomic number and photon energy amplifies the differences between tissue types, making abnormalities or signs of breast cancer easier to detect.

2. How do low-energy X-rays differ from regular X-rays?

Low-energy X-rays have a lower energy level than regular X-rays, meaning they have a longer wavelength and are less penetrating. This makes them well suited to mammograms, where the goal is contrast between soft tissue types rather than penetration of dense material such as bone.

3. Are there any risks associated with using low-energy X-rays for mammograms?

The risks associated with using low-energy X-rays for mammograms are minimal. The amount of radiation exposure is very low and is considered safe for routine use. However, as with any procedure involving ionising radiation, there is a small risk of harm, but the benefits of early detection of breast cancer far outweigh it.

4. How often should a woman get a mammogram using low-energy X-rays?

The frequency of mammograms using low-energy X-rays varies depending on a woman's age and risk factors for breast cancer. The American Cancer Society recommends that women aged 45-54 should get a mammogram every year, while women aged 55 and older can switch to every two years if they choose. Women at higher risk may need to get mammograms more frequently.

5. Are there any alternatives to using low-energy X-rays for mammograms?

While low-energy X-rays are the most commonly used method for mammograms, there are alternative imaging techniques that can be used. These include ultrasound and MRI, which may be recommended for women with dense breast tissue or for further evaluation if a potential abnormality is detected on a mammogram. However, low-energy X-rays are still considered the gold standard for routine breast cancer screening.
