
Low energy mammogram

Saxby
#1
Feb12-14, 04:10 PM
P: 45
Question:
Explain why it is necessary to use relatively low energy X-rays to produce an image of the breast.

My answer:
Currently, the x-ray energies used during mammograms carry approximately a one-in-a-million chance of inducing cancer; this means that for roughly every two hundred cancers found, one is caused by the screening itself. This level of risk is generally seen as acceptable. Increasing the energy of the x-rays used would increase the risk of inducing cancer, so low energies are used to minimise the threat to the patient.
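As a quick sanity check on those quoted figures (illustrative arithmetic only, using the numbers stated above):

```python
# Illustrative check of the figures quoted in the answer (not measured data):
# if one mammogram carries a ~1-in-a-million induction risk, and roughly one
# cancer is induced per 200 detected, the implied detection rate follows.
induction_risk = 1e-6          # cancers induced per examination (quoted figure)
induced_per_detected = 1 / 200  # quoted ratio of induced to detected cancers

detection_rate = induction_risk / induced_per_detected
print(detection_rate)  # → 0.0002, i.e. about one cancer detected per 5000 exams
```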

Does this answer seem OK to everyone? If you think there's anything I have missed or should add, let me know :)
Evo
#2
Feb12-14, 04:46 PM
Mentor
P: 26,557
I don't know what the point of your thread is, but here is a recent study:

Controversy exists regarding the biological effectiveness of low energy x-rays used for mammography breast screening. Recent radiobiology studies have provided compelling evidence that these low energy x-rays may be 4.42 +/- 2.02 times more effective in causing mutational damage than higher energy x-rays. These data include a study involving in vitro irradiation of a human cell line using a mammography x-ray source and a high energy source which matches the spectrum of radiation observed in survivors from the Hiroshima atomic bomb. Current radiation risk estimates rely heavily on data from the atomic bomb survivors, and a direct comparison between the diagnostic energies used in the UK breast screening programme and those used for risk estimates can now be made. Evidence highlighting the increase in relative biological effectiveness (RBE) of mammography x-rays to a range of x-ray energies implies that the risks of radiation-induced breast cancers for mammography x-rays are potentially underestimated by a factor of four. A pooled analysis of three measurements gives a maximal RBE (for malignant transformation of human cells in vitro) of 4.02 +/- 0.72 for 29 kVp (peak accelerating voltage) x-rays compared to high energy electrons and higher energy x-rays. For the majority of women in the UK NHS breast screening programme, it is shown that the benefit safely exceeds the risk of possible cancer induction even when this higher biological effectiveness factor is applied. The risk/benefit analysis, however, implies the need for caution for women screened under the age of 50, and particularly for those with a family history (and therefore a likely genetic susceptibility) of breast cancer. In vitro radiobiological data are generally acquired at high doses, and there are different extrapolation mechanisms to the low doses seen clinically. Recent low dose in vitro data have indicated a potential suppressive effect at very low dose rates and doses. 
Whilst mammography is a low dose exposure, it is not a low dose rate examination, and protraction of dose should not be confused with fractionation. Although there is potential for a suppressive effect at low doses, recent epidemiological data, and several international radiation risk assessments, continue to promote the linear no-threshold (LNT) model. Finally, recent studies have shown that magnetic resonance imaging (MRI) is more sensitive than mammography in detecting invasive breast cancer in women with a genetic sensitivity. Since an increase in the risk associated with mammographic screening would blur the justification of exposure for this high risk subgroup, the use of other (non-ionising) screening modalities is preferable.
http://www.ncbi.nlm.nih.gov/pubmed/19454801
Choppy
#3
Feb12-14, 06:08 PM
Sci Advisor
P: 2,725
I think you're missing the point of the question.

In mammography contrast is key. You're trying to detect microcalcifications which are indicators of cancer, and these are very small in dimension.

Contrast arises from differences in linear attenuation coefficients between materials, which in turn arise from differences in the interaction cross sections. The photoelectric cross section varies approximately with the inverse cube of photon energy, and directly with the cube of the material's effective atomic number. At mammographic energies, this is the dominant interaction process in tissue.

By using a relatively low energy for mammography, the very small microcalcifications can be detected because their photoelectric cross sections differ substantially from those of the surrounding tissue. As energy increases, these differences diminish and you get less contrast.
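The trend can be sketched with a toy model: take attenuation as a photoelectric term scaling like Z³/E³ plus a roughly Z-independent Compton term. The constants and effective-Z values below are assumed for illustration only (not real NIST cross-section data); only the trend matters.

```python
# Toy attenuation model (assumed constants, illustration only):
# photoelectric term ~ Z_eff**3 / E**3, plus a roughly Z-independent
# Compton term. Not real cross-section data.
def mu(z_eff, energy_kev, k_pe=1.0, k_compton=0.005):
    return k_pe * z_eff**3 / energy_kev**3 + k_compton

Z_TISSUE = 7.4   # approximate effective atomic number of soft tissue
Z_CALC = 15.9    # approximate effective Z of a hydroxyapatite calcification

for e_kev in (20, 40, 80):
    # relative contrast of calcification against tissue background
    contrast = (mu(Z_CALC, e_kev) - mu(Z_TISSUE, e_kev)) / mu(Z_TISSUE, e_kev)
    print(f"{e_kev} keV: relative contrast ~ {contrast:.2f}")
```

In this sketch the contrast at 20 keV is several times larger than at 80 keV, because the Z-dependent photoelectric term shrinks rapidly relative to the Compton term as energy rises.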

You can't go too low in energy, of course, because everything would be attenuated too strongly and you would lose too much signal relative to the noise.

Cancer induction is indeed a concern for screening programs. The RBE of ~4 cited in the study above is rather high in my experience, although this depends strongly on its definition and the end-point used (I've done some research in this area). But the fact of the matter is that the probability of inducing cancer for a given dose will increase as energy goes down. So I think you're off track with your current answer.

russ_watters
#4
Feb12-14, 10:13 PM
Mentor
P: 22,303

Quote Quote by Choppy View Post
...but the fact of the matter is that the probability of inducing cancer for a given dose will increase as energy goes down.
That is surprising. Can you explain why that is?
Choppy
#5
Feb12-14, 11:29 PM
Sci Advisor
P: 2,725
Quote Quote by russ_watters View Post
That is surprising. Can you explain why that is?
Short version: LET increases with decreasing energy.

Longer version:
We're comparing equivalent doses - so the same amount of energy per unit mass is deposited. What changes then are patterns of energy deposition in relation to a target - usually assumed to be the cell's DNA.

As electrons slow down in media they tend to deposit the majority of their energy towards the end of the track, leading to the "Bragg peak." (Note that I'm talking in terms of track length; electrons scatter all over the place, so you don't see a Bragg peak in a depth-dose curve from an electron beam as you would from protons or heavier ions.) LET, linear energy transfer, refers to the energy that's deposited locally in the immediate vicinity (about one micron in water) of the electron's track. As an electron slows, the energy deposited locally increases, so you get ionization events that are closer together.

DNA is a "double helix" molecule. Cells are generally pretty good at repairing single strand breaks in the DNA, because when the other side is intact it serves as a template for the broken one. But when both strands are broken, the repair is a lot more difficult. In that case the repair process can lead to deletions or other errors in the chain. Those can then go on to cause cancer down the road.

So the idea is that the double strand breaks that lead to cancer are more likely when you have an increased density of ionization events along a single track of radiation.
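A rough back-of-the-envelope estimate makes the point concrete. Assuming a mean energy per ion pair in water of W ≈ 34 eV (an approximate, commonly quoted value) and illustrative LET values for fast versus near-end-of-track electrons, the mean spacing between ionization events is about W/LET:

```python
# Rough illustrative estimate (assumed numbers, not measured data):
# mean spacing between ionization events along a track ~ W / LET,
# to be compared against the ~2 nm diameter of the DNA double helix.
W_EV = 34.0  # approximate mean energy per ion pair in water, eV

def mean_ionization_spacing_nm(let_kev_per_um):
    # 1 keV/um = 1000 eV / 1000 nm = 1 eV/nm, so the conversion factor is 1
    let_ev_per_nm = let_kev_per_um
    return W_EV / let_ev_per_nm

# Illustrative LET values: fast electrons vs slow (end-of-track) electrons.
for label, let in (("fast electron (~0.2 keV/um)", 0.2),
                   ("slow electron (~10 keV/um)", 10.0)):
    print(label, "->", round(mean_ionization_spacing_nm(let), 1), "nm")
```

With these assumed numbers, a fast electron deposits ionizations roughly 170 nm apart, while a slow one deposits them about 3.4 nm apart: comparable to the diameter of the helix, which is why double strand breaks become more likely at low electron energies.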


