
Estimating radiation risk

  1. Dec 2, 2009 #1
    Say I have a radioactive source in my vicinity and I want to measure the exposure dose I'm receiving with a pen dosimeter. I measure it to be 1 mR per hour. However, the meter is tiny, which means it intercepts only a fraction of the radiation I'm receiving, since I have a much larger surface area. Would I have to multiply my 1 mR/hour reading by some correction factor to account for this, or would I also receive just 1 mR per hour of exposure?
     
  3. Dec 2, 2009 #2

    bcrowell

    Staff Emeritus
    Science Advisor
    Gold Member

    Rems and sieverts are basically measuring ionization per kilogram of body mass. If the dosimeter is calibrated in millirem, then it's calibrated according to an estimate of the actual whole-body dose you'd receive. No conversion is necessary.
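    As a rough sketch of the unit relationships involved (assuming a gamma source, for which 1 R of exposure corresponds to roughly 1 rem of dose equivalent, i.e. a radiation weighting factor of 1 — the numbers here are illustrative, not from the post):

    ```python
    # Rough dose-unit conversion for a gamma source (sketch; assumes a
    # radiation weighting factor of 1, so 1 R exposure ~ 1 rad ~ 1 rem).
    def exposure_to_dose_equivalent_mrem(exposure_mR, weighting_factor=1.0):
        """Convert an exposure reading in mR to dose equivalent in mrem."""
        return exposure_mR * weighting_factor

    def mrem_to_mSv(dose_mrem):
        """1 rem = 10 mSv, so 1 mrem = 0.01 mSv."""
        return dose_mrem * 0.01

    reading = 1.0  # mR/hour from the pen dosimeter
    dose_mrem = exposure_to_dose_equivalent_mrem(reading)
    print(f"{reading} mR/hr ~ {dose_mrem} mrem/hr ~ {mrem_to_mSv(dose_mrem)} mSv/hr")
    ```

    The point being: since the dosimeter is calibrated in dose-per-mass units, no surface-area correction enters anywhere.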

    For a dose this size, "risk" is the wrong word. Doses of up to approximately 1 rem per day *improve* your health. Experiments show that low levels of radiation activate cellular damage control mechanisms, increasing the health of the organism. For example, exposure to radiation up to a certain level makes mice grow faster; makes guinea pigs' immune systems function better against diphtheria; increases fertility in female humans, trout, and mice; improves fetal mice's resistance to disease; reduces genetic abnormalities in humans; increases the life-spans of flour beetles and mice; and reduces mortality from cancer in mice and humans.

    So a dose of 1 mrem/hr is a deficiency, sort of like not getting enough vitamin C or iodine in your diet.
     
  4. Dec 2, 2009 #3
    Do you mean millirem per day? Cosmic rays are in the range of 100 to 300 millirem per year.
    Bob S.
     
  5. Dec 3, 2009 #4

    Vanadium 50

    Staff Emeritus
    Science Advisor
    Education Advisor

    I disagree that "1 mrem/hr is a deficiency, sort of like not getting enough vitamin C or iodine in your diet", as, I suspect, would most health physicists.

    There is some evidence that low levels of radiation are beneficial, although these studies are controversial - at these low levels it's hard to see any effect, either good or bad. Most health physicists adopt the LNT model: linear, no threshold. This says, essentially, that doubling (halving) the radiation dose doubles (halves) the risk.

    Background radiation is around 1 mR/day, depending on many factors - especially geography, matrimony and smoking. So we're talking ~25 times background. At 5 mR/hour, an area must be posted as a "Radiation Area".

    The US federal limit is 5000 mR/year for radiation workers. Most employers set lower thresholds - the University of Michigan's is 1/10 of that. At 500 mR/year, an exposure rate of 1 mR/hour works out to about 2 hours per working day. The "industry standard" is ALARA - as low as reasonably achievable - and the keys are time, shielding, and distance: when working with sources, one should always minimize time, and maximize shielding and distance.
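    The stay-time arithmetic above can be sketched as follows (the 500 mR/year limit and 1 mR/hour rate are from the post; the 250 working days/year and the inverse-square distance rule are my own illustrative assumptions, the latter being the standard point-source approximation behind the "distance" part of ALARA):

    ```python
    # Sketch of the stay-time arithmetic, plus the point-source
    # inverse-square rule behind the "distance" part of ALARA.
    ANNUAL_LIMIT_mR = 500.0    # 1/10 of the US federal 5000 mR/yr limit
    DOSE_RATE_mR_per_hr = 1.0  # measured rate at the working position
    WORKING_DAYS = 250         # assumption: ~5 working days per week

    hours_per_day = ANNUAL_LIMIT_mR / (WORKING_DAYS * DOSE_RATE_mR_per_hr)
    print(f"Allowed time: {hours_per_day:.1f} hours per working day")  # 2.0

    def dose_rate_at(distance_m, rate_at_1m_mR_per_hr):
        """Point-source approximation: dose rate falls off as 1/r^2."""
        return rate_at_1m_mR_per_hr / distance_m**2

    # Doubling the distance cuts the dose rate by a factor of 4:
    print(dose_rate_at(2.0, 1.0))  # 0.25 mR/hr
    ```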
     
  6. Dec 3, 2009 #5

    bcrowell

    Staff Emeritus
    Science Advisor
    Gold Member

    I think you need to be clear here on what you mean by "Most health physicists adopt..." LNT is simply wrong based on the abundant empirical evidence. If what you mean is that they often act as though LNT were true, then that's correct. This has to do with culture, history, and law, not with science. If what you mean is that health physicists with modern academic backgrounds believe LNT is a scientifically viable hypothesis, then I think you're incorrect.

    If you think radiation hormesis is controversial, I suspect your knowledge is out of date. But if you could point me to scientific papers supporting your statement, I'd be interested to see them. Here is a long review article on the subject: http://www.radpro.com/641luckey.pdf

    Yes, the data show that natural background radiation levels are far lower than the maximum beneficial dose. In other words, the empirical evidence shows your health would benefit (very slightly) by moving to an area such as Colorado where the background level is higher.

    Yes, the law is out of step with scientific knowledge about radiation. This is probably due to a combination of legal inertia and the political reality of living in a society where a congressional representative's constituents don't understand science. A university's priority is on preventing lawsuits, not on doing what statistically should benefit their workers.

    It's true that ALARA is an industry standard. That doesn't mean it's consistent with modern scientific knowledge about the biological effects of ionizing radiation.
     
  7. Dec 5, 2009 #6
    I know that matrimony may be hazardous, but how does that relate to 1 mR/day?
    Bob S
     
  8. Dec 5, 2009 #7

    Vanadium 50

    Staff Emeritus
    Science Advisor
    Education Advisor

    Sleeping next to someone exposes you to the radiation from the K-40 decay in their bodies.
     
  9. Dec 15, 2009 #8

    bcrowell

    Staff Emeritus
    Science Advisor
    Gold Member

    The discussion with Vanadium 50, along with an article in today's NY Times ( http://www.nytimes.com/reuters/2009/12/14/health/news-cancer-radiation.html?_r=1&scp=5&sq=ct%20scan&st=cse ), prompted me to go back and study whether I'd been getting my information from biased sources.

    The following two short papers turn out to be pretty useful, because they advocate opposite points of view about LNT and hormesis, and they're fairly recent. Each explicitly refers to and criticizes the other.

    Tubiana et al., "The Linear No-Threshold Relationship Is Inconsistent with Radiation Biologic and Experimental Data", doi: 10.1148/radiol.2511080671, April 2009 Radiology, 251, 13-22

    Little et al., "Risks Associated with Low Doses and Low Dose Rates of Ionizing Radiation: Why Linearity May Be (Almost) the Best We Can Do", doi: 10.1148/radiol.2511081686, April 2009 Radiology, 251, 6-12.

    Unfortunately both are behind a paywall, so I don't know how accessible they'll be to other PF users.

    Essentially there seem to be three ways of knowing about the effects of low doses of ionizing radiation:

    (1) reasoning from knowledge about cell biology

    (2) epidemiological studies in humans

    (3) animal studies

    #1 is not very helpful because it's purely theoretical, and in any case the current theoretical understanding of cancer is very poor. You can cook up arguments based on cell biology and cancer biology to say that the graph of risk versus dose should be anything you like: linear, concave up, concave down, threshold, no threshold, etc.

    #2 is also not very helpful because the studies are mostly not sensitive enough to tell us anything about the kinds of doses that are relevant in terms of policy or individual decision-making. The available sources of data don't involve large enough populations and low enough doses to be very useful, and they are often very difficult to interpret because not all the variables can be controlled. For instance, the Japanese bomb survivors are a unique source of information, but these people were exposed to all kinds of nasty carcinogenic chemicals from the burning of their cities. (Cf. concerns about carcinogens in NY from 9/11. Firefighters have a higher risk of cancer.)

    So Little and Tubiana examine #1 and #2 and reach opposite conclusions, because the data are so weak. And when I say "opposite conclusions," they aren't really strong conclusions. They both say things like "the data are consistent with LNT" or "the data are consistent with a threshold," often referring to the same data. In other words, the data simply don't test LNT.

    #3, animal studies, is the source of information that really has the ability to test LNT, and it uniformly demonstrates that LNT is wrong, and that doses of less than about 10^4 mSv have a beneficial effect (see http://www.radpro.com/641luckey.pdf for a review). Unless we believe that there is something drastically different about the cancer biology of humans compared to the cancer biology of lab rats, it would seem logical to me to use this as our source of information about effects in humans.

    Why, then, does the debate in medical journals seem to focus so much more on #1 and #2, which are inconclusive? This editorial http://www.sciencemag.org/cgi/content/summary/302/5644/378 gives some interesting insight.
    This tends to reinforce my belief that behavior and public policy on this topic are mainly determined by culture (50's horror movies about radiation, etc.), psychology (people's inability to reason rationally about risk), and politics (NIMBY) rather than by scientific evidence.
     
  10. Dec 15, 2009 #9
    I think you are making just as large a leap as in the case of #1 and #2 in assuming that cancer biology in rats can be directly mapped onto cancer biology in people.

    I am not saying which one would be the way to go about it, but it seems to me that if we still cannot quantify risk based on dose for this regime, then extrapolating from rats to people may be just as questionable.

    But I definitely think you are correct that there is a lot of culture, psychology and politics going into the debate.
     