Debunked: 14,000 Fukushima Deaths in U.S.

  1. Last Monday a press release announced the shocking result of a new study:

    Medical Journal Article: 14,000 U.S. Deaths Tied to Fukushima Reactor Disaster Fallout


    Immediately seeing major problems with that study by Mangano & Sherman (M&S), I asked a statistician what he thought of it. He crunched the data and while he found several devastating statistical problems, his most remarkable finding was that the U.S. infant-death data M&S report as being from the CDC does not jibe with the actual CDC infant-death data for the same weeks.

    The M&S infant-death data allegedly from the CDC can be seen here (go to Table 3, page 55). And the actual CDC infant-death data can be seen here (go to Locations, scroll down and select Total and press Submit for the data; the data for infants is in the Age column entitled "Less than 1"). The mismatching data sets are included at the end of this post, and with the links I've provided here, everything I'm saying can be independently confirmed by the reader.

    Here are the mismatching data sets; note that post-Fuku weeks 15 through 24 do match:

    2010 (weeks 50-52) 2011 (weeks 1-25)

    wk M&S CDC
    50 : 202 216
    51 : 129 143
    52 : 113 130
    1 : 158 183
    2 : 177 208
    3 : 158 185
    4 : 148 171
    5 : 178 208
    6 : 173 182
    7 : 188 206
    8 : 158 186
    9 : 174 199
    10 : 165 182
    11 : 188 209
    12 : 201 211
    13 : 210 213
    14 : 198 204
    15 : 163 163
    16 : 188 188
    17 : 200 200
    18 : 196 196
    19 : 214 214
    20 : 224 224
    21 : 196 196
    22 : 152 152
    23 : 174 174
    24 : 191 191
    25 : 215 217


    The nature of the mismatch is that all the pre-Fukushima M&S data points are lower than the actual CDC data points, which biases the data set toward a statistically significant increase in post-Fukushima infant deaths. In the actual CDC data, there is no statistically significant increase. The statistician also found that even M&S's data for all-age deaths showed no statistically significant increase, contrary to M&S's claim.
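    For readers who want to check this themselves, here is a minimal sketch of the kind of comparison involved (not necessarily the statistician's exact method, which I haven't spelled out): a plain two-sample t-test on the 14 pre-Fukushima weeks (2010 week 50 through 2011 week 11) versus the 14 post-Fukushima weeks (2011 weeks 12 through 25), run once with the M&S numbers and once with the CDC numbers from the table above. If the point above holds, the apparent significance should show up only with the M&S numbers.

    ```python
    # Minimal sketch: two-sample t-test of post- vs pre-Fukushima weekly
    # infant deaths, using the M&S and CDC numbers from the table above.
    # Illustration only; a count model (e.g. Poisson) would be more rigorous.
    from scipy import stats

    # Weeks 2010-50 through 2011-25, in the order listed in the table.
    ms  = [202, 129, 113, 158, 177, 158, 148, 178, 173, 188, 158, 174, 165, 188,
           201, 210, 198, 163, 188, 200, 196, 214, 224, 196, 152, 174, 191, 215]
    cdc = [216, 143, 130, 183, 208, 185, 171, 208, 182, 206, 186, 199, 182, 209,
           211, 213, 204, 163, 188, 200, 196, 214, 224, 196, 152, 174, 191, 217]

    for label, data in (("M&S", ms), ("CDC", cdc)):
        pre, post = data[:14], data[14:]  # weeks 50-11 vs. weeks 12-25
        t, p = stats.ttest_ind(post, pre)
        change = 100.0 * (sum(post) / sum(pre) - 1)
        print(f"{label}: pre mean {sum(pre)/14:.1f}, post mean {sum(post)/14:.1f}, "
              f"change {change:+.1f}%, two-sided p = {p:.3f}")
    ```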

    Why the infant data are mismatched is not understood at this time. However, a review of the archived copies of the Morbidity and Mortality Weekly Report (MMWR) finds that the historically released data points for the weeks in question jibe with the CDC's MMWR database. So I see no reason to believe the CDC's online data are not the true data.
     
    Last edited: Dec 24, 2011
  3. I'm sorry you don't find the CDC to be credible enough. And my only other links were to the study in question and to its press release. Given that the study is the topic of discussion, linking to it directly is far more credible than linking to an opinion piece on it (and I agree with the piece you link to, but it's about an earlier version of the study). Your comment is nonsense! :)
     
  4. To make it clear, I was only referring to the actual Mangano & Sherman study. I wasn't questioning the CDC data. I only read "14,000 Fukushima Deaths in U.S." in the title and Mangano & Sherman in the OP and found it so ridiculous that I didn't care to pay attention to the rest of the post. I don't know why anyone would even need to take the Mangano & Sherman study seriously.
     
  5. Pengwuino

    Pengwuino 7,118
    Gold Member

    The CDC did not write that article; Mangano & Sherman simply used CDC data and misrepresented it. The journal they "published" in is not a legitimate journal either, as far as we can tell.
     

  6. Um, I linked to the CDC database, and rootX said the "resources you linked don't seem credible enough." So because I linked to the CDC database, you seem to believe I thought the CDC wrote the study. Oy vey!
     
  7. I couldn't find any independent review of the journal, although I have to admit I only looked at two pages of Google results. Their own website (not that I would take their word for anything now) says that they have been publishing since 1971. Here are a couple of quotes from their website.
    I question the quality of their peer-review process. And they might rank in the bottom 5 for all I know. Have you got some reason to believe they are not what they say they are?
     
  8. There's some review of the journal in this critical blog.
     
  9. It sounds like a journal that was living near the edge of the cliff and just fell off of it.
     
  10. D H

    Staff: Mentor

  11. This explains their data manipulation that resulted in the mismatch above. From page 52 of their paper (bold emphasizes the manipulation):


    So Mangano & Sherman's pre-Fuku data (weeks 50-11) included only 104 cities (of the 122 cities in the CDC database), but their post-Fuku data (weeks 12-25) included 119 cities. Excluding cities from the pre-Fuku data lowered the total death counts and thereby produced a statistically significant post-Fuku increase in deaths. The statistician I'm communicating with eliminated the same cities from the post-Fuku data so that the number of cities stays constant at 104 across the data set, and then the statistical significance disappears.
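    For what it's worth, that correction is simple to apply in code. Below is a rough sketch, assuming one has the city-level MMWR counts in a table with columns "city", "week", and "deaths" (those column names are hypothetical; the actual CDC export format may differ): keep only cities that report in every week, then total the deaths by week.

    ```python
    # Sketch of the consistency fix: restrict to a fixed set of cities before
    # aggregating, so the pre- and post-Fukushima totals are comparable.
    # Column names ("city", "week", "deaths") are hypothetical.
    import pandas as pd

    def consistent_weekly_totals(df: pd.DataFrame) -> pd.Series:
        """Sum weekly deaths over only those cities that report in every week."""
        n_weeks = df["week"].nunique()
        weeks_per_city = df.groupby("city")["week"].nunique()
        keep = weeks_per_city[weeks_per_city == n_weeks].index
        return df[df["city"].isin(keep)].groupby("week")["deaths"].sum()
    ```

    Comparing pre- and post-Fukushima totals built this way keeps the city set fixed, which is exactly the constraint that M&S's week-dependent city selection violated.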

    A crystal clear example of massaging the data until it fits your thesis -- pathetic!
     
    Last edited: Dec 25, 2011
  12. Too bad research fraud doesn't carry civil penalties.

    If I sell a can of dog food with the wrong order of ingredients, I can lose my license to manufacture pet food.
     
  13. Astronuc

    Staff: Mentor

    In some cases, it could, and in some cases, fraud is a criminal offense. If one's work is funded, then the funding institution could sue one to recover the funding. On the other hand, if one's fraudulent research promotes an agenda of the funding source, then one wouldn't risk a lawsuit.

    In many purely scientific cases, scientific misconduct might get one fired from one's job, e.g., at a university.

    Of course, these days, I'd expect faulty research to be covered by freedom of speech, much the way that faulty or fraudulent financial information, e.g., AAA ratings on junk financial instruments, was considered an opinion covered by freedom of speech.
     
  14. Vanadium 50

    Vanadium 50 17,438
    Staff Emeritus
    Science Advisor
    Gold Member

    Civil? It should be criminal. Andrew Wakefield, for example, should be tried for homicide for every measles-related death caused by his fraudulent (and bought and paid for) study.

    Ideally in Texas.
     
  15. AlephZero

    AlephZero 7,300
    Science Advisor
    Homework Helper

    Hmm ... http://www.scimagojr.com/journalrank.php?category=2719 seems to rank it #55 out of 120, not #5.

    But I could believe that 14,000 new-age airheads went into a state of blind panic and starved their kids to death for fear of feeding them something dangerous... :devil:
     
  16. I also find it disturbing that someone would actually report on this study:
    http://www.prnewswire.com/news-rele...shima-reactor-disaster-fallout-135859288.html (from OP)
    Whoever wrote the news article seems to have consulted only two people, Joseph Mangano and Janette Sherman.

    That could have been a valid study to believe in :rofl:

    Yes, I agree that my comment was nonsense. It took me a few hours to realize that you were actually trying to debunk a study that I had dismissed just from the title :blushing:
     
    Last edited: Dec 25, 2011
  17. D H

    Staff: Mentor

    Also from the paper,
    The gap in changes for infant deaths (+1.80% in the latter 14 weeks, –8.37% for the earlier 14 weeks) was even larger.​
    If a 1.80% deviation is statistically significant, what does that make a deviation that is over four times larger? This must prove that those babies born before Fukushima were somehow prescient and knew that they had to stay alive to offset the post-Fukushima deaths.

    Or maybe it means that the shorter a time interval one looks at, the larger the deviation. Or it could just mean that the authors cooked the books.
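    To put a rough number on that second possibility, here is a small simulation sketch (illustrative only; the counts are made-up Poisson draws at roughly the weekly level seen in the table above) of how large a percent change pure chance produces over shorter and shorter windows when there is no real change in the underlying rate:

    ```python
    # Illustration with simulated data: at a constant weekly death rate, the
    # measured percent change between two windows fluctuates more when the
    # windows are shorter, so a larger "deviation" over fewer weeks is not,
    # by itself, stronger evidence of a real effect.
    import numpy as np

    rng = np.random.default_rng(0)
    rate = 190  # roughly the weekly infant-death level in the table above

    for weeks in (14, 10, 4):
        before = rng.poisson(rate, size=(10_000, weeks)).sum(axis=1)
        after = rng.poisson(rate, size=(10_000, weeks)).sum(axis=1)
        pct_change = 100.0 * (after - before) / before
        print(f"{weeks:2d}-week windows: chance deviation of about "
              f"+/- {pct_change.std():.2f}% with no real change")
    ```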
     

  18. They do admit to what they did that resulted in the data mismatch above. See the quote in my last comment above. So it's technically not fraud, but imo it's still deceptive, and in the overall presentation via the media it amounts to strategic lying. Including more cities in the post- than in the pre-Fukushima data set is just shameless, even if admitted in two sentences of an 18-page paper.
     
  19. Pengwuino

    Pengwuino 7,118
    Gold Member

    I was commenting on the actual PR article you posted, not the CDC's data.
     
  20. Evo

    Staff: Mentor
