Debunked: 14,000 Fukushima Deaths in U.S.

In summary, a recent study by Mangano and Sherman claimed that 14,000 U.S. deaths were tied to fallout from the Fukushima reactor disaster. However, when a statistician analyzed the data, he found several statistical problems, and the infant-death data the study reported as coming from the CDC did not match the actual CDC data for the same weeks. The mismatched pre-Fukushima data points were biased low, creating an apparent statistically significant increase in post-Fukushima infant deaths; in the actual CDC data there is no statistically significant increase. The credibility of the journal that published the study has also been questioned.
  • #1
SpunkyMonkey
Last Monday a press release announced the shocking result of a new study:

Medical Journal Article: 14,000 U.S. Deaths Tied to Fukushima Reactor Disaster Fallout


Immediately seeing major problems with that study by Mangano & Sherman (M&S), I asked a statistician what he thought of it. He crunched the data, and while he found several devastating statistical problems, his most remarkable finding was that the U.S. infant-death data M&S report as being from the CDC do not jibe with the actual CDC infant-death data for the same weeks.

The M&S infant-death data allegedly from the CDC can be seen at http://www.radiation.org/reading/pubs/HS42_1F.pdf (go to Table 3, page 55). The actual CDC infant-death data can be seen in the CDC's online MMWR database (go to Locations, scroll down and select Total, and press Submit; the data for infants are in the Age column entitled "Less than 1"). The mismatching data sets are included at the end of this post, and with the links I've provided here, everything I'm saying can be independently confirmed by the reader.

Here are the mismatching data sets; note that post-Fuku weeks 15 through 24 do match:

2010 (weeks 50-52) and 2011 (weeks 1-25)

wk : M&S CDC
50 : 202 216
51 : 129 143
52 : 113 130
1 : 158 183
2 : 177 208
3 : 158 185
4 : 148 171
5 : 178 208
6 : 173 182
7 : 188 206
8 : 158 186
9 : 174 199
10 : 165 182
11 : 188 209
12 : 201 211
13 : 210 213
14 : 198 204
15 : 163 163
16 : 188 188
17 : 200 200
18 : 196 196
19 : 214 214
20 : 224 224
21 : 196 196
22 : 152 152
23 : 174 174
24 : 191 191
25 : 215 217


The nature of the mismatch is that all the pre-Fukushima M&S data points are lower than the actual CDC data points, which biases the data set toward a statistically significant increase in post-Fukushima infant deaths. But in the actual CDC data, there is no statistically significant increase. The statistician also found that even M&S's data for all-age deaths were in fact not statistically significant, contrary to M&S's claim.
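
For the curious, here's a minimal sketch of this kind of comparison; it's my own reconstruction in Python, not necessarily the test the statistician used. It runs Welch's two-sample t-test on the 14 pre-Fukushima weeks (50-52 of 2010 and 1-11 of 2011) versus the 14 post-Fukushima weeks (12-25 of 2011), once with the M&S numbers and once with the CDC numbers from the table above:

```python
# A sketch, not necessarily the statistician's actual test:
# Welch's two-sample t-test on weekly infant-death totals,
# pre- vs. post-Fukushima, for both versions of the data.
from scipy import stats

# Weeks 50-52 of 2010 and 1-11 of 2011 (pre), weeks 12-25 of 2011 (post),
# copied from the table above.
ms_pre   = [202, 129, 113, 158, 177, 158, 148, 178, 173, 188, 158, 174, 165, 188]
ms_post  = [201, 210, 198, 163, 188, 200, 196, 214, 224, 196, 152, 174, 191, 215]
cdc_pre  = [216, 143, 130, 183, 208, 185, 171, 208, 182, 206, 186, 199, 182, 209]
cdc_post = [211, 213, 204, 163, 188, 200, 196, 214, 224, 196, 152, 174, 191, 217]

for label, pre, post in [("M&S", ms_pre, ms_post), ("CDC", cdc_pre, cdc_post)]:
    t, p = stats.ttest_ind(post, pre, equal_var=False)  # Welch's t-test
    print(f"{label}: pre mean {sum(pre)/14:.1f}, post mean {sum(post)/14:.1f}, "
          f"t = {t:.2f}, p = {p:.3f}")
```

On these numbers, the M&S series yields a small p-value (an apparently significant post-Fukushima jump), while the CDC series does not.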

Why the infant data are mismatched is not understood at this time. However, a review of archived copies of the Morbidity and Mortality Weekly Report (MMWR) finds that the historically released data points for the weeks in question match the CDC's online MMWR database. So I see no reason to believe the CDC's online data are not the true data.
 
  • #3
rootX said:
It's too speculative and resources you linked don't seem credible enough.

Here's one response to the OP article:
http://nuclearpoweryesplease.org/blog/2011/06/17/shame-on-you-janette-sherman-and-joseph-mangano/
I didn't read through the article.
I'm sorry you don't find the CDC to be credible enough. And my only other links were to the study in question and to its press release. Given that the study is the topic of discussion, linking to it directly is far more credible than linking to an opinion piece about it (and I agree with the piece you link to, but it's about an earlier version of the study). Your comment is nonsense! :)
 
  • #4
SpunkyMonkey said:
I'm sorry you don't find the CDC to be credible enough. And my only other links were to the study in question and to its press release. Given that the study is the topic of discussion, linking to it directly is far more credible than linking to an opinion piece about it (and I agree with the piece you link to, but it's about an earlier version of the study). Your comment is nonsense! :)

To make it clear, I was only referring to the actual Mangano & Sherman study. I wasn't questioning the CDC data. I only read the "14,000 Fukushima Deaths in U.S." in the title and Mangano & Sherman from the OP, and found it so ridiculous that I didn't care to pay attention to the rest of the post. I don't know why anyone would even need to take the Mangano & Sherman study seriously.
 
  • #5
SpunkyMonkey said:
I'm sorry you don't find the CDC to be credible enough. And my only other links were to the study in question and to its press release. Given that the study is the topic of discussion, linking to it directly is far more credible than linking to an opinion piece about it (and I agree with the piece you link to, but it's about an earlier version of the study). Your comment is nonsense! :)

The CDC did not write that article; they simply used CDC data and misrepresented it. The journal they "published" in is not a legitimate journal either, as far as we can tell.
 
  • #6
Pengwuino said:
The CDC did not write that article; they simply used CDC data and misrepresented it. The journal they "published" in is not a legitimate journal either, as far as we can tell.


Um, I linked to the CDC database, and rootX said the "resources you linked don't seem credible enough." So because I linked to the CDC database, you believe I thought the CDC wrote the study? Oy vey!
 
  • #7
Pengwuino said:
The journal they "published" in is not a legitimate journal either, as far as we can tell.
I couldn't find any independent review of the journal, although I have to admit I only looked at two pages of Google results. Their own website (not that I would take their word for anything now) says that they have been publishing since 1971. Here are a couple of quotes from their website.
IJHS said:
Of the major English-language health policy journals, the IJHS ranks in the top five for frequency of citation of its articles in the scientific literature.
IJHS said:
International Journal of Health Services is a peer refereed journal.
I question the quality of their peer referee process. And they might rank in the bottom 5 for all I know. Have you got some reason to believe they are not what they say they are?
 
  • #8
Jimmy Snyder said:
I couldn't find any independent review of the journal

There's some review of the journal at http://www.reportingonhealth.org/blogs/2011/12/20/fukushima-alarmist-claim-obscure-medical-journal-proceed-caution.
 
  • #9
SpunkyMonkey said:
There's some review of the journal at http://www.reportingonhealth.org/blogs/2011/12/20/fukushima-alarmist-claim-obscure-medical-journal-proceed-caution.
It sounds like a journal that was living near the edge of the cliff and just fell off of it.
 
  • #11
This explains their data manipulation that resulted in the mismatch above. From page 52 of http://www.radiation.org/reading/pubs/HS42_1F.pdf (the differing city counts are the manipulation):

The 2010–2011 comparison of deaths in weeks 12 to 25 included 119 of the 122 cities in the CDC report. Excluded were Fort Worth, Texas; New Orleans, Louisiana; and Phoenix, Arizona; for these cities, deaths in more than half of the weeks were reported as "unavailable." The completeness of reporting for both periods exceeded 99 percent. For the earlier 14-week periods, only 104 of the 122 cities that reported death figures more than 99 percent of the time were included. For the cities and weeks excluded from the analysis, see Appendix Table 3.
So Mangano & Sherman's pre-Fuku data (weeks 50-11) included only 104 of the 122 cities in the CDC database, but their post-Fuku data (weeks 12-25) included 119 cities. The exclusion of cities from the pre-Fuku data lowered the total-death counts and thereby produced a statistically significant post-Fuku increase in deaths. The statistician I'm communicating with eliminated the same cities from the post-Fuku data, so that the number of cities stays constant at 104 across the data set, and then the statistical significance disappears.

A crystal clear example of massaging the data until it fits your thesis -- pathetic!
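
To see how much work that trick does, here's a toy simulation in Python. The numbers are made up (a constant per-city Poisson death rate and no real effect); only the 122/104/119 city counts come from the paper:

```python
# Toy simulation: constant per-city death rate, no real effect.
# Dropping 18 of 122 cities from the pre period only (as M&S did)
# manufactures an apparently significant post-period increase.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
weeks, rate = 14, 1.5  # 14-week periods; assumed mean deaths/city/week

pre  = rng.poisson(rate, size=(weeks, 122))  # all 122 cities, pre period
post = rng.poisson(rate, size=(weeks, 122))  # same rate, post period

pre104  = pre[:, :104].sum(axis=1)    # 104 cities pre, as in the paper
post119 = post[:, :119].sum(axis=1)   # 119 cities post, as in the paper
post104 = post[:, :104].sum(axis=1)   # same 104 cities post (the honest fix)

print("biased (104 pre vs 119 post): p =",
      stats.ttest_ind(post119, pre104, equal_var=False).pvalue)
print("honest (104 pre vs 104 post): p =",
      stats.ttest_ind(post104, pre104, equal_var=False).pvalue)
```

The "biased" comparison reliably spits out a tiny p-value even though nothing happened; with the city set held constant, the significance typically vanishes, just as the statistician found with the real data.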
 
  • #12
Too bad research fraud doesn't carry civil penalties.

If I sell a can of dog food with the wrong order of ingredients, I can lose my license to manufacture pet food.
 
  • #13
Antiphon said:
Too bad research fraud doesn't carry civil penalties.
In some cases it could, and in some cases fraud is a criminal offense. If one's work is funded, then the funding institution could sue one to recover the funding. On the other hand, if one's fraudulent research promotes an agenda of the funding source, then one wouldn't risk a lawsuit.

In many purely scientific cases, scientific misconduct might get one fired from a job, e.g., at a university.

Of course, these days I'd expect faulty research to be covered by freedom of speech, much the way that faulty or fraudulent financial information, e.g., AAA ratings on junk financial instruments, was considered an opinion covered by freedom of speech.
 
  • #14
Civil? It should be criminal. Andrew Wakefield, for example, should be tried for homicide for every measles-related death caused by his fraudulent (and bought and paid for) study.

Ideally in Texas.
 
  • #15
IJHS said:
Of the major English-language health policy journals, the IJHS ranks in the top five for frequency of citation of its articles in the scientific literature.
Hmm ... http://www.scimagojr.com/journalrank.php?category=2719 seems to rank it #55 out of 120, not in the top five.

But I could believe that 14,000 new-age airheads went into a state of blind panic and starved their kids to death for fear of feeding them something dangerous... :devil:
 
  • #16
I also find it disturbing that someone would actually report on this study:
http://www.prnewswire.com/news-rele...shima-reactor-disaster-fallout-135859288.html (from OP)
Whoever wrote the news article seems to have consulted only two people, Joseph Mangano and Janette Sherman.

AlephZero said:
But I could believe that 14,000 new-age airheads went into a state of blind panic and starved their kids to death for fear of feeding them something dangerous... :devil:
That could have been a valid study to believe in :rofl:

SpunkyMonkey said:
Um, I linked to the CDC database, and rootX said the "resources you linked don't seem credible enough." So because I linked to the CDC database, you believe I thought the CDC wrote the study? Oy vey!
Yes, I agree that my comment was nonsense. It took me a few hours to realize that you were actually trying to debunk a study that I had dismissed just from looking at the title :blushing:
 
  • #17
Also from the paper,
The gap in changes for infant deaths (+1.80% in the latter 14 weeks, –8.37% for the earlier 14 weeks) was even larger.
If a 1.80% deviation is statistically significant, what does that make a deviation that is over four times larger? This must prove that those babies born before Fukushima were somehow prescient and knew that they had to stay alive to offset the post-Fukushima deaths.

Or maybe it means that the shorter a time interval one looks at, the larger the deviation. Or it could just mean that the authors cooked the books.
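
That second possibility is just counting statistics. For roughly Poisson counts, the relative noise in an N-week total scales as 1/sqrt(total), so shorter windows naturally show larger percent swings. A back-of-the-envelope sketch, assuming ~185 deaths per week (about the CDC weekly mean in the OP's table):

```python
# Relative Poisson noise in an N-week death total scales as 1/sqrt(total),
# so percent deviations shrink as the window grows.
import numpy as np

rate = 185.0  # assumed mean infant deaths/week, roughly the CDC level above
for weeks in (1, 14, 52):
    total = rate * weeks
    print(f"{weeks:2d} weeks: expected total {total:6.0f}, "
          f"relative SD {100 / np.sqrt(total):.1f}%")
```

So percent deviations over a 14-week window are naturally several times noisier than annual ones, before even considering seasonality or reporting variation.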
 
  • #18
Antiphon said:
Too bad research fraud doesn't carry civil peanalties.
They do admit to what they did that resulted in the data mismatch above; see the quote in my last comment above. So it's technically not fraud, but IMO it's still deceptive, and the overall presentation via the media amounts to strategic lying. Including more cities in the post-Fukushima data set than in the pre-Fukushima one is just shameless, even if it's admitted in two sentences of an 18-page paper.
 
  • #19
SpunkyMonkey said:
Um, I linked to the CDC database, and rootX said the "resources you linked don't seem credible enough." So because I linked to the CDC database, you believe I thought the CDC wrote the study? Oy vey!

I was commenting on the actual PR article you posted, not the CDC's data.
 
  • #21
Pengwuino said:
I was commenting on the actual PR article you posted, not the CDC's data.
Your reply to me opened: "The CDC did not write that article, they simply used CDC data and misrepresented it." So presumably you felt that statement was increasing my knowledge. But in fact I never thought the CDC wrote the article, and nothing I said implied that I did. So your statement wasn't informing me of anything I didn't already know. Though I do appreciate the desire to inform where it's assumed to be needed. :)
 
  • #22
[Attached graph: infantDeathGraphs.png]

Mangano & Sherman 2011 said:
The 2010–2011 comparison of deaths in weeks 12 to 25 included 119 of the 122 cities in the CDC report. Excluded were Fort Worth, Texas; New Orleans, Louisiana; and Phoenix, Arizona; for these cities, deaths in more than half of the weeks were reported as “unavailable.” The completeness of reporting for both periods exceeded 99 percent. For the earlier 14-week periods, only 104 of the 122 cities that reported death figures more than 99 percent of the time were included. For the cities and weeks excluded from the analysis, see Appendix Table 3.

A statistically significant increase only exists for the M&S (http://www.radiation.org/reading/pubs/HS42_1F.pdf) green line.
 
  • #23
[Attached graph: InfantMortality2001-2011.png]
 
  • #24
SpunkyMonkey said:
Your reply to me opened: "The CDC did not write that article, they simply used CDC data and misrepresented it." So presumably you felt that statement was increasing my knowledge. But in fact I never thought the CDC wrote the article, and nothing I said implied that I did. So your statement wasn't informing me of anything I didn't already know. Though I do appreciate the desire to inform where it's assumed to be needed. :)

I enjoy telling people things they already know! Makes for much less backtalk :approve:
 
  • #25
SpunkyMonkey said:
[Attached graph: InfantMortality2001-2011.png]
From that graph, it looks like a short-term deviation of 50 or even 100 is just noise.

There is something significant there. What happened in 2008 or 2009?
 
  • #26
Are these figures actually deaths per some number of births, or does birth rate need to be taken into account here?
 
  • #27
I would be interested in seeing the parents' background (income, employment). I was thinking about the possibility of a correlation between economic troubles and infant deaths.
 
  • #28
D H said:
There is something significant there. What happened in 2008 or 2009?

Right, the decline after '08 is statistically significant. But I have no idea what's up with that!
 
  • #29
D H said:
From that graph, it looks like a short-term deviation of 50 or even 100 is just noise.

There is something significant there. What happened in 2008 or 2009?

And contrary to what the graph seems to want to imply, there doesn't seem to be a long-term decline evident in the data. I don't know why they included that linear fit.
 
  • #30
JaWiB said:
Are these figures actually deaths per some number of births, or does birth rate need to be taken into account here?

Mangano & Sherman analyzed only infant deaths, not infant mortality (deaths per live births), and so the data I've posted are also only infant deaths. Not using infant-mortality data is just another entry in the long list of flaws in their argument: any increase in infant deaths could simply reflect an increase in live births.
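
Here's a toy illustration of that point, with entirely made-up numbers: raw infant deaths can go up even while the infant mortality rate goes down, if live births rise faster.

```python
# Made-up numbers: deaths rise ~5.6%, yet the mortality rate falls,
# because live births rose faster than deaths did.
periods = {"before": (180, 30_000), "after": (190, 33_000)}  # (deaths, births)

for label, (deaths, births) in periods.items():
    rate = 1000 * deaths / births  # deaths per 1,000 live births
    print(f"{label}: {deaths} deaths, {rate:.2f} per 1,000 live births")
# before: 6.00 per 1,000; after: 5.76 per 1,000
```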

rootX said:
I would be interested in seeing the parents' background (income, employment). I was thinking about the possibility of a correlation between economic troubles and infant deaths.

The data are for deaths in 122 entire cities all across the U.S., so there's probably not much reason to anticipate a selection bias. But yeah, such an economic effect might still work its way into the data, and the middle class has been hit hard too. The decline after '08 might reflect fewer planned births due to the costs of child rearing.
 
  • #31
SpunkyMonkey said:
Mangano & Sherman analyzed only infant deaths, not infant mortality (deaths per live births), and so the data I've posted are also only infant deaths. Not using infant-mortality data is just another entry in the long list of flaws in their argument: any increase in infant deaths could simply reflect an increase in live births.

So the decline around 2008 could just as well be due to declining birth rates as to declining infant mortality.
 
Frequently Asked Questions

1. What is the claim about 14,000 Fukushima deaths in the U.S.?

The claim is that there have been 14,000 deaths in the U.S. caused by the Fukushima nuclear disaster in Japan.

2. Is this claim true?

No, this claim has been debunked by numerous scientific studies and organizations. There is no evidence that the Fukushima disaster caused 14,000 deaths in the U.S.

3. What is the source of this claim?

The source of this claim is a study by Joseph Mangano and Janette Sherman published in the International Journal of Health Services. However, this study has been heavily criticized for its flawed methodology and lack of scientific evidence.

4. How was this claim debunked?

This claim was debunked by multiple scientific studies and organizations, including the World Health Organization, the United Nations Scientific Committee on the Effects of Atomic Radiation, and the National Academy of Sciences. These studies have found no evidence of increased deaths in the U.S. due to the Fukushima disaster.

5. What are the potential impacts of spreading false information about Fukushima deaths?

Spreading false information about Fukushima deaths can cause unnecessary fear and panic among the public. It can also divert attention and resources away from important issues and concerns. It is important to rely on credible sources and scientific evidence when discussing such sensitive topics.
