Falsification of data is the worst scientific crime I can think of. I hope he never works in his field again.
How can he imagine he'd have gotten away with it? (By the way, what is "it"? The article cited was very vague; it's as if they didn't know what, exactly, was falsified).
If we learned anything from Freud and Margaret Mead, it is this: if you are going to falsify data, do it as a social scientist. That way, as long as your ideas "make sense," they'll be accepted anyway.
Pattylou posted a topic in Politics, titled Integrity, that referred to another article on this story. https://www.physicsforums.com/showthread.php?t=97292 That other article, http://www.boston.com/news/local/ma...29/more_doubts_raised_on_fired_mit_professor/ , explains the falsification in more detail and shows the falsified data.
I'm surprised he managed to get away with it for so long, but am glad he was caught and fired. Rest assured, his career in science is done. There's no way anyone would hire him after seeing this plastered all over the news (he wouldn't be able to get any letters of reference either).
This is the most serious type of academic dishonesty, and it is dealt with very harshly. One of my colleagues caught a former grad student of hers falsifying data, and he was very swiftly kicked out of the graduate program. He was also working part time in industry, and they were the first to notice that his results were "too perfect" and that he couldn't "find" the original data for all of them (that's how my colleague was tipped off to look into his graduate research more closely), so he lost his job as well. Fortunately, in that case it was caught very early, so nothing was ever published using his falsified data.

What surprises me is that most academic advisors will go over a student's data with a fine-tooth comb before allowing them to publish. They aren't necessarily looking for dishonesty, just for mistakes: errors in data entry or statistics, a procedure performed incorrectly, and other common errors that you have to send a student back to fix before something is ready to publish. One of the first things I look for in my students' figures is two graphs that look too similar. That's usually not dishonesty, but carelessness in copying raw data into a spreadsheet for graphing, which is why they have to learn to check, double-check, and triple-check every number entered.
How much would it cost to verify every paper he submitted (e.g., to redo all the experiments he did)?
murder murder murder....
Redoing the experiments would be very expensive. It may be less costly to go back to the lab notebooks and raw data and determine: 1) whether the raw data is there, 2) whether there's any indication it was falsified (too much similarity, etc.), and 3) whether the raw data matches the published results. Anything published within the past 10 years should still be stored. If the people this guy has worked with are NIH funded, I'll bet they'll be investigating too. The real cost is that even if this was the only time he falsified data, every single publication with his name on it is now going to be viewed with suspicion, and anyone doing additional work in that area is going to have to confirm it in order to proceed.
Well, that's a bummer. Whatever happened to the interconnectedness of scientific postulates? Isn't there one framework with which all of your research simply complies? I mean, that's how I think of science: it has to be interconnected and verifiable from different angles by different disciplines.
Yes, but now you've lost one level of that verification, and sometimes it's just one group approaching the same problem from many directions that has contributed a lot to a field. If any of these studies were key findings, there could be a lot of people off on wild goose chases based on those papers. I was also thinking: what if there's some unfortunate grad student trying to replicate this data as a control for their thesis work, who has been banging their head against a wall for 3 years trying to understand why their results never come out right? Too often, with a new grad student, the temptation is to assume the student is doing something wrong rather than that the earlier study might have been flawed (or falsified). It takes a while before you have enough confidence in their skills to start doubting the prior study's results, and then it can be really difficult to publish if you can't find an explanation for the discrepancies.