Blaming Government for Teacher and Scientist Failures in Integrity
The article, “Governmental policy is wrecking science,” makes some interesting points but is fundamentally in error, because government policy is only a small part of the problem. The government depends on scientists and teachers to police students and each other regarding scientific and academic integrity. To a harmful extent, scientists and teachers are failing to fulfill that trust.
Catching plagiarism has gotten much easier in the past few years due to automated detection software. Students are having a hard time fooling it. Most scientific publishers are using it now. But it only works if teachers use it.
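At its core, such software looks for long runs of shared wording between a submission and other documents. The toy sketch below shows the basic idea of comparing overlapping word windows; the file names and flagging threshold are hypothetical, and real detectors are far more sophisticated.

```python
# Toy illustration of the overlap check that plagiarism detectors automate at
# scale.  File names and the flagging threshold below are hypothetical.

def shingles(text: str, n: int = 8) -> set:
    """Return the set of overlapping n-word windows ('shingles') in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def overlap(a: str, b: str, n: int = 8) -> float:
    """Jaccard similarity of the two texts' shingle sets."""
    sa, sb = shingles(a, n), shingles(b, n)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

if __name__ == "__main__":
    submission = open("student_report.txt").read()    # hypothetical inputs
    source = open("published_paper.txt").read()
    score = overlap(submission, source)
    if score > 0.05:  # illustrative threshold, not a calibrated cutoff
        print(f"Flag for human review: {score:.1%} shingle overlap")
```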
Catching fake data is often straightforward, but it requires paying attention and running a few statistical and possibly other tests. Back in 2008, my wife and I caught errors in a biomechanics paper and published a reply because it was obvious from the graphs that the data violated the work-energy theorem. My wife and her colleagues recently published a comment pointing out data dishonesty in an important bone paper. Something smelled fishy, so she asked me to read it. I agreed that something was off and encouraged her to dig deeper. She dug up the Master’s thesis with the original data and uncovered the sleight of hand. In 2010, I caught an atomic physics paper that had copied several paragraphs verbatim (without attribution) from one of my papers from the 1990s. Instead of a retraction, the editors let it slide with an after-the-fact corrigendum and citation.
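For readers unfamiliar with the physics, the work-energy theorem says the change in a body's kinetic energy must equal the net work done on it, so reported energy gains exceeding the available work cannot be right. A minimal sketch of that kind of consistency check, using hypothetical numbers rather than the data from that paper:

```python
# Consistency check based on the work-energy theorem: the change in kinetic
# energy must equal the net work done on the body.  All numbers below are
# hypothetical, not values from the biomechanics paper discussed above.

def kinetic_energy(mass_kg: float, speed_m_s: float) -> float:
    return 0.5 * mass_kg * speed_m_s ** 2

def check_work_energy(mass_kg, v_initial, v_final, net_work_J, tol=0.05):
    """Flag a data point whose kinetic-energy change disagrees with the reported work."""
    delta_ke = kinetic_energy(mass_kg, v_final) - kinetic_energy(mass_kg, v_initial)
    if abs(delta_ke - net_work_J) > tol * max(abs(net_work_J), 1.0):
        print(f"Inconsistent: delta KE = {delta_ke:.0f} J but net work = {net_work_J:.0f} J")

# Hypothetical digitized values:
check_work_energy(mass_kg=70.0, v_initial=1.0, v_final=4.0, net_work_J=300.0)
```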
We also caught errors in the weight-length data at Fishbase.org and published a paper on it in 2010 or 2011. In this case, my wife alerted me that something was amiss, and some cadets at the Air Force Academy made a project out of it under my oversight. The database editors vilified us for pointing out the errors, but they have since gotten a lot better at error checking and correction. We later traced most of the errors to a single source: one of the most cited handbooks on freshwater fisheries biology.
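For context, fisheries weight-length data are conventionally modeled by the power law W = aL^b, with the fitted exponent b usually close to 3, so an exponent far outside that range, or individual records far off the fitted curve, deserves scrutiny. A rough sketch of that kind of screen follows; the cutoffs and sample records are illustrative, not the actual checks used in our paper.

```python
import math

# Screen weight-length records against the standard fisheries power law
# W = a * L**b (weight in grams, length in cm).  For most species the fitted
# exponent b falls roughly between 2.5 and 3.5, so an exponent far outside
# that band, or a fish far off the fitted curve, is a red flag.
# The cutoffs and sample records are illustrative only.

def fit_power_law(lengths_cm, weights_g):
    """Least-squares fit of log W = log a + b log L; returns (a, b)."""
    xs = [math.log(l) for l in lengths_cm]
    ys = [math.log(w) for w in weights_g]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def screen(lengths_cm, weights_g):
    a, b = fit_power_law(lengths_cm, weights_g)
    if not 2.5 <= b <= 3.5:
        print(f"Suspicious length-weight exponent: b = {b:.2f}")
    for length, weight in zip(lengths_cm, weights_g):
        predicted = a * length ** b
        if abs(math.log(weight / predicted)) > math.log(2):  # off the curve by more than 2x
            print(f"Outlier: L = {length} cm, W = {weight} g (expected ~{predicted:.0f} g)")

# Hypothetical records:
screen([10.0, 15.0, 20.0, 25.0], [12.0, 40.0, 95.0, 500.0])
```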
Similarly, we caught several errors, both scientific and statistical, in a 2011 Fishery Bulletin paper on magnetoreception in fish. The editor published an erratum correcting the statistical errors but declined to publish our comment pointing out the unsupported claims in the abstract and other scientific errors. There was no suggestion that our comment was wrong; the journal simply has an editorial policy of not publishing comments that bring to light scientific errors in its papers. Refusing to publish corrections of clear scientific errors is a failure of scientific integrity that falls on authors and editors rather than the government.
Not every correction needs to happen in the public arena. When erroneous or falsified data have been published, a public correction is appropriate and may be the only way to prevent the propagation of the error. Sometimes, however, a correction can be made in time to avoid a public error. For example, my wife was reading a paper in her field of research that was available online in “pre-print” form before publication. She noticed an error in the results tables and contacted the primary author privately in case there was time to correct it before others in the field evaluated and applied the results. Happily, in that situation, the author thanked my wife and confirmed that there was time to correct the error before final publication. Within research groups, we can help each other by evaluating data critically, not to undermine any individual, but to maintain both scientific integrity and the reputations of all involved by sharing the goals of correct results and appropriate interpretation.
However, colleagues and I have also had numerous situations where we’ve pointed out scientific or academic errors or misconduct and nothing was done. In addition to having letters to journal editors ignored in cases of clear published errors, there is also a battle for integrity in the schools. The absence of consequences trains students in poor behavior early on. We learned of a student texting answers to other students during a science test. The student admitted doing so but refused to name the recipients of the texts. The department at the North Carolina public school refused to investigate further or to find out who benefited from the cheating. Not even the admitted cheater received any consequence. We’ve seen a pattern of failures in academic and scientific integrity in North Carolina (such as the UNC athlete cheating scandal).
When I taught at the Air Force Academy, things were handled better. Even if the disciplinary process failed to produce a consequence for the student, the instructor and department head could impose an academic consequence by meeting a more-likely-than-not standard of evidence. The Math department head always supported a teacher’s recommendation of a zero for cheating on any graded event.
When I ran a cadet research program, I terminated a cadet’s participation immediately and permanently when it became clear that the student had faked data or otherwise committed academic dishonesty. Even when a superior (not in the math department) recommended a gentler approach to allow for a “learning experience,” I terminated participation anyway, because I thought a firm response was needed to bring the lesson home and protect the integrity of the program.
I have a sharp eye for data, and I run several statistical and common-sense checks on student data and analysis. I may be the only professor I know who repeats student analysis at every step in most projects under my supervision. I have developed a good sense of what “too good to be true” looks like and what kinds of uncertainties can be expected given the experimental conditions and sample sizes. In my mentoring of science projects, students know from the beginning that I have zero tolerance for violations of academic and scientific integrity and that I am double-checking their data and analysis closely.
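One simple example of a “too good to be true” check, sketched with hypothetical numbers: if the scatter in repeated measurements is far smaller than the uncertainty the student claims, the reduced chi-square falls well below 1 and the data merit a closer look. The cutoff and trial values below are illustrative, not a prescription.

```python
# A "too good to be true" screen sketched with hypothetical numbers: scatter
# in repeated measurements that is far smaller than the claimed uncertainty
# drives the reduced chi-square well below 1.

def reduced_chi_square(values, expected, sigma):
    """Chi-square per degree of freedom of the values about the expected value."""
    dof = len(values) - 1
    return sum(((v - expected) / sigma) ** 2 for v in values) / dof

# Hypothetical student data: five timing trials with a claimed uncertainty of 0.05 s
trials = [2.31, 2.31, 2.32, 2.31, 2.31]
chi2 = reduced_chi_square(trials, expected=sum(trials) / len(trials), sigma=0.05)
if chi2 < 0.2:  # illustrative cutoff
    print(f"Scatter implausibly small for the claimed uncertainty (chi2/dof = {chi2:.3f})")
```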
It is interesting that the original article cites Ernst Haeckel but fails to mention his well-known fraudulent embryo drawings. I recall stirring up controversy in a guest lecture to a biology class in the last decade by pointing out that their current textbook was still using the errant Haeckel drawings. The drawings and the associated recapitulation theory have been considered erroneous for over 100 years, so it is something of a mystery how they can appear in modern textbooks without hordes of teachers and scientists objecting.
If you teach laboratories, what consideration have you given to making it harder for students to fake data? I mentor several students on ISEF-type research projects and undergraduate research, so I frequently get their feedback on how their lab science classes are going. Some of their teachers are getting out in front of integrity problems by designing lab experiments with an auditable data path from the original execution of the experiment to the graded lab report. This approach is analogous to the requirements some journals and funding agencies have that data be deposited in a repository. In some cases, lab instructors even require students to take pictures while executing experiments. It is much harder to fake data when there are time-stamped data files with the original measurements as well as time-stamped pictures of the experiment in progress. Sure, someone will be smart enough to fool any accountability system, but putting a good system in place keeps students from thinking they have tacit approval to manufacture data: they don’t just need to fake the data, they need to intentionally subvert the accountability system.
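A minimal sketch of how such an audit trail might be implemented, assuming hypothetical file names and a simple JSON manifest: hash and time-stamp the raw files when the data are collected, so the graded report can be traced back to unaltered originals.

```python
import hashlib
import json
import os
import time

# Sketch of an auditable data path: record a SHA-256 hash and timestamp for
# each raw data file (and photo) when the data are collected, so the graded
# report can be traced back to unaltered originals.  File names and the
# manifest format are hypothetical.

def file_record(path: str) -> dict:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "file": path,
        "sha256": digest,
        "modified": time.ctime(os.path.getmtime(path)),
    }

def write_manifest(paths, manifest="lab_manifest.json"):
    """Write hashes and timestamps for the raw files turned in with a lab report."""
    with open(manifest, "w") as f:
        json.dump([file_record(p) for p in paths], f, indent=2)

# Example (hypothetical file names):
# write_manifest(["raw_data_run1.csv", "setup_photo_run1.jpg"])
```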
It’s too easy to blame the government, which has entrusted matters of academic and scientific integrity to the diligence of teachers and scientists. We should all be doing our part in our respective areas of work to maintain integrity. How many scientists and students have you busted in the past decade?
I grew up working in bars and restaurants in New Orleans and viewed education as a path out of menial and dangerous work environments, so I majored in Physics at LSU. After being a finalist for the Rhodes Scholarship, I was offered graduate research fellowships from both Princeton and MIT, and I completed a PhD in Physics at MIT in 1995. I have published papers in theoretical astrophysics, experimental atomic physics, chaos theory, quantum theory, acoustics, ballistics, traumatic brain injury, epistemology, and education.
My philosophy of education emphasizes that each individual has tremendous potential for accomplishment and that achieving that potential requires effort across a broad range of disciplines, including music, art, poetry, history, literature, science, math, and athletics. As a younger man, I enjoyed playing basketball and Ultimate. Now I play tennis and mountain bike 2000 miles a year.