Undergraduate interest in research is a good thing; it’s even better when students aspire to publish their work for review and consideration by a broader audience. First, we should consider what it means to be publishable.
Usually, “publishable” means a paper contains a novel and interesting result in either theory or experiment that is more likely than not to be correct.
“Novel” is a bit easier to understand objectively: it means the same result has not been published previously.
“Interesting” is more subjective. Often in the search for “novel,” scientists (including undergrads) go off into the weeds, because accessible theory and experiments that have not been previously published are more likely in areas where no one has cared enough to work very hard. This tends to make them less “interesting.”
Undergrads struggle with research ideas because they often assume their work needs to fall within the domain of new fundamental science of the sort suitable for the Physical Review, when their skills and scientific maturity have not yet empowered them for that level of contribution. As mentors of many undergraduate (and high school) research projects, we’ve found several other niches that fit the skill sets and scientific maturity more common among undergraduates (and high school students):
Inventing new instruments and techniques (or revisiting the usefulness of existing ones with faster/cheaper technology)
Device for Underwater Laboratory Simulation of Unconfined Blast Waves
One of our students made a veritable cottage industry of inventing new, inexpensive laboratory devices for conducting blast wave research, and he’ll enter college in the fall with six peer-reviewed papers. The key is to find a field where the development of new instruments is within the capabilities of the student.
Explicit numerical integration was too slow to be interesting or useful for computing Fourier transforms in the first decades after computers came into wide use, but this paper asked whether there was any advantage to revisiting an abandoned technique now that computers are fast enough to do it “the old-fashioned way.” There is plenty of room for an experimental/computational approach investigating the benefits of simple tweaks on tried-and-true methods, or of methods left behind as too slow for the computers of an earlier generation.
A More Accurate Fourier Transform
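As a toy illustration of the trade-off discussed above (a sketch, not the paper’s actual method), the following compares an explicit O(N²) summation of the discrete Fourier transform against NumPy’s FFT; the two agree to floating-point precision, and the only difference is speed:

```python
import numpy as np

# Sample a simple signal: a 5 Hz sine sampled at 64 points over 1 s.
N = 64
t = np.arange(N) / N
x = np.sin(2 * np.pi * 5 * t)

def dft_explicit(x):
    """'Old-fashioned' DFT by explicit summation, O(N^2) operations."""
    N = len(x)
    n = np.arange(N)
    k = n.reshape(-1, 1)
    return (x * np.exp(-2j * np.pi * k * n / N)).sum(axis=1)

X_slow = dft_explicit(x)     # explicit numerical summation
X_fast = np.fft.fft(x)       # O(N log N) fast Fourier transform

# Both methods produce the same spectrum to floating-point precision.
print(np.max(np.abs(X_slow - X_fast)) < 1e-9)  # True
```

On a modern machine the explicit sum is entirely practical for modest N, which is exactly the kind of reconsideration the paragraph above describes.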
I think there will always be a niche for achieving comparable measurement accuracy with instrumentation that costs 1% to 10% of the state-of-the-art equipment used by well-funded, world-class operations. This paper demonstrates an approach to measuring drag coefficients with inexpensive hobbyist equipment.
A rifle is one of the simplest forms of internal combustion engine, and measuring the friction of a bullet in the barrel was one of the outstanding unsolved problems in internal ballistics for many decades. This method was discovered in a student project by reanalyzing data acquired in a different project on bullet stability.
The problem of predicting bullet penetration and tissue damage from first principles is one of the most important unsolved problems in terminal ballistics. Addressing those issues is beyond the skill set of most undergraduates, but development of experimental methods to collect the data for testing and refining new theoretical approaches is well within their abilities – combining first year physics and calculus in very practical ways.
Novel experiments that are interesting because of environmental applications
Environmental applications make any project more interesting. The strongest trend in ballistics over the last 20 years is getting the lead out to reduce human exposure and environmental impacts. The industry has lots of promising new technologies and products, and the results of well-designed independent testing will always get their fair share of attention.
Researchers made a big splash a few years ago demonstrating the effectiveness of magnetic hooks in reducing shark bycatch. This result had potential applications in longline fishing, but depended on the unproven assumption that magnets would affect the catch rate of elasmobranchs (sharks and rays) but not teleosts (bony fishes). The experiment was simple: find a magnet whose field is comparable to the earth’s magnetic field at 0.5 m, attach it to some hooks, go fishing, and record the catch rates on the magnetic hooks and a non-magnetic control group. Lots of species left to test …
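The magnet-sizing step amounts to a back-of-the-envelope dipole calculation. A minimal sketch, assuming the on-axis field of a point dipole and a nominal 50 μT for Earth’s field (illustrative values, not figures from the paper):

```python
# On-axis field of a magnetic dipole: B = (mu0 / 4*pi) * 2*m / r^3.
# Solve for the dipole moment m that matches Earth's field at r = 0.5 m.
mu0_over_4pi = 1e-7      # T*m/A, exact in SI
B_earth = 50e-6          # T; Earth's field varies roughly 25-65 uT
r = 0.5                  # m, distance at which fields should match

m = B_earth * r**3 / (2 * mu0_over_4pi)
print(m)  # 31.25 A*m^2 -- a sizable but obtainable NdFeB magnet
```

The point of the sketch is only that freshman-physics arithmetic suffices to pick a suitable magnet before going fishing.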
Novel experiments that are interesting because of educational applications
The potential research projects here run the spectrum from new and interesting undergraduate laboratories to testing the specifications of lab research gear to just having fun. This project was born when the student realized that a transparent tube and a high-speed video camera could accurately measure what happens inside potato cannons.
This project was born when a student realized a high-speed video camera was a great tool for measuring deflagration velocities. With the wider availability and lower prices of high-speed cameras, there are many other fuel-air and oxy-fuel combinations that can be tested in a wide array of geometries. In addition to the educational interest, there are important questions in chemical kinetics and deflagration-to-detonation transitions that can be answered with relatively simple experiments.
High-speed video is a great experimental tool because it allows accurate determination of position vs. time for a visual signal. But in many cases the kinematics can be determined with a microphone and an appropriate sound signal. Think of what Galileo could have done with a sound card.
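As a toy illustration of the microphone approach (synthetic data, not an actual experiment), the sketch below recovers event times from an audio trace by simple threshold crossing; with real recordings, the same idea yields time intervals between impacts, bounces, or shots:

```python
import numpy as np

# Synthetic one-second "recording": three impulsive clicks stand in
# for real sounds (a ball bouncing, a hammer strike, a muzzle blast).
fs = 44100                           # sample rate in Hz
t = np.arange(fs) / fs
signal = np.zeros_like(t)
for click in (0.10, 0.55, 0.90):     # true event times in seconds
    signal[int(click * fs)] = 1.0

# Recover event times: find samples where the trace exceeds a threshold.
above = np.flatnonzero(np.abs(signal) > 0.5)
event_times = above / fs

print(event_times)  # recovers ~[0.10, 0.55, 0.90]
```

At 44.1 kHz, timing resolution is about 23 μs, which is why an ordinary sound card can support surprisingly precise kinematics.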
Finding mistakes in published papers and writing comments pointing them out
Unless a student has really keen error-detection skills and voluminous reading habits, they may not catch many mistakes in published papers. But most well-read faculty members and active scientists know where the bodies are buried and can point students to the low-hanging fruit. This paper responds to a “peer-reviewed” paper with both fundamental errors in the statistics and exaggerated claims in the abstract.
My wife found the original errors when mentoring a student on a related project. We didn’t have the time to follow up, so we set it aside until we had students looking for a project. They did a great job tracking down and documenting the mistakes. All that was required was high school math and common sense.
If you see something, say something – or find a student to track down the details and write up the paper. I don’t know which bothers me more: the original mistake, or the fact that it still has not been fixed nearly four YEARS after we pointed it out. We can “March for Science” until we are blue in the face, but if science cannot demonstrate the “self-correction” it claims, large segments of the public will remain justified in “denying” claims from parties that prove themselves less than trustworthy.
When public policy is concerned, there is often something of a shell game where the predictions get more press than a retrospective comparison between predictions and experiments. Since the publication of this student paper, I am happy to report that the scientists making their predictions have improved their models and the resulting accuracy of their predictions.
Review/hypothesis papers bringing together different fields that are related, but not well connected in the literature
The ideas for these papers require a broad knowledge of the literature in related fields, suggesting that the genesis of these hypotheses more likely lies with faculty mentors than with students. But many (perhaps even most) active researchers have hypotheses swirling around in the back of their minds that a bit of student prodding can bring forward. Many faculty have neither the time nor the interest to do all the literature work and writing needed to produce a coherent paper, but they tend to be happy when free labor appears that is willing to do the harder work: responding to brainstorming sessions, throwing logical diagrams and outlines on a whiteboard, and doing the legwork to get hypotheses and underappreciated connections into print for colleagues to consider.
Our paper on nutrient loading and red snapper production was born from reading about the red snapper management debate while closely following the literature on “dead zones” in the Gulf of Mexico purportedly resulting from nutrient loading by farm fertilizers flowing down the Mississippi River. We were perplexed, because fishing on Louisiana’s Gulf coast is among the best in the United States, and we knew from firsthand experience that those waters are teeming with life and anything but a “dead zone.” The light bulb went on when reading a thesis from LSU that observed fisheries biomass was highly correlated with proximity to the mouth of the Mississippi River.
Our magnetic shark deterrent paper is just one of several cases where biologists’ progress had stalled for want of ongoing mastery of freshman physics. Consequently, the literature had one basic hypothesis (sharks can detect magnets), but experimental designs to distinguish possible magnetoreception mechanisms were lacking, and papers favoring one mechanism over another were more the result of the confirmation biases of the authors than of clear supporting or exclusionary experimental evidence. There was plenty of room for a good undergraduate physics student to articulate the strongest contenders: electromagnetic induction in salt water, direct magnetoreception with macroscopic magnetic materials, and direct magnetoreception with microscopic magnetic materials (biogenic magnetite). We didn’t solve the problem, but at least we articulated the choices clearly to better support improved experimental designs. (For example, dependence on conductivity or current speeds would favor electromagnetic induction.)
Testing products to compare measured values with product specifications
There may be greater interest than we have explored in testing whether laboratory equipment meets its product specifications. Is the thermometer, balance, voltmeter, power meter, spectrum analyzer, etc. as accurate as claimed? Every sensor in the Vernier catalog is a potential project. Most attentive instructors in freshman physics labs will have a good idea which gear is as accurate as claimed and which gear falls short. We’ve taken a somewhat different approach, testing products marketed to hobbyists, sportsmen, the military, and law enforcement.
Testing validity of commonly used equations with little published data supporting how they are used
Most equations in science have some area of applicability where they have been validated as accurate. But over time, usage often expands far beyond the “fine print” relating to the assumptions and conditions where the equations are valid. Experimental tests of these equations to explore their validity in areas of ongoing application can be of great interest. A great habit of mind for any scientist (student or not) is to ask, “Where is the data supporting that well-known equation, and does the data really support the broad usage of how that equation is commonly employed?”
Some models reach widespread use because they are easy to apply and readily available rather than because they are rigorous or consistently make accurate predictions. In the rush to provide some estimate, people with more engineering than scientific mindsets will take the best equation available and run with it rather than giving due consideration to its validation and expected accuracy. When we started blast research in 2007, we noticed that the acoustic impedance model of blast wave transmission was in wide use for designing protective equipment and interpreting experiments. We could find no data supporting how it was being used, so we designed and executed appropriate experiments in both air blast and underwater blast.
Experimental Test of the Acoustic-Impedance Model for Underwater Blast Wave Transmission through Plate Materials
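For readers unfamiliar with the model under test, here is a minimal sketch of the acoustic-impedance prediction for pressure transmission at a single interface (the textbook formula with approximate impedance values; the paper’s actual experiments are far more involved):

```python
# Acoustic-impedance model: for a wave passing from medium 1 into
# medium 2, the transmitted/incident pressure amplitude ratio is
# T = 2*Z2 / (Z1 + Z2), where Z = rho*c is the acoustic impedance.
def pressure_transmission(Z1, Z2):
    """Transmitted pressure fraction at a single planar interface."""
    return 2 * Z2 / (Z1 + Z2)

# Approximate textbook impedances in kg/(m^2 s).
Z_water = 1.48e6
Z_steel = 45e6

# Water -> steel: the model predicts pressure amplitude nearly doubles.
print(round(pressure_transmission(Z_water, Z_steel), 2))  # 1.94
```

Whether chaining such interface coefficients through a plate actually predicts measured blast transmission is precisely the question the experiments above were designed to answer.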
Ideas for the next two student papers came from considering experimental possibilities that make use of regular travel between Colorado (elevation above 7000 ft) and Louisiana (sea level). Students brainstormed ways to conduct one-day experiments with existing equipment, which led them to scholarly searches for formulas purporting to predict effects related to air pressure, air density, or altitude. There are lots of formulas predicting these effects; the challenge was to find examples that were undersupported by experimental data.
The formula giving the proportionality of aerodynamic drag to air density appears in nearly every introductory physics textbook. But many fail to clarify that the drag coefficient is not constant: it depends on Reynolds number, which in turn depends on air density. The assertion that the drag coefficient of supersonic projectiles is independent of air density is more subtle, being buried in formulas published by Robert McCoy of the Army Ballistics Research Laboratory at Aberdeen Proving Ground in Maryland. Digging into the experimental support for those formulas showed that the data really supported independence of the drag coefficient from Reynolds number (over the applicable range of projectile diameters, not in an absolute sense), and independence from Reynolds number was used to infer independence from air density. However, all the supporting experimental data was gathered at the sea-level facilities in Maryland, so the air density range was too limited to provide direct support for independence of the drag coefficient from air density.
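The density dependence of Reynolds number is a one-line calculation. A hedged sketch with illustrative values (not the paper’s data) shows that the Colorado-to-Louisiana trip alone shifts Re by roughly 25% at fixed velocity:

```python
# Reynolds number Re = rho * v * d / mu for a small-caliber bullet.
# All numbers are illustrative assumptions, not the paper's data.
def reynolds(rho, v, d, mu):
    """Reynolds number for a projectile of diameter d at speed v."""
    return rho * v * d / mu

mu_air = 1.8e-5     # Pa*s; air viscosity is nearly independent of pressure
v = 900.0           # m/s, typical rifle muzzle velocity
d = 0.00782         # m, nominal .30 caliber bullet diameter

rho_sea = 1.225     # kg/m^3, standard sea-level air density
rho_high = 0.98     # kg/m^3, rough value near 7000 ft elevation

ratio = reynolds(rho_sea, v, d, mu_air) / reynolds(rho_high, v, d, mu_air)
print(ratio)  # ~1.25: density alone shifts Re by about 25%
```

A 25% swing in Re is exactly the lever the travel between elevations provides for testing whether the drag coefficient really is independent of air density.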
The formula for the dependence of rocket motor thrust on ambient pressure was found at a NASA website. Due diligence in literature searches suggested an absence of published experimental support for the formula. No literature search is perfect, so it was unclear whether supporting data had been published in an obscure book or journal inaccessible from the available libraries and search engines, or whether supporting data sat unpublished but available to NASA and Department of Defense rocket scientists. In either case, the difficulty in finding supporting data motivated a nice experiment that could be conducted with relatively inexpensive hobbyist rocket motors.
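The NASA formula in question is F = ṁ·Ve + (pe − p0)·Ae, where p0 is the ambient pressure. A minimal sketch with assumed hobbyist-scale numbers (all values illustrative) shows why thrust should rise at altitude:

```python
# NASA thrust equation: F = mdot*Ve + (pe - p0)*Ae.
# All numbers below are assumed, hobbyist-scale illustrations.
def thrust(mdot, Ve, pe, p0, Ae):
    """Momentum-flux thrust plus the pressure term at the nozzle exit."""
    return mdot * Ve + (pe - p0) * Ae

mdot = 0.05         # kg/s, propellant mass flow
Ve = 800.0          # m/s, effective exhaust velocity
pe = 101325.0       # Pa, nozzle exit pressure (matched at sea level)
Ae = 5e-4           # m^2, nozzle exit area

F_sea = thrust(mdot, Ve, pe, 101325.0, Ae)   # sea-level ambient pressure
F_alt = thrust(mdot, Ve, pe, 78500.0, Ae)    # ~7000 ft ambient pressure

print(F_sea, F_alt)  # thrust is larger at altitude (lower ambient pressure)
```

Measuring that predicted difference at two elevations with the same motors is the kind of one-day experiment the students were looking for.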
This is probably the niche that requires the most background work and guidance from a mentor to identify, because the idea to test how the formula is being used usually originates with the recognition of an ABSENCE of supporting data. Gaining confidence that there is really an absence of supporting data in the literature requires an extremely thorough background literature search. But note that in 3 or 4 of the cases above, the new (and relatively simple) experimental result showed that the application of the well-known formula was inappropriate. Formulas without supporting data are wrong a lot of the time.
Our niches are unlikely to produce significant advances in FUNDAMENTAL physics. The skills and resources required for such advances are usually beyond the abilities of undergraduates. But there is a lot of good, solid science to be done in the niches we find useful. Most of the discussion among my physics colleagues would not center on whether these papers are “publishable” (since they are all published), but on whether they are “physics” of the sort suitable for undergrad research. Each institution sets its own standards on that. But one can certainly have some fun and accomplish some solid research with a MythBusters mindset.