
Science Vulnerability to Bugs

  1. Jul 15, 2016 #1

    anorlunda

    Science Advisor
    Gold Member

I'm reposting this here because it raises a fascinating kind of vulnerability (new to me): any science that uses software for analysis is vulnerable to a common-mode error. The thought that 40,000 scientific teams were fooled is shocking. It is a good post, and it links to sources both supporting and opposing the conclusions.

I would think that this issue supports mandating that the raw data of all scientific studies be open sourced and archived publicly. That way, the data could be re-processed in the future when improved (or corrected) tools become available, and published conclusions could be automatically updated or automatically deprecated.
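The update-or-deprecate idea above can be sketched in a few lines. This is a minimal illustration, not anything Physics Forums or any archive actually implements; the function names (`archive_record`, `needs_reprocessing`) and the fMRI/tool-version example values are hypothetical, chosen only to show the principle of tagging archived raw data with the exact tool version that produced a conclusion:

```python
import hashlib

def archive_record(raw_data: bytes, tool_name: str, tool_version: str, conclusion: str) -> dict:
    """Bundle a fingerprint of the raw data with the exact analysis-tool
    version that produced the published conclusion, so the analysis can
    be re-run (or the conclusion deprecated) later."""
    return {
        "data_sha256": hashlib.sha256(raw_data).hexdigest(),
        "tool": tool_name,
        "tool_version": tool_version,
        "conclusion": conclusion,
    }

def needs_reprocessing(record: dict, buggy_versions: set) -> bool:
    """Flag any archived result produced by a tool version later found
    to be faulty."""
    return record["tool_version"] in buggy_versions

# Hypothetical example: a study analyzed with version 8.0 of some package.
record = archive_record(b"raw voxel data ...", "analysis-pkg", "8.0", "cluster significant")
if needs_reprocessing(record, buggy_versions={"8.0", "8.1"}):
    print("conclusion deprecated pending re-analysis of archived raw data")
```

The key design point is that the archive stores the raw data (here just its hash) alongside the software provenance, so a later bug report in the tool can be mechanically matched against every published conclusion that depended on it.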
     
  3. Jul 15, 2016 #2

    jedishrfu

    Staff: Mentor

    Yes, this is what is done at some professional labs, especially where new models of analysis are being developed and you have the need to compare the existing model with the newer faster/better one.

However, bugs such as this are very difficult to uncover, with the Intel Pentium bug as a notable example. A bug can be in the sensor electronics used to measure something, in the processor hardware, in faulty memory or storage, in firmware or a driver, in library software, or in the application itself. Testing is done at each level with varying degrees of coverage, and the final application is usually the least tested.

It also brings back the notion that everything we do is essentially a house of cards, and I agree we need to prepare for the inevitable with backups of key data, or risk having to rerun an experiment.

    More on the Pentium bug:

    https://en.wikipedia.org/wiki/Pentium_FDIV_bug

    https://www.cs.earlham.edu/~dusko/cs63/fdiv.html
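The FDIV flaw linked above was commonly demonstrated with a single floating-point identity check. The sketch below uses the widely circulated test operands (4195835 / 3145727); on correct hardware the residual is essentially zero, while a flawed Pentium returned a wrong quotient and a residual of about 256:

```python
def fdiv_check() -> float:
    """Classic sanity check popularized after the 1994 Pentium FDIV bug.
    Computes x - (x / y) * y, which should be (essentially) zero on a
    correct FPU; the flawed Pentium divider produced a residual near 256
    for these operands."""
    x = 4195835.0
    y = 3145727.0
    return x - (x / y) * y

residual = fdiv_check()
print("FPU divide looks OK" if abs(residual) < 1.0 else "FPU divide error!")
```

On modern IEEE-754 hardware the residual is at worst a tiny rounding artifact, which is why a threshold like `1.0` cleanly separates a healthy FPU from the famously wrong answer.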
     
    Last edited by a moderator: Jul 15, 2016
  4. Jul 15, 2016 #3

    anorlunda

    Science Advisor
    Gold Member

It is easy to visualize (but perhaps hard to implement) a system in which all digitally published papers contain links to their dependencies: citations of prior work, plus links to hardware and software dependencies, for example. Thereafter, there would be two ways to go.
    1. Using bidirectional links: When the object depended on changes, reverse links can notify the authors of all dependent papers.

2. Using unidirectional links: When a work is called up for viewing, its dependency links can be checked. If any are found to point to a retracted, revised, or deleted object, the viewer of the dependent work can be warned. Links can be followed recursively all the way down. The gold standard would be for a viewer to refuse to read or cite any paper with less than flawless dependencies. If that proves too bothersome, viewers could adopt a bronze or a lead standard along a continuum of choices.
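Option 2 above amounts to a recursive walk over the dependency links. Here is a minimal sketch of that idea; the `Paper` structure and `flawed_dependencies` function are hypothetical, invented only to illustrate the recursive check a viewer might run:

```python
from dataclasses import dataclass, field

@dataclass
class Paper:
    """Hypothetical record for a digitally published work with
    unidirectional links to the works it depends on."""
    title: str
    retracted: bool = False
    dependencies: list = field(default_factory=list)

def flawed_dependencies(paper: Paper, _seen=None) -> list:
    """Recursively collect every retracted work this paper depends on,
    directly or transitively (descending 'to the bottom')."""
    _seen = _seen if _seen is not None else set()
    flaws = []
    for dep in paper.dependencies:
        if id(dep) in _seen:      # guard against shared/cyclic links
            continue
        _seen.add(id(dep))
        if dep.retracted:
            flaws.append(dep.title)
        flaws.extend(flawed_dependencies(dep, _seen))
    return flaws

# A retracted study, a follow-up built on it, and a meta-analysis on top.
base = Paper("Flawed study", retracted=True)
middle = Paper("Follow-up", dependencies=[base])
top = Paper("Meta-analysis", dependencies=[middle])
print(flawed_dependencies(top))  # non-empty, so a "gold standard" viewer refuses it
```

A viewer enforcing the gold standard would refuse any paper for which this list is non-empty; the bronze or lead standards would instead tolerate some number or depth of flawed links.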
     