
How is percentage uncertainty different from standard deviation

  1. Jun 14, 2012 #1
    How is percentage uncertainty different from standard deviation? I have five measurements, and I calculated the average, standard deviation, and variance. Do I need to calculate the percentage uncertainty? Does percentage uncertainty give me any information that the values I calculated above don't?
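
    For reference, a minimal sketch of the calculation described above, in Python; the five measurement values are made up for illustration:

    [code]
    # Average, sample standard deviation and sample variance of five readings.
    from statistics import mean, stdev, variance

    measurements = [9.8, 10.1, 9.9, 10.2, 10.0]  # hypothetical data

    avg = mean(measurements)       # average of the five readings
    s = stdev(measurements)        # sample standard deviation (n - 1 divisor)
    var = variance(measurements)   # sample variance, i.e. s**2

    print(avg, s, var)
    [/code]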
     
  3. Jun 15, 2012 #2

    haruspex

    Science Advisor
    Homework Helper
    Gold Member

    How would you define percentage uncertainty? In some cases you might know that a distribution is bounded. E.g. a measurement error might have a uniform distribution in a known range, so in that case you could quote an error range, as percentage or otherwise.
    Or if you define uncertainty as some number of standard deviations, I suppose you could divide that by the mean. However, the mean you divide by has its own uncertainty, so the resulting ratio might not be the least biased estimate.
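
    A sketch of that definition in Python, with k = 1 standard deviation and the same made-up readings as above:

    [code]
    # Percentage uncertainty as k standard deviations divided by the mean.
    from statistics import mean, stdev

    measurements = [9.8, 10.1, 9.9, 10.2, 10.0]  # hypothetical data

    k = 1  # number of standard deviations quoted as the uncertainty
    pct_uncertainty = 100 * k * stdev(measurements) / mean(measurements)
    print(f"{pct_uncertainty:.1f}%")
    [/code]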
     
  4. Jun 15, 2012 #3

    I like Serena

    Homework Helper

    Usually you would measure and report an average and a standard deviation.
    From these you can derive all the other quantities.

    The reason to calculate a variance is to use it in follow-up calculations.
    When you add independent measured quantities, you add their variances to find the variance of the sum (see the sketch below).
    You should not simply add standard deviations; they combine in quadrature.
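
    A minimal sketch of the variance-addition rule, with hypothetical variances:

    [code]
    # For independent quantities, variances add; standard deviations do not.
    import math

    var_x = 0.04  # variance of x (sigma_x = 0.2)
    var_y = 0.09  # variance of y (sigma_y = 0.3)

    var_sum = var_x + var_y         # variances add: 0.13
    sigma_sum = math.sqrt(var_sum)  # ~0.36, not 0.2 + 0.3 = 0.5
    print(var_sum, sigma_sum)
    [/code]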

    The reason to calculate a percentage uncertainty is:
    1. To get a sense of how accurate the measurement is.
    As a rule of thumb, an uncertainty of less than 2% is often deemed negligible.
    2. To use in follow-up calculations.
    When you multiply a measurement by a constant, the absolute uncertainty scales by that constant but the percentage uncertainty is unchanged; when you multiply two measurements, their percentage uncertainties add (in quadrature if the errors are independent), as in the sketch below.
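
    A sketch of both multiplication rules, with hypothetical values:

    [code]
    # Percentage uncertainty under multiplication.
    import math

    x, pct_x = 10.0, 2.0  # x = 10.0 with 2% uncertainty
    y, pct_y = 5.0, 1.5   # y = 5.0 with 1.5% uncertainty

    # Multiplying by a constant c scales the absolute uncertainty only:
    c = 3.0
    abs_unc = c * x * pct_x / 100      # 0.6, three times x's 0.2
    print(abs_unc, pct_x)              # percentage uncertainty still 2%

    # Product of two independent measurements: percentages add in quadrature.
    pct_xy = math.hypot(pct_x, pct_y)  # sqrt(2.0**2 + 1.5**2) = 2.5%
    print(x * y, pct_xy)
    [/code]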
     
  5. Jun 15, 2012 #4

    chiro

    Science Advisor

    Following on from I like Serena's reply: variance is a natural measure in probability and statistics, and variances, as defined, turn out to have convenient properties for analyzing uncertainty in general.

    One example is that variances add for independent random variables: VAR[X+Y] = VAR[X] + VAR[Y]; also COV(X,X) = VAR[X].
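
    A quick numerical check of the first identity in Python (sample size arbitrary):

    [code]
    # Monte Carlo check of VAR[X+Y] = VAR[X] + VAR[Y] for independent X, Y.
    import random
    from statistics import variance

    random.seed(0)
    xs = [random.gauss(0, 2) for _ in range(100_000)]  # VAR[X] = 4
    ys = [random.gauss(0, 3) for _ in range(100_000)]  # VAR[Y] = 9

    sums = [x + y for x, y in zip(xs, ys)]
    print(variance(sums))  # close to 4 + 9 = 13
    [/code]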

    Plus you get all kinds of nice things, especially with normal distributions (where the variance/standard deviation is a natural parameter).

    There are other things but this gives you an idea of why it is useful.
     
  6. Jun 15, 2012 #5
    So I guess the variance is good enough. After adding up the variances I just end up calculating the standard deviation, and I will use that as the error in my equipment.
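
    A minimal sketch of that plan, with hypothetical error variances:

    [code]
    # Combine independent error variances, then quote one standard deviation.
    import math

    variances = [0.04, 0.01, 0.02]  # hypothetical variances of the error sources
    equipment_error = math.sqrt(sum(variances))
    print(equipment_error)          # combined standard deviation, ~0.26
    [/code]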
     
  7. Jun 15, 2012 #6

    I like Serena

    Homework Helper

    Sounds good. :)
     