How is percentage uncertainty different from standard deviation

In summary, percentage uncertainty expresses the uncertainty of a measurement relative to the measured value, and it is useful in follow-up calculations.
  • #1
Nyasha
How is percentage uncertainty different from standard deviation? I have five measurements, and I calculated the average, standard deviation, and variance. Do I need to calculate the percentage uncertainty? Does percentage uncertainty give me any information that the values I calculated above don't?
 
  • #2
How would you define percentage uncertainty? In some cases you might know that a distribution is bounded. E.g. a measurement error might have a uniform distribution in a known range, so in that case you could quote an error range, as percentage or otherwise.
Or, if you define uncertainty as some number of standard deviations, I suppose you could divide that by the mean. However, the mean you divide by has uncertainty of its own, so the resulting ratio might not be an unbiased estimate.
 
  • #3
Usually you would measure and report an average and a standard deviation.
From these you can derive the other quantities.

The reason to calculate a variance is to use it in follow-up calculations.
When you add numbers, you also have to add their variances to find the new variance.
You should not add standard deviations.

The reasons to calculate a percentage uncertainty are:
1. To get a sense of how accurate the measurement is.
As a rule of thumb, an uncertainty of less than 2% is often deemed negligible.
2. To use in follow-up calculations.
When you multiply by an exact number, the percentage uncertainty is unchanged, and the absolute uncertainty of the result is the result multiplied by that percentage uncertainty. (When you multiply two measured quantities together, their percentage uncertainties add, to first order.)
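A minimal Python sketch of these rules, using made-up numbers for the five measurements:

```python
import math

# Five hypothetical measurements (values are made up for illustration)
x = [10.1, 9.8, 10.3, 10.0, 9.9]

n = len(x)
mean = sum(x) / n
var = sum((xi - mean) ** 2 for xi in x) / (n - 1)  # sample variance
sd = math.sqrt(var)
pct = 100 * sd / mean  # percentage uncertainty

# Sum of two independent measurements: variances add, not SDs
sd_of_sum = math.sqrt(var + var)

# Multiplying by an exact constant c: the percentage uncertainty is
# unchanged, and the absolute uncertainty scales with the result
c = 3.0
abs_unc_scaled = (c * mean) * pct / 100  # equals c * sd

print(round(mean, 3), round(sd, 3), round(pct, 2))
```

Note that `sd_of_sum` is `sqrt(2) * sd`, not `2 * sd`, which is exactly the "add variances, not standard deviations" rule.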
 
  • #4
Following on from I like Serena's reply, variance is a natural measure in probability and statistics, and it turns out that you get nice properties with respect to general forms of uncertainty analysis by using variances as they are defined.

One example is the additivity of variances for independent random variables: Var[X + Y] = Var[X] + Var[Y], together with Cov(X, X) = Var[X].

Plus you get all kinds of nice things, especially with normal distributions (where the variance/standard deviation is a natural parameter).

There are other things but this gives you an idea of why it is useful.
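A quick simulation (with arbitrary parameters) illustrating that variances add for independent random variables:

```python
import random
import statistics

random.seed(0)

# Two independent samples; for independent X and Y we expect
# Var[X + Y] to be close to Var[X] + Var[Y].
N = 100_000
xs = [random.gauss(0, 2) for _ in range(N)]  # Var[X] = 4
ys = [random.gauss(0, 3) for _ in range(N)]  # Var[Y] = 9

var_x = statistics.pvariance(xs)
var_y = statistics.pvariance(ys)
var_sum = statistics.pvariance([a + b for a, b in zip(xs, ys)])

print(var_x + var_y, var_sum)  # both close to 13
```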
 
  • #5
I like Serena said:
Usually you would measure and report an average and a standard deviation. [...]

So I guess the variance is good enough. After adding up the variances I'll just calculate the standard deviation and use that as the error in my equipment.
 
  • #6
Nyasha said:
So I guess the variance is good enough. After adding up the variances I'll just calculate the standard deviation and use that as the error in my equipment.

Sounds good. :)
 

1. What is percentage uncertainty?

Percentage uncertainty is a measure of the error or variability in a set of data relative to its size. It is usually expressed as a percentage of the mean value of the data.

2. How is percentage uncertainty calculated?

To calculate percentage uncertainty, divide the standard deviation of the data by the mean value, and then multiply by 100. This will give you the percentage uncertainty value.
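That calculation can be sketched in a few lines of Python (the readings below are made up for illustration):

```python
import statistics

# Hypothetical readings (values made up for illustration)
data = [4.9, 5.1, 5.0, 5.2, 4.8]

mean = statistics.mean(data)
sd = statistics.stdev(data)  # sample standard deviation

percentage_uncertainty = 100 * sd / mean
print(f"{percentage_uncertainty:.2f}%")
```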

3. What is the difference between percentage uncertainty and standard deviation?

Percentage uncertainty is a relative measure of error, while standard deviation is an absolute measure of variability expressed in the same units as the data. Percentage uncertainty takes the magnitude of the mean into account, while standard deviation does not.

4. Why is percentage uncertainty important?

Percentage uncertainty is important because it allows scientists to assess the reliability and accuracy of their data. It also helps in comparing different sets of data and identifying outliers or unusual data points.

5. Can percentage uncertainty and standard deviation have the same value?

Yes, the two can be numerically equal. Since percentage uncertainty is (standard deviation ÷ mean) × 100, the two values coincide whenever the mean of the data is 100. This is a coincidence of scale, not a property of the shape of the distribution.
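For instance, with the made-up data below the mean is exactly 100, which is the case where percentage uncertainty = (SD ÷ mean) × 100 equals the standard deviation itself:

```python
import statistics

# Made-up data whose mean is exactly 100
data = [99.0, 100.0, 101.0]

mean = statistics.mean(data)  # 100.0
sd = statistics.stdev(data)   # 1.0
pct = 100 * sd / mean         # 1.0

print(sd, pct)  # the two values coincide
```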
