Right way to account for non-statistical uncertainties

In summary: the non-statistical uncertainties ##\Delta \lambda## and ##\Delta I## are not evaluated statistically but are taken from the manufacturer's specifications. For example, for the laser intensity (which is actually the power of the laser), the specification is ##20 \pm 0.1## mW, so ##I = 20## mW and ##\Delta I = 0.1## mW.
  • #1
BillKet
Hello! I have data from a spectroscopy experiment of the form ##(\lambda, N)##, where ##\lambda## is the laser wavelength and ##N## is the number of photon counts at that wavelength. The structure of the obtained spectrum is pretty simple: a few peaks that need to be fitted with a Lorentzian profile. If I account only for the statistical nature of the experiment, the wavelength can be considered fixed (no errors associated with it), while the number of counts has a Poisson error, i.e. ##\sqrt{N}##. Doing the fit like this works well and I get reasonable values (using available fitting packages, like the ones in Python for chi-square minimization).

However, I would now like to add the effects of the non-statistical uncertainties. On the x-axis they come from the uncertainty on the laser wavelength, ##\Delta \lambda##, and on the y-axis from the uncertainty in the laser intensity, ##\Delta I##. For the y-axis we can assume that the number of counts is proportional to the intensity, so ##\Delta N_I = \frac{\Delta I}{I} N##, with ##\Delta I## and ##I## being known.

How should I include these errors in my analysis? What I am thinking of doing is simply attaching these errors to the values I have, so for the x-axis each point becomes ##\lambda \pm \Delta \lambda##, and for the y-axis ##N \pm \sqrt{N + \left(\frac{\Delta I}{I} N\right)^2}##, where the first term under the radical is the statistical error (which was there before, too) and the second one is the non-statistical one. So in the end I have a data set with errors on both the x and y variables. I can also easily fit this using Python, but is my approach correct? Am I accounting for these extra errors in the right way? Thank you!
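For concreteness, here is a minimal sketch of one way to fit with error bars on both axes in Python, using scipy.odr (orthogonal distance regression), which accepts uncertainties on x and y. The spectrum, peak shape, starting values, and numerical uncertainties below are hypothetical placeholders standing in for the real data, not the values from this thread.

```python
import numpy as np
from scipy import odr

# Hypothetical stand-in spectrum (replace with the measured data)
rng = np.random.default_rng(0)
wavelength = np.linspace(650.0, 651.0, 200)                      # nm

def lorentzian(beta, x):
    # Lorentzian peak on a flat background
    amp, x0, gamma, bkg = beta
    return amp * gamma**2 / ((x - x0)**2 + gamma**2) + bkg

counts = rng.poisson(lorentzian([200.0, 650.5, 0.01, 5.0], wavelength)).astype(float)

d_lambda = 0.001            # hypothetical laser wavelength uncertainty (nm)
dI_over_I = 0.1 / 20.0      # relative intensity uncertainty, e.g. 0.1 mW on 20 mW

# Per-point uncertainties: the laser specification on x,
# Poisson and intensity terms added in quadrature on y
sx = np.full_like(wavelength, d_lambda)
sy = np.sqrt(np.maximum(counts, 1.0) + (dI_over_I * counts)**2)  # guard against empty bins

data = odr.RealData(wavelength, counts, sx=sx, sy=sy)
model = odr.Model(lorentzian)
fit = odr.ODR(data, model, beta0=[200.0, 650.5, 0.01, 5.0]).run()

print("fitted parameters   :", fit.beta)
print("parameter std errors:", fit.sd_beta)
```

ODR is just one option here; if ##\Delta \lambda## is negligible compared with the peak widths, an ordinary weighted chi-square fit using only the combined y-uncertainty gives essentially the same result.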
 
  • #2
Usually you should square them to get a variance. Then add the variances. (Assuming independence)
 
  • #3
Dale said:
Usually you should square them to get a variance. Then add the variances. (Assuming independence)
Thank you for your reply. What should I square?
 
  • #4
BillKet said:
Thank you for your reply. What should I square?
The standard uncertainty (standard deviation) of the source of uncertainty. For instance, for ##\Delta \lambda## the laser manufacturer probably lists a specification.
 
  • #5
Dale said:
The standard uncertainty (standard deviation) of the source of uncertainty. For instance, for ##\Delta \lambda## the laser manufacturer probably lists a specification.
Sorry, I am a bit confused. Yes, the non-statistical errors ##\Delta \lambda## and ##\Delta I## are known, as well as ##I## itself (the central value). I am not sure what I should add together. For the x-axis, ##\Delta \lambda## is the only source of error; shouldn't I just keep it as it is, i.e. ##\lambda \pm \Delta \lambda##? And for the y-axis I added them in quadrature and took the square root, as in the original post. What am I missing? Thank you!
 
  • #6
BillKet said:
I am not sure what I should add together. For the x-axis, ##\Delta \lambda## is the only source of error; shouldn't I just keep it as it is, i.e. ##\lambda \pm \Delta \lambda##?
I didn't realize it was the only source of error. So indeed the total uncertainty is ##\sqrt{(\Delta \lambda)^2 + 0^2} = \Delta \lambda##.
 
  • #7
Dale said:
I didn't realize it was the only source of error. So indeed the total uncertainty is ##\sqrt{(\Delta \lambda)^2 + 0^2} = \Delta \lambda##.
And is the one for the y-axis correct, too?
 
  • #8
Is the y-axis data completely independent?
 
  • #9
jim mcnamara said:
Is the y-axis data completely independent?
Do you mean whether the counts in each bin are independent of the rest? I think so (though I am not sure what independent would mean here). The number of counts at a given frequency depends on the laser intensity and the laser frequency, so I don't think a measurement in one bin can affect another bin. Please let me know if this is not what you meant.
 
  • #10
BillKet said:
Yes, the non-statistical errors ##\Delta \lambda## and ##\Delta I## are known,
I think you should clarify this. Are these "uncertainties", or are they known? Do you mean that they were errors in the sense of not being controlled, but you know what they were, or that they really are unknown?
 
  • #11
FactChecker said:
I think you should clarify this. Are these "uncertainties", or are they known? Do you mean that they were errors in the sense of not being controlled, but you know what they were, or that they really are unknown?
They are known. They are provided by the manufacturer of the laser system.
 
  • #12
BillKet said:
They are known. They are provided by the manufacturer of the laser system.
Are they known uncertainties or known error values?
 
  • #13
FactChecker said:
Are they known uncertainties or known error values?
I am not sure I understand the difference between the two. For example, for the laser intensity (which is actually the power of the laser), the specification is ##20 \pm 0.1## mW. So in the case above, ##I = 20## mW and ##\Delta I = 0.1## mW.
 
  • #14
BillKet said:
I am not sure I understand the difference between the two. For example, for the laser intensity (which is actually the power of the laser), the specification is ##20 \pm 0.1## mW. So in the case above, ##I = 20## mW and ##\Delta I = 0.1## mW.
Then you are using ##\Delta I## to represent a range of possible, but unknown, errors. That is what I was wondering. This is different from a known error. That means that the true error is a [EDIT] random, unknown variable, i.e. an uncertainty (see @Dale's link in post #16), and should be treated that way.
 
  • #15
FactChecker said:
Then you are using ##\Delta I## to represent a range of possible, but unknown, errors. That is what I was wondering. This is different from a known error. That means that the true error is a random, unknown variable and should be treated that way.
So how should I combine it with the statistical ##\sqrt{N}## error? Also, could you please explain what you mean by the difference between a known uncertainty and a known error? I am not sure what you mean by "a range of possible, but unknown, errors". Isn't this what any error is, i.e. you know a central value, but the true value can be anywhere within a range given by the error associated with that value? (I think I misunderstand what exactly you mean by error in this context.) Thank you!
 
  • #16
BillKet said:
I am not sure I understand the difference between the two.
The difference is that one source of uncertainty is evaluated through statistical means and the other is not.
FactChecker said:
Then you are using ##\Delta I## to represent a range of possible, but unknown, errors. That is what I was wondering. This is different from a known error. That means that the true error is a random, unknown variable and should be treated that way.
The correct way to assess and report uncertainty is given by the NIST guide on uncertainty:

https://www.nist.gov/pml/nist-technical-note-1297

Note that the term "error" is no longer recommended; they recommend using the term "uncertainty" instead.
 
  • #17
Dale said:
The difference is that one source of uncertainty is evaluated through statistical means and the other is not. The correct way to assess and report uncertainty is given by the NIST guide on uncertainty:

https://www.nist.gov/pml/nist-technical-note-1297

Note that the term "error" is no longer recommended; they recommend using the term "uncertainty" instead.
Thanks for your reply! So I am even more confused now. In the title of my post I used the term "non-statistical uncertainties". Is that the right term to use for my case (for ##\Delta \lambda## and ##\Delta I##)?

Also, I thought (and I think I have seen this in all the experimental papers I have read) that the statistical uncertainties and the non-statistical ones (which most papers still call systematics) should be added in quadrature, as they are independent, i.e. the value of ##\Delta I## is the same no matter how many counts we get in a bin (so no matter the value of the statistical error ##\sqrt{N}##). As far as I understand, this is also what is described in the NIST link you sent me. So is what I did in the original post correct, or am I still missing something? (I feel like I am being told that what I did is not right, but I am not sure what is wrong about it.)
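As a concrete check of the quadrature rule with the numbers quoted in this thread (##\Delta I / I = 0.1/20 = 0.005##) and a hypothetical bin with ##N = 10^4## counts:

$$\sigma_N = \sqrt{N + \left(\frac{\Delta I}{I} N\right)^2} = \sqrt{100^2 + 50^2} \approx 112 \ \text{counts},$$

so the intensity term only becomes comparable to the Poisson term once ##\frac{\Delta I}{I} N \gtrsim \sqrt{N}##, i.e. for ##N \gtrsim (I/\Delta I)^2 = 4\times 10^4## counts in this example.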
 

1. What are non-statistical uncertainties?

Non-statistical uncertainties refer to uncertainties that cannot be evaluated by statistical analysis of repeated measurements. They are often caused by factors such as human error, equipment limitations, or external influences.

2. Why is it important to account for non-statistical uncertainties?

Accounting for non-statistical uncertainties is important because it allows for a more accurate and comprehensive understanding of the data and results. Ignoring these uncertainties can lead to biased or incorrect conclusions.

3. How can non-statistical uncertainties be accounted for?

Non-statistical uncertainties can be accounted for by using methods such as sensitivity analysis, error propagation, or Monte Carlo simulations. These methods take into account the potential impact of non-statistical uncertainties on the data and results.
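As an illustrative sketch of the Monte Carlo option in the same Python setting as the thread above: repeatedly perturb the data within the quoted non-statistical uncertainties (a common wavelength offset and a common intensity scale), refit, and take the spread of the fitted parameters as the propagated systematic uncertainty. The spectrum, peak model, uncertainty values, and number of trials below are all hypothetical placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Hypothetical measured spectrum (replace with the real data)
wavelength = np.linspace(650.0, 651.0, 200)                  # nm

def lorentzian(x, amp, x0, gamma, bkg):
    # Lorentzian peak on a flat background
    return amp * gamma**2 / ((x - x0)**2 + gamma**2) + bkg

counts = rng.poisson(lorentzian(wavelength, 200.0, 650.5, 0.01, 5.0)).astype(float)

d_lambda = 0.001           # hypothetical wavelength uncertainty (nm)
dI_over_I = 0.1 / 20.0     # relative intensity uncertainty from the spec sheet

sigma_stat = np.sqrt(np.maximum(counts, 1.0))                # statistical (Poisson) error only
p0 = [200.0, 650.5, 0.01, 5.0]

centres = []
for _ in range(500):
    # One realization of the non-statistical effects
    lam_shift = rng.normal(0.0, d_lambda)                    # common shift of the wavelength axis
    scale = rng.normal(1.0, dI_over_I)                       # common intensity scale factor
    popt, _ = curve_fit(lorentzian, wavelength + lam_shift, counts * scale,
                        p0=p0, sigma=sigma_stat * scale, absolute_sigma=True)
    centres.append(popt[1])                                  # fitted peak centre

# Spread of the fitted centre due to the non-statistical inputs alone
print("systematic uncertainty on the peak centre:", np.std(centres))
```

The spread obtained this way can then be combined in quadrature with the statistical uncertainty reported by the unperturbed fit.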

4. Can non-statistical uncertainties be completely eliminated?

No, non-statistical uncertainties cannot be completely eliminated. However, they can be minimized through proper experimental design, careful data collection and analysis, and the use of appropriate analysis methods.

5. Are non-statistical uncertainties always considered in scientific research?

Unfortunately, non-statistical uncertainties are not always considered in scientific research. However, it is important for scientists to be aware of these uncertainties and to account for them in their work to ensure the validity and reliability of their findings.
