Combining probability distribution functions

hermano
Hi,

I'm comparing different measurement methods. For each measurement method I listed the error components and derived an equation for each of them, and I calculated each component's probability distribution using the Monte Carlo method (evaluating each error 300,000 times, assuming a normal distribution of the input variable). However, the outcome of a Monte Carlo simulation is a probability distribution for each error component under study. I want to combine these separate probability distribution functions per error component for each measurement method into an overall probability distribution function such that I can compare the uncertainty of each measurement method. How can I do this? Does anybody have a good reference?
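
Schematically, each per-component simulation looks something like this (the numbers and the error expression below are only illustrative placeholders, not my actual error models):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 300_000  # Monte Carlo draws per error component

# Illustrative only: one error component whose derived expression depends on a
# normally distributed input variable x.  The mean, spread, and the linear
# expression below are placeholders, not a real error model.
x = rng.normal(loc=10.0, scale=0.1, size=N)   # input variable, assumed normal
error_samples = 0.05 * (x - 10.0)             # analytical error expression evaluated per draw

# The result is one probability distribution (as 300,000 samples) per error component.
print(error_samples.mean(), error_samples.std())
```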
 
hermano said:
Hi,

However, the outcome of a Monte Carlo simulation is a probability distribution for each error component under study.

What do you mean by "error component"? Are you talking about the components of a vector?

I want to combine these separate probability distribution functions per error component for each measurement method into an overall probability distribution function

What do you mean by "combine"? Are the "components" added together like vectors? - or like scalars? - or are they inputs to some non-linear scalar valued function?

such that I can compare the uncertainty of each measurement method. How can I do this? Does anybody have a good reference?

Does "uncertainty" mean the standard deviation of the measurement? If you simulated the distribution of some errors by Monte-Carlo, why didn't you also simulate the "combination" of these errors?
 
Stephen Tashi said:
What do you mean by "error component"? Are you talking about the components of a vector?

By error component I mean an error source. For example, suppose you measure the length of a bar. There are then different error components/sources (or uncertainty components) that contribute to the total measurement uncertainty, such as the limited resolution of your ruler and the thermal expansion of the ruler under the influence of temperature.

What do you mean by "combine"? Are the "components" added together like vectors? - or like scalars? - or are they inputs to some non-linear scalar valued function?

No. Each error component is evaluated 100,000 times in a Monte Carlo simulation, assuming a normal probability distribution for each error (before doing this, an analytical expression is derived for each error component and the width 'a' of the error interval is given). This gives a vector of 100,000 error values per error component. From these 100,000 values you can calculate the mean error, the standard deviation, the uncertainty, etc. My question is how I can combine the various standard deviations or uncertainties from the different error components into a global (total) uncertainty.
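
To make the question concrete: is something like the following the right way to do it? (The two component vectors here are made-up stand-ins for my real Monte Carlo outputs.)

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Made-up stand-ins for the per-component error vectors from my simulations.
component_a = rng.normal(0.0, 0.2e-3, size=N)   # first error source
component_b = rng.normal(0.0, 0.5e-3, size=N)   # second error source

# If the components add linearly and the draws are independent, sum the
# sample vectors draw by draw to get the total error distribution...
total_error = component_a + component_b
print("simulated total std:", total_error.std())

# ...whose standard deviation should match the root-sum-of-squares of the
# individual standard deviations.
rss = np.sqrt(component_a.std()**2 + component_b.std()**2)
print("root-sum-of-squares: ", rss)
```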

Does "uncertainty" mean the standard deviation of the measurement? If you simulated the distribution of some errors by Monte-Carlo, why didn't you also simulate the "combination" of these errors?

No, uncertainty is not the standard deviation. You can calculate the uncertainty from the standard deviation, but they are not the same thing.
Because the errors are independent. I have an analytical expression for each error separately, but no expression for the combination of all the errors together.
 
You didn't explain how the error "components" are to be combined. The example of the ruler suggests that they are added.

And you didn't define what you mean by "uncertainty".
 
Stephen Tashi said:
You didn't explain how the error "components" are to be combined. The example of the ruler suggests that they are added.

And you didn't define what you mean by "uncertainty".

I want to calculate the total uncertainty. So I think you have to add them, but I am not really sure as they are independent. But that is the question of my whole problem. How do I have to "combine" the probability distributions from the various error components?

Uncertainty is the component of a reported value that characterizes the range of values within which the true value is asserted to lie. An uncertainty estimate should address error from all possible effects (both systematic and random) and, therefore, usually is the most appropriate means of expressing the accuracy of results. This is consistent with ISO guidelines.
 
hermano said:
I want to calculate the total uncertainty. So I think you have to add them, but I am not really sure as they are independent.

I think you mean "whether they are independent".

Since you can't describe how the errors "combine", perhaps you should state the details of the problem so that someone can interpret it from that perspective.

But that is the question of my whole problem. How do I have to "combine" the probability distributions from the various error components?

If you can estimate the covariance of the errors, you can estimate the standard deviation of their sum, even if the errors are dependent.
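
For two error terms X and Y that are added, the relevant identity is
$$\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\,\operatorname{Cov}(X,Y),$$
so the standard deviation of the sum is the square root of the right-hand side. When the errors are independent, the covariance term vanishes and this reduces to the root-sum-of-squares of the individual standard deviations.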

Uncertainty is the component of a reported value that characterizes the range of values within which the true value is asserted to lie. An uncertainty estimate should address error from all possible effects (both systematic and random) and, therefore, usually is the most appropriate means of expressing the accuracy of results. This is consistent with ISO guidelines.

That may be fine for ISO guidelines, but it doesn't define "uncertainty" in mathematical terms. You stated that uncertainty can be calculated from the standard deviation of the distribution of a measurement but you didn't specify how it would be calculated. Is "uncertainty" supposed to be some kind of "confidence interval"?
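
For example, under the ISO/GUM convention an expanded uncertainty is usually reported as a coverage factor times the combined standard uncertainty,
$$U = k\,u_c(y), \qquad u_c(y) = \sqrt{\sum_i \left(\frac{\partial f}{\partial x_i}\right)^2 u^2(x_i)} \quad \text{(for independent inputs)},$$
with k ≈ 2 corresponding to roughly 95% coverage for an approximately normal distribution. Is something like that what you have in mind?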
 