Comparing two hypotheses with uncertainties

  • Thread starter: jlicht
  • Tags: Uncertainties
jlicht
Hi there,

I'm trying to compare two hypotheses, each given by a mean and a (Gaussian) uncertainty, and to quantify the probability of one being correct given the other.
Currently, the only thing I know how to do is integrate the Gaussian defined by the first hypothesis's uncertainty to get the probability of the second hypothesis's central value. However, this doesn't take the uncertainty of the second hypothesis into account.

What would be the correct way of determining whether one (let's say Gaussian) hypothesis is correct given another, taking into account both uncertainties?

Cheers,
Johannes

EDIT:
As an application example, let us say I measured the gravitational acceleration to be 15.0 ± 0.3, while the known value was 9.8 ± 0.1. What would the significance of my newly measured result be, given the uncertainty on both numbers?
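
To make this concrete, here is a minimal Python sketch of the single-uncertainty approach described above, assuming Gaussian errors and using only the 0.3 uncertainty of the measurement (the numbers are the illustrative ones from this example):

```python
import math

# Single-uncertainty comparison: treat the known value 9.8 as exact and
# ask how far it lies from the measurement 15.0 +/- 0.3.
measured, measured_sigma = 15.0, 0.3
known = 9.8

# Separation in units of the measurement's standard deviation.
z = abs(measured - known) / measured_sigma

# Two-sided Gaussian tail probability, P(|Z| > z).
p_value = math.erfc(z / math.sqrt(2))

print(f"z = {z:.1f} sigma, two-sided p = {p_value:.3g}")  # roughly 17 sigma
```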
 
jlicht said:
As an application example, let us say I measured the gravitational acceleration to be 15.0 ± 0.3, while the known value was 9.8 ± 0.1. What would the significance of my newly measured result be, given the uncertainty on both numbers?

If I was doing this measurement, I would check my instruments.
 
mathman said:
If I was doing this measurement, I would check my instruments.

That is obviously not the point; replace g with some unknown quantity x in the example above, then.
 
If you have a quantity with known value ~ 9.8 and you measured it as ~ 15.0, it seems that either the known value was incorrect or the measurement was faulty. They are too far apart (> ~ 15σ) to be purely by chance.
 
mathman said:
If you have a quantity with known value ~ 9.8 and you measured it as ~ 15.0, it seems that either the known value was incorrect or the measurement was faulty. They are too far apart (> ~ 15σ) to be purely by chance.

Now here's my point: you're computing this based on the uncertainty on one of the values. But this completely ignores the uncertainty on the other value. Isn't there a way to include both in a comparison?
 
The combined uncertainty is the square root of the sum of the squares of the individual uncertainties, assuming they are independent. Here that is √(0.3² + 0.1²) ≈ 0.316, so qualitatively there is no change.
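
As a sketch of that calculation, here is a short Python function that adds the two uncertainties in quadrature and converts the separation into a two-sided p-value (assuming both uncertainties are Gaussian and independent; the numbers are the illustrative ones from this thread):

```python
import math

def significance(mean1, sigma1, mean2, sigma2):
    """Separation of two values with independent Gaussian uncertainties,
    in combined standard deviations, plus the two-sided p-value."""
    combined_sigma = math.sqrt(sigma1**2 + sigma2**2)  # add in quadrature
    z = abs(mean1 - mean2) / combined_sigma
    p_value = math.erfc(z / math.sqrt(2))  # P(|Z| > z) for a standard normal
    return combined_sigma, z, p_value

# Measurement 15.0 +/- 0.3 compared with the known value 9.8 +/- 0.1.
sigma, z, p = significance(15.0, 0.3, 9.8, 0.1)
print(f"combined sigma = {sigma:.3f}")             # about 0.316
print(f"separation = {z:.1f} sigma, p = {p:.3g}")  # about 16 sigma
```

With the combined uncertainty the separation only drops from roughly 17σ to roughly 16σ, which is the "qualitatively no change" noted above.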
 