Multiply Probabilities vs. Sum of the Squares

Summary
To find the probability that two independent events both occur (for example, both landing beyond a given threshold), multiply their individual probabilities. The sum of the squares is not a rule for combining probabilities; it can even produce a value greater than 1, which is not a valid probability. It belongs instead with standard deviations, for example when computing the variance of the sum of two random variables. Note also that uncorrelated and independent are not the same thing; the multiplication rule requires independence.
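As a quick numerical sketch of the rule in the summary (assuming two independent standard normal variables and reading the 0.001 in the question as the two-sided 3.3-sigma tail):

```python
# Compare the two candidate formulas from the question numerically.
# Assumption: each variable is standard normal and "beyond 3.3 sigma" means |Z| > 3.3,
# so the single-variable tail probability is roughly the 0.001 quoted in the question.
from scipy.stats import norm

p_single = 2 * norm.sf(3.3)                   # P(|Z| > 3.3), about 9.7e-4
p_product = p_single ** 2                     # joint probability for independent events, about 9.3e-7
p_rss = (p_single**2 + p_single**2) ** 0.5    # root of the sum of squares, about 1.4e-3

print(f"single tail        : {p_single:.2e}")
print(f"product            : {p_product:.2e}")
print(f"root-sum-of-squares: {p_rss:.2e}")    # larger than the single tail, so it cannot be a joint probability
```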
jaydnul
Hi! I'm getting confused by these two things. If I have two uncorrelated probabilistic events and I want to know the probability of seeing them both land beyond 3.3 sigma (for example), do I multiply the probabilities (.001 * .001), or do I take the square root of the sum of the squares, sqrt(.001^2 + .001^2)? I assume it is the former, but can you explain in what context we would use the sum of the squares instead?
 
jaydnul said:
If I have two uncorrelated probabilistic events and I want to know the probability of seeing them both land beyond 3.3 sigma (for example), do I multiply the probabilities (.001 * .001), or do I take the square root of the sum of the squares, sqrt(.001^2 + .001^2)?
Uncorrelated and independent are different things. If they are independent, then you multiply the probabilities. Correlation captures only one specific type of dependence: it is easy to come up with examples of random variables where one completely determines the other, and yet the two are uncorrelated.
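A minimal simulation of that uncorrelated-but-dependent point, using my own example of random variables (X standard normal and Y = X^2, not the specific events from the thread):

```python
# Uncorrelated does not imply independent: Y = X**2 is completely determined by X,
# yet corr(X, Y) ~ 0 because E[X^3] = 0 for a symmetric distribution.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = x**2

print(f"corr(X, X^2) = {np.corrcoef(x, y)[0, 1]:+.4f}")   # close to zero

# Multiplying marginal probabilities would badly misestimate the joint probability here:
a = np.abs(x) > 2.0                    # event defined on X
b = y > 4.0                            # the same event, expressed through Y
print(f"P(A)P(B)   = {a.mean() * b.mean():.2e}")   # ~2.1e-3
print(f"P(A and B) = {(a & b).mean():.2e}")        # ~4.6e-2, since the events always occur together
```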

jaydnul said:
explain in what context we would use the sum of the squares instead?
I don't know of a context where you use the sum of the squares of the probabilities. That could easily lead to a number greater than 1, which couldn’t be a probability.

Often you use the sum of the squares of the standard deviations, for example to calculate the variance of the sum of two uncorrelated random variables.
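A short sketch of that last point (the sigma values below are just illustrative): for independent, or merely uncorrelated, variables the standard deviation of a sum adds in quadrature, which is where the sum of squares really belongs.

```python
# Standard deviations of independent variables add in quadrature:
# Var(X + Y) = Var(X) + Var(Y), so sigma_{X+Y} = sqrt(sigma_x^2 + sigma_y^2).
import numpy as np

rng = np.random.default_rng(2)
sigma_x, sigma_y = 1.5, 2.0                              # illustrative values

x = rng.normal(0.0, sigma_x, 1_000_000)
y = rng.normal(0.0, sigma_y, 1_000_000)                  # independent of x

print(f"expected sigma of X+Y: {np.sqrt(sigma_x**2 + sigma_y**2):.3f}")   # 2.500
print(f"measured sigma of X+Y: {np.std(x + y):.3f}")                      # close to 2.5
```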
 
Ok thanks that makes sense!
 