High School Multiply Probabilities vs. Sum of the Squares

SUMMARY

The discussion clarifies the distinction between multiplying probabilities and using the sum of the squares in the context of uncorrelated probabilistic events. When calculating the probability of two independent events both occurring beyond a certain threshold, such as 3.3 sigma, the correct approach is to multiply the probabilities (e.g., 0.001 * 0.001). The sum of the squares is not applicable for probabilities but is relevant when calculating the variance of the sum of two random variables, where the standard deviations are squared and summed.

PREREQUISITES
  • Understanding of probability theory, specifically independent and uncorrelated events
  • Familiarity with statistical concepts such as sigma levels and standard deviation
  • Knowledge of variance calculations in statistics
  • Basic mathematical skills for manipulating probabilities and statistical formulas
NEXT STEPS
  • Study the principles of independent versus uncorrelated events in probability theory
  • Learn about variance and standard deviation calculations in statistics
  • Explore applications of sigma levels in statistical analysis
  • Investigate the implications of correlation in probabilistic models
USEFUL FOR

Statisticians, data analysts, and anyone involved in probability theory or statistical modeling who seeks to understand the nuances of calculating probabilities for independent and uncorrelated events.

jaydnul
Hi! I'm getting confused by these two things. If I have two uncorrelated probabilistic events and I want to know the probability of seeing them both land beyond 3.3 sigma (for example), do I multiply the probabilities (0.001 * 0.001) or do I take the square root of the sum of the squares, sqrt(0.001^2 + 0.001^2)? I assume it is the former, but can you explain in what context we would use the sum of the squares instead?
 
jaydnul said:
If I have two uncorrelated probabilistic events and I want to know the probability of seeing them both land beyond 3.3 sigma (for example), do I multiply the probabilities (0.001 * 0.001) or do I take the square root of the sum of the squares, sqrt(0.001^2 + 0.001^2)?
Uncorrelated and independent are different things. If the events are independent, then you multiply the probabilities. Correlation is just one specific type of dependence. However, it is easy to construct examples where one event causes the other with 100% certainty (so the probability of seeing both is equal to the probability of the cause) and yet the two events are uncorrelated.
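For a concrete check of the product rule, here is a minimal Python sketch under assumed conditions: two independent standard normal variables, with "beyond 3.3 sigma" read as the two-sided tail (which is roughly the 0.001 in the question); the 2-sigma Monte Carlo comparison is only there so a modest sample size suffices.

```python
import numpy as np
from scipy.stats import norm

# Two-sided tail probability of landing beyond 3.3 sigma for a standard normal;
# this is roughly the 0.001 quoted in the question.
p = 2 * norm.sf(3.3)
print(f"P(one event beyond 3.3 sigma)   ~ {p:.2e}")

# Independent events: multiply the probabilities.
print(f"P(both events beyond 3.3 sigma) ~ {p * p:.2e}")

# Monte Carlo sanity check at a milder 2-sigma threshold, where a modest
# sample size is enough to see the product rule at work.
rng = np.random.default_rng(0)
n = 5_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
q = 2 * norm.sf(2.0)
print(f"product rule at 2 sigma: {q * q:.2e}")
print(f"simulated frequency:     {np.mean((np.abs(x) > 2.0) & (np.abs(y) > 2.0)):.2e}")
```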

jaydnul said:
can you explain in what context we would use the sum of the squares instead?
I don't know of a context where you would use the sum of the squares of the probabilities. That could easily give a number greater than 1, which couldn't be a probability (for instance, sqrt(0.9^2 + 0.9^2) ≈ 1.27).

What you often do use is the sum of the squares of the standard deviations, for example when calculating the variance of the sum of two random variables: for uncorrelated variables the variance of the sum is the sum of the individual variances, so the standard deviations add in quadrature.
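As a quick illustration of that last point, here is a minimal sketch assuming two independent normal variables with arbitrarily chosen sigmas of 3 and 4; it checks that the standard deviation of their sum comes out close to sqrt(3^2 + 4^2) = 5.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Two independent (hence uncorrelated) normal variables with assumed sigmas.
sigma_x, sigma_y = 3.0, 4.0
x = rng.normal(0.0, sigma_x, n)
y = rng.normal(0.0, sigma_y, n)

# Var(X + Y) = Var(X) + Var(Y) for uncorrelated X and Y,
# so the standard deviations add in quadrature.
predicted_sd = np.sqrt(sigma_x**2 + sigma_y**2)   # sqrt(9 + 16) = 5
simulated_sd = np.std(x + y)

print(f"predicted sigma of the sum: {predicted_sd:.3f}")
print(f"simulated sigma of the sum: {simulated_sd:.3f}")
```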
 
Ok thanks that makes sense!
 
