Multiplying Normal Distributions: Rules & Examples

Summary
Multiplying two independent normal distributions does not yield a normal distribution; instead, the sum of two independent normal variables results in another normal distribution. Specifically, if S ~ N(0, 3^2) and D ~ N(0, 2^2), the sum S + D has a mean equal to the sum of their means and a variance equal to the sum of their variances. The initial confusion arose from misinterpreting the notation, where P(S) + P(D) was mistakenly equated to P(S)P(D). For independent events, the joint probability is correctly expressed as P(A & B) = P(A)P(B). Understanding the distinction between summing and multiplying distributions is crucial for accurate calculations.
chota
Hi say I have two "independent" Normal distributions,

S ~ N(0, 3^2) and D ~ N(0, 2^2)

since I know that S and D are independent, then

P(S) + P(D) = P(S)P(D)

however, we know they are both normally distributed, so I am just wondering what the general rule is for multiplying two normal distributions.
thanks
 
I'm not sure what you mean by

P(S) + P(D) = P(S)P(D)

Are you trying to say that when normal random variables are added, the resulting random variable is their product? Not true.

If

\begin{align*}
S &\sim N(\mu_S, \sigma^2_S)\\
D &\sim N(\mu_D, \sigma^2_D)
\end{align*}

and they are independent, then the sum S + D is normal, with mean

\mu_S + \mu_D

and variance

\sigma^2_S + \sigma^2_D

A similar result holds even if the two variables have non-zero correlation \rho: the sum is still normal, with variance \sigma^2_S + \sigma^2_D + 2\rho\,\sigma_S \sigma_D.

If by the "product" P(S)P(D) you mean convolving the two densities, you can carry out that computation, but it leads to the same result quoted above: the density of the sum of independent random variables is the convolution of their densities.
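To make the rule concrete, here is a small Monte Carlo sketch (my own illustration, not part of the thread) using the OP's parameters S ~ N(0, 3^2) and D ~ N(0, 2^2); the sum should come out approximately N(0, 13):

```python
import random
import statistics

random.seed(0)

# Draw many samples of S ~ N(0, 3^2) and D ~ N(0, 2^2) and add them.
n = 200_000
sums = [random.gauss(0, 3) + random.gauss(0, 2) for _ in range(n)]

mean_est = statistics.fmean(sums)
var_est = statistics.pvariance(sums)

# Theory: S + D ~ N(0 + 0, 3^2 + 2^2) = N(0, 13)
print(f"sample mean     = {mean_est:.3f}  (theory: 0)")
print(f"sample variance = {var_est:.3f}  (theory: 13)")
```

The empirical mean and variance land close to 0 and 13, matching the rule that means add and variances add for independent normals.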
 
chota said:
... since I know that S and D are independent, then

P(S) + P(D) = P(S)P(D)

I'm guessing you meant to say

P(S & D) = P(S)P(D)

where "S" here really means a statement along the lines of "S lies between A and B", and similarly for "D".
 
For events A and B, normally distributed or not, P(A & B) = P(A)P(B|A) = P(B)P(A|B), where P(A|B) and P(B|A) are the "conditional probabilities": P(A|B) is the probability that A will happen given that B happened, and P(B|A) is the probability that B will happen given that A happened.

If A and B are independent, then P(A|B) = P(A) and P(B|A) = P(B), so you just multiply the separate probabilities. If they are not independent, knowing the probabilities of each separately is not enough: you must also know at least one of P(A|B), P(B|A), or P(A & B).
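A quick numerical sketch of the factoring rule (my own example, with intervals chosen arbitrarily): take A = "S lies in (-3, 3)" and B = "D lies in (-2, 2)", with S and D independent as in the thread. The joint probability should match the product of the marginals:

```python
import random

random.seed(1)
n = 200_000

# Count hits for A = {-3 < S < 3}, B = {-2 < D < 2}, and the joint event.
hits_a = hits_b = hits_ab = 0
for _ in range(n):
    s = random.gauss(0, 3)   # S ~ N(0, 3^2)
    d = random.gauss(0, 2)   # D ~ N(0, 2^2)
    a = -3 < s < 3
    b = -2 < d < 2
    hits_a += a
    hits_b += b
    hits_ab += a and b

p_a = hits_a / n
p_b = hits_b / n
p_ab = hits_ab / n

# Independence: P(A & B) = P(A)P(B); each marginal is about 0.683
# (one standard deviation either side of the mean).
print(f"P(A)P(B) = {p_a * p_b:.3f}   P(A & B) = {p_ab:.3f}")
```

If the intervals were chosen over the same variable, or if S and D were correlated, the product P(A)P(B) would no longer match P(A & B), which is exactly the distinction drawn above.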
 
I answered as I did because

  • the OP used S, D in his notation, and I took these as the names of the random variables rather than any interval or event.
  • I took the question to mean he was asking how to combine normal distributions rather than calculate any particular probability
 
