## Main Question or Discussion Point

Hi all,

I am having trouble finding information on a certain problem.

Suppose you have the probability that $x_1$ takes a value near $x_a$ (in my case, the probability distribution is a normal distribution centred about 0). So:

$dp(x_1=x_a) = P(x_a) dx_a$

Also consider you have a second variable, $x_2$ for which you have:

$dp(x_2 = x_b) = P(x_b) dx_b$

So we know the probabilities of $x_1$ being $x_a$ and the probability of $x_2$ being $x_b$.

Now, what is the probability that $x_1 + x_2 = x_a + x_b = x_{tot}$?

My first thought was to double-integrate $dp(x_1 = x_a)\,dp(x_2 = x_b)$ with limits from $-\infty$ to $+\infty$ in both cases, but I think this will overestimate the probability.

mathman
In general, the probability density of the sum of two independent random variables is given by the convolution of the densities of the individual r.v.'s.

If they are both normal, then the distribution of the sum is normal with mean=sum of means and variance=sum of variances.
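A quick numerical sanity check of both statements (my own sketch, not from the thread): discretize two zero-mean normal densities, convolve them, and compare with the exact $N(0,\,v_1 + v_2)$ density predicted for the sum.

```python
import numpy as np

def normal_pdf(x, mean, var):
    return np.exp(-(x - mean)**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Symmetric grid with an odd number of points so the convolution aligns.
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
v1, v2 = 1.0, 4.0   # two zero-mean normals with different variances

p1 = normal_pdf(x, 0.0, v1)
p2 = normal_pdf(x, 0.0, v2)

# Discrete convolution times dx approximates the convolution integral.
p_sum = np.convolve(p1, p2, mode='same') * dx
exact = normal_pdf(x, 0.0, v1 + v2)

print(np.max(np.abs(p_sum - exact)))  # small discretization error
```

The grid width and spacing here are arbitrary choices; any grid wide enough to contain essentially all the mass of both densities works.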

Ok, I've found that if the mean and variance of the two distributions are the same, then one simply raises the distribution's characteristic function to the power of N, where N is the number of distributions you wish to sum, and then takes the inverse Fourier transform of this with Fourier parameters a = b = 1.

What if you have N normal distributions of identical mean=0 but different standard deviations? It would appear that this method would no longer work...?

As mathman said, the sum of two normal r.v.'s is again normal, and by induction the same holds for any finite sum of normal r.v.'s.

mathman
> What if you have N normal distributions of identical mean = 0 but different standard deviations? It would appear that this method would no longer work...?
The general form of a normal c.f. is $\exp(imt - \tfrac{1}{2}vt^2)$, where $m$ is the mean and $v$ the variance. So when you multiply the c.f.'s, you can see immediately that the means and variances add.
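This multiplication can be checked numerically (my own illustration): the product of the c.f.'s of $N(m_1, v_1)$ and $N(m_2, v_2)$ should coincide with the c.f. of $N(m_1 + m_2,\, v_1 + v_2)$.

```python
import numpy as np

# c.f. of N(m, v): phi(t) = exp(i*m*t - v*t**2 / 2)
def normal_cf(t, m, v):
    return np.exp(1j * m * t - v * t**2 / 2)

t = np.linspace(-5, 5, 201)
m1, v1 = 1.0, 2.0    # arbitrary example parameters
m2, v2 = -0.5, 3.0

product = normal_cf(t, m1, v1) * normal_cf(t, m2, v2)
combined = normal_cf(t, m1 + m2, v1 + v2)

print(np.allclose(product, combined))  # prints True
```

This works for any means and variances, which is exactly why the restriction to identical distributions in the $N$-th-power trick is unnecessary for normals.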

Ok, thanks... What about non-normal distributions? For example, a chi-squared?
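For what it's worth (the reply below is cut off, so this is my own note, not mathman's answer): for independent chi-squared variables the degrees of freedom add, i.e. $\chi^2_{k_1} + \chi^2_{k_2} \sim \chi^2_{k_1 + k_2}$, which a quick Monte Carlo check of the first two moments supports.

```python
import numpy as np

rng = np.random.default_rng(0)
k1, k2 = 3, 5          # example degrees of freedom
n = 200_000            # number of Monte Carlo samples

s = rng.chisquare(k1, n) + rng.chisquare(k2, n)

# chi2(k) has mean k and variance 2k, so the sum should have
# mean k1 + k2 and variance 2*(k1 + k2).
print(s.mean(), s.var())
```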

mathman