
- Thread starter stukbv

- #1


Thanks

- #2

mathman

Science Advisor


The Gaussian assumption is used only to get the fact that the sum is also Gaussian.
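As a quick numerical illustration of this closure property (a sketch, not part of the thread; all parameters are arbitrary choices), one can check by simulation that the sum of two independent Gaussians has exactly the mean and variance the Gaussian family predicts:

```python
import numpy as np

# Arbitrary illustrative parameters: X ~ N(1, 2^2), Y ~ N(-0.5, 1.5^2), independent.
rng = np.random.default_rng(0)
mu1, sigma1 = 1.0, 2.0
mu2, sigma2 = -0.5, 1.5
n = 1_000_000

x = rng.normal(mu1, sigma1, n)
y = rng.normal(mu2, sigma2, n)
s = x + y

# If X + Y ~ N(mu1 + mu2, sigma1^2 + sigma2^2), the sample moments should match:
print(s.mean())  # close to mu1 + mu2 = 0.5
print(s.var())   # close to sigma1^2 + sigma2^2 = 6.25
```

That the sum is actually Gaussian (not just a distribution with matching mean and variance) is the part that uses the Gaussian assumption.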

- #3

What I mean is, I'm asked to show that the sum of 2 independent random variables, each with the same distribution, has that distribution again; I do this using the product of the MGFs.

- #4

Stephen Tashi

Science Advisor


What I mean is, I'm asked to show that the sum of 2 independent random variables, each with the same distribution, has that distribution again; I do this using the product of the MGFs.

You don't mean that the sum of 2 random variables "has that distribution again". You mean that the sum is in the same "family" of distributions as the random variables that were added. (Whether it is or not will depend on how you define the "family". There is no requirement that a "family" of distributions be defined by parameters, although this is the usual way of doing it.)

I don't think, in general, that you can prove such a result using an argument based on the product of moment generating functions. Two random variables with different probability distributions can have the same moment generating function. You would need to show that, for the "family" of distributions in question, the product of the moment generating functions could only be the moment generating function of a distribution in that family.

Try using characteristic functions instead of moment generating functions.

- #5


Two random variables with different probability distributions can have the same moment generating function.

Really? I saw once that the mgf uniquely determines the distribution, provided that all the moments exist. Maybe I'm recalling it wrong; I'll search for the reference...

- #6


If P is a probability distribution on the real line having finite moments [itex]\alpha_k[/itex] of all orders and if the power series

[tex]\sum_{k=0}^{+\infty}{\frac{\alpha_k x^k}{k!}}[/tex]

has a positive radius of convergence, then P is the unique probability distribution with [itex]\alpha_k[/itex] as its moments.

In particular, the mgf determines the distribution uniquely. The only problem is that not every random variable has an mgf, that's why one uses the characteristic function.
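To make that last point concrete (a numerical sketch, not part of the thread): the standard Cauchy distribution has no MGF, since [itex]E[e^{tX}][/itex] is infinite for every [itex]t \neq 0[/itex], yet its characteristic function exists and equals [itex]e^{-|t|}[/itex]. A small simulation shows the empirical characteristic function converging while the empirical MGF blows up:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_cauchy(200_000)
t = 0.5

# E[e^{tX}] is infinite for the Cauchy: the sample mean is dominated by a few
# enormous draws, and the exponentials typically overflow rather than converge.
with np.errstate(over='ignore'):
    mgf_hat = np.mean(np.exp(t * x))
print(mgf_hat)  # inf, or an unstable enormous number

# E[e^{itX}] always exists; for the standard Cauchy it equals e^{-|t|}.
cf_hat = np.mean(np.exp(1j * t * x))
print(abs(cf_hat - np.exp(-t)))  # small
```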

- #7

Stephen Tashi

Science Advisor


You are correct. I was confusing the question of the uniqueness of the moment generating function (when it exists) with the fact that two distributions can have the same moments and not be the same distribution (in a case where the moment generating function does not converge).

- #8

mathman

Science Advisor


You are correct. I was confusing the question of the uniqueness of the moment generating function (when it exists) with the fact that two distributions can have the same moments and not be the same distribution (in a case where the moment generating function does not converge).

This is one reason why the characteristic function is preferred to the moment generating function: it always exists.

- #9


I'll provide you with a framework using the gamma distribution.

[tex]X \sim \text{Gamma}(\alpha_1, \beta), \qquad Y \sim \text{Gamma}(\alpha_2, \beta)[/tex]

[tex]
E[e^{t(X+Y)}] = E[e^{tX}]\,E[e^{tY}] = \left(\frac{\beta}{\beta - t}\right)^{\alpha_1} \left(\frac{\beta}{\beta - t}\right)^{\alpha_2} = \left(\frac{\beta}{\beta - t}\right)^{\alpha_1 + \alpha_2}, \qquad t < \beta,
[/tex]

where the first equality uses the independence of X and Y.

This matches the MGF of a gamma random variable with the same rate [itex]\beta[/itex] and new shape parameter

[tex]\alpha' = \alpha_1 + \alpha_2.[/tex]

Therefore [itex]X + Y \sim \text{Gamma}(\alpha_1 + \alpha_2, \beta)[/itex]. Basically, just show that the resulting MGF is the MGF of the same family with new parameters.
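The gamma closure above is easy to verify by simulation (a sketch with arbitrary illustrative parameters, not part of the thread; note that NumPy parametrises the gamma by shape and scale, where scale = 1/rate):

```python
import numpy as np

rng = np.random.default_rng(0)
a1, a2, beta = 2.0, 3.0, 1.5   # arbitrary shapes, common rate
n = 500_000

# NumPy's gamma takes (shape, scale); with rate beta the scale is 1/beta.
x = rng.gamma(a1, 1.0 / beta, n)
y = rng.gamma(a2, 1.0 / beta, n)
s = x + y

# Gamma(a1 + a2, beta) has mean (a1 + a2)/beta and variance (a1 + a2)/beta^2.
print(s.mean())  # close to 5/1.5 ~ 3.333
print(s.var())   # close to 5/1.5^2 ~ 2.222

# Compare the empirical MGF of the sum with the closed form at one point t < beta.
t = 0.3
emp = np.mean(np.exp(t * s))
theo = (beta / (beta - t)) ** (a1 + a2)
print(emp, theo)  # both close to (1.5/1.2)^5 ~ 3.052
```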
