Moment generating functions

In summary: to show that the sum of two independent random variables stays in the same family of distributions, one can multiply their moment generating functions and check that the product is again the MGF of a member of that family; the parameters of the sum can then be read off directly. A subtlety raised in the thread is whether an MGF determines a distribution uniquely. It does whenever it exists (a consequence of a theorem in Billingsley's "Probability and Measure"), but not every random variable has an MGF, which is why characteristic functions are preferred for general arguments. The thread ends with a worked gamma example: the product of the MGFs of two independent gamma variables with a common rate parameter is the MGF of a gamma distribution whose shape parameter is the sum of the original shapes, so the sum is again gamma distributed.
  • #1
stukbv
Hi, when asked to show that the sum of two independent RVs with the same distribution again has that distribution, e.g. showing for the Gaussian case that if X has mean m and variance v and Y has mean n and variance w, then X + Y is Gaussian with mean m + n and variance v + w, I use the product of their m.g.f.s, which is fine. When I get the result of this product, what is a good way to say that this shows the stability under addition and that the parameters are m + n and v + w?

Thanks
 
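For the Gaussian case described in the question, the computation is short. A sketch, using the standard form of the normal MGF and independence (added here for reference; the symbols match the post above):

[tex]
M_X(t) = e^{mt + \frac{1}{2}vt^2}, \qquad M_Y(t) = e^{nt + \frac{1}{2}wt^2}
[/tex]
[tex]
M_{X+Y}(t) = E\left[e^{t(X+Y)}\right] = E\left[e^{tX}\right]E\left[e^{tY}\right] = e^{(m+n)t + \frac{1}{2}(v+w)t^2}
[/tex]

This is exactly the MGF of a normal distribution with mean m + n and variance v + w, and since an MGF determines the distribution whenever it exists (see post #6 below), X + Y must be Gaussian with those parameters.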
  • #2
I don't fully understand your question. However, if X and Y are independent (not necessarily normal), then as long as the means and variances exist, they add as you described. You just need to work with the first and second moments, where independence is used to give E(XY) = E(X)E(Y) in the variance calculation.

The Gaussian assumption is used only to get the fact that the sum is also Gaussian.
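A one-line version of that variance calculation (a sketch; it uses only linearity of expectation and E(XY) = E(X)E(Y) for independent X and Y):

[tex]
\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\left(E[XY] - E[X]E[Y]\right) = \operatorname{Var}(X) + \operatorname{Var}(Y)
[/tex]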
 
  • #3
What I mean is: I am asked to show that the sum of 2 independent random variables, each with the same distribution, has that distribution again. I do this using the product of the MGFs, but how do I phrase/conclude thoroughly, in words, how this proves that the sum of the variables has the same distribution as the original ones?
 
  • #4
stukbv said:
What I mean is: I am asked to show that the sum of 2 independent random variables, each with the same distribution, has that distribution again. I do this using the product of the MGFs

You don't mean that the sum of 2 random variables "has that distribution again". You mean that the sum is in the same "family" of distributions as the random variables that were added. (Whether it is or not will depend on how you define the "family". There is no requirement that a "family" of distributions be defined by parameters, although this is the usual way of doing it.)

I don't think that, in general, you can prove such a result using an argument based on the product of moment generating functions. Two random variables with different probability distributions can have the same moment generating function. You would need to show that, for the "family" of distributions in question, the product of the moment generating functions could only be the moment generating function of a distribution in that family.

Try using characteristic functions instead of moment generating functions.
 
  • #5
Stephen Tashi said:
Two random variables with different probability distributions can have the same moment generating function.

Really? I saw once that the mgf uniquely determines the distribution, provided that all the moments exist. Maybe I recall it wrong; I'll search for the reference...
 
  • #6
The statement in Billingsley's "Probability and Measure" is:

If P is a probability distribution on the real line having finite moments [itex]\alpha_k[/itex] of all orders and if the power series
[tex]\sum_{k=0}^{+\infty}{\frac{\alpha_k x^k}{k!}}[/tex]
has a positive radius of convergence, then P is the unique probability distribution with [itex]\alpha_k[/itex] as its moments.

In particular, the mgf determines the distribution uniquely. The only problem is that not every random variable has an mgf; that's why one uses the characteristic function.
 
  • #7
micromass,

You are correct. I was confusing the question of the uniqueness of the moment generating function (when it exists) with the fact that two distributions can have the same moments and not be the same distribution (in a case where the moment generating function does not converge).
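A classical illustration of this (usually attributed to Heyde) is the lognormal distribution: all of its moments are finite, yet its MGF fails to converge for any positive argument, and the perturbed densities

[tex]
f_a(x) = f(x)\left[1 + a\sin(2\pi \ln x)\right], \qquad -1 \le a \le 1,
[/tex]

where f is the standard lognormal density, all have exactly the same moments as the lognormal itself while being different distributions.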
 
  • #8
Stephen Tashi said:
micromass,

You are correct. I was confusing the question of the uniqueness of the moment generating function (when it exists) with the fact that two distributions can have the same moments and not be the same distribution (in a case where the moment generating function does not converge).
This is one reason why the characteristic function is preferred to the moment generating function: it always exists.
 
  • #9
stukbv said:
What I mean is: I am asked to show that the sum of 2 independent random variables, each with the same distribution, has that distribution again. I do this using the product of the MGFs, but how do I phrase/conclude thoroughly, in words, how this proves that the sum of the variables has the same distribution as the original ones?

I'll provide you with a framework using the gamma distribution.

[tex]X \sim \text{Gamma}(\alpha_1, \beta), \qquad Y \sim \text{Gamma}(\alpha_2, \beta)[/tex]
[tex]
E\left[e^{t(X+Y)}\right] = E\left[e^{tX}\right]E\left[e^{tY}\right] = \left(\frac{\beta}{\beta - t}\right)^{\alpha_1} \left(\frac{\beta}{\beta - t}\right)^{\alpha_2} = \left(\frac{\beta}{\beta - t}\right)^{\alpha_1 + \alpha_2}
[/tex]

(The first equality uses independence; the MGFs are valid for t < β.)

This matches the MGF of a gamma random variable with new alpha:
[tex]\alpha ' = \alpha_1 + \alpha_2[/tex]

Therefore [itex]X + Y \sim \text{Gamma}(\alpha_1 + \alpha_2, \beta)[/itex]. Basically, just show that the resulting MGF is the MGF of the same family with the new parameters.
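As a quick numerical sanity check of this gamma result (not part of the original argument; a Monte Carlo sketch using NumPy, whose gamma sampler takes a scale parameter equal to 1/β, with arbitrarily chosen parameter values):

[code]
# Monte Carlo check: if X ~ Gamma(a1, rate b) and Y ~ Gamma(a2, rate b)
# are independent, then X + Y should match Gamma(a1 + a2, rate b).
import numpy as np

rng = np.random.default_rng(0)
a1, a2, b = 2.0, 3.0, 1.5      # shapes and common rate (arbitrary choices)
n = 1_000_000

# NumPy parameterizes the gamma by shape and *scale*, where scale = 1/rate.
x = rng.gamma(a1, 1.0 / b, size=n)
y = rng.gamma(a2, 1.0 / b, size=n)
s = x + y

# Compare empirical moments of X + Y with those of Gamma(a1 + a2, rate b):
# mean = (a1 + a2)/b, variance = (a1 + a2)/b^2.
print(s.mean(), (a1 + a2) / b)       # both approximately 3.33
print(s.var(), (a1 + a2) / b**2)     # both approximately 2.22
[/code]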
 

What is a moment generating function?

A moment generating function (MGF) is a function used to describe the probability distribution of a random variable. It is defined as the expected value of e^(tX), where t is a real number and X is the random variable.
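In symbols (standard definitions, added here for reference):

[tex]
M_X(t) = E\left[e^{tX}\right], \qquad E[X^k] = M_X^{(k)}(0),
[/tex]

provided [itex]M_X(t)[/itex] is finite for all t in a neighborhood of 0; the k-th derivative at 0 recovers the k-th moment.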

How is a moment generating function different from a probability generating function?

A probability generating function (PGF) is defined for discrete random variables taking values in {0, 1, 2, ...} as the expected value of s^X, whereas a moment generating function applies to any random variable for which the defining expectation is finite. The two are linked by a change of variable: evaluating the PGF at e^t gives the MGF.
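Concretely, for a random variable X taking values in {0, 1, 2, ...} (a standard identity, added for reference):

[tex]
G_X(s) = E\left[s^X\right] = \sum_{k=0}^{\infty} P(X = k)\, s^k, \qquad M_X(t) = G_X(e^t).
[/tex]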

What is the purpose of a moment generating function?

The moment generating function is used to calculate the moments of a random variable, such as the mean and variance: the k-th derivative at t = 0 equals the k-th moment. It also allows for the calculation of higher-order moments, which can be useful in certain statistical analyses.
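A worked example, using the exponential distribution with rate [itex]\lambda[/itex] (chosen here purely for illustration):

[tex]
M_X(t) = \frac{\lambda}{\lambda - t} \quad (t < \lambda), \qquad M_X'(0) = \frac{1}{\lambda} = E[X], \qquad M_X''(0) = \frac{2}{\lambda^2} = E[X^2],
[/tex]

so [itex]\operatorname{Var}(X) = E[X^2] - (E[X])^2 = 1/\lambda^2[/itex].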

Can moment generating functions be used for all types of random variables?

No. The moment generating function exists only when E[e^(tX)] is finite for t in a neighborhood of 0. Heavy-tailed random variables can fail this: the Cauchy distribution has no mean at all, and the lognormal distribution has finite moments of every order yet still has no MGF.

How are moment generating functions related to characteristic functions?

Moment generating functions and characteristic functions are closely related: formally, the characteristic function is E[e^(itX)], i.e. the MGF evaluated at an imaginary argument. Both describe the probability distribution of a random variable, but the characteristic function always exists and always determines the distribution uniquely, whereas the MGF may fail to exist; this is why characteristic functions are preferred in general proofs.
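In symbols (standard definitions; the second identity holds formally wherever the MGF exists):

[tex]
\varphi_X(t) = E\left[e^{itX}\right], \qquad \varphi_X(t) = M_X(it).
[/tex]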
