Proving Variance of a Random Variable with Moment-Generating Functions

In summary, we are given that Y is a random variable with moment-generating function m(t), and that W = aY + b has moment-generating function m(at) * e^(tb). Differentiating W's mgf twice and evaluating at t = 0 gives the mean and second moment of W, from which it follows that V(W) = V(Y) * a^2.
  • #1
dmatador
Suppose that Y is a random variable with moment-generating function m(t), and that W = aY + b has moment-generating function m(at) * e^(tb). Prove that V(W) = V(Y) * a^2. I have done an absurd amount of work on this problem, and I know the actual solution shouldn't need a page and a half of work. I have tried to find the variances separately and also to find the expected values, but this just gave me a big mess of equations, and I need some advice.
 
  • #2
E(W) = aE(Y) + b
E(W^2) = a^2 E(Y^2) + 2abE(Y) + b^2
V(W) = E(W^2) - (E(W))^2 = a^2 (E(Y^2) - (E(Y))^2) = a^2 V(Y)
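The identity in post #2 can be verified numerically with a small discrete distribution, where every expectation is an exact finite sum. This is just an illustrative sketch; the distribution and the constants a, b are made up, not from the thread:

```python
# Numerical sanity check of V(aY + b) = a^2 V(Y), using a small discrete
# random variable so all expectations are exact sums (no sampling noise).

values = [1.0, 2.0, 5.0]          # support of Y (arbitrary example)
probs  = [0.2, 0.5, 0.3]          # P(Y = y) for each value

def expectation(vals, ps):
    """E[X] for a discrete random variable given its support and pmf."""
    return sum(v * p for v, p in zip(vals, ps))

def variance(vals, ps):
    """V(X) = E[X^2] - (E[X])^2."""
    mu = expectation(vals, ps)
    return expectation([v * v for v in vals], ps) - mu**2

a, b = 3.0, -7.0
w_values = [a * y + b for y in values]   # W = aY + b has the same pmf over these points

v_y = variance(values, probs)
v_w = variance(w_values, probs)
assert abs(v_w - a**2 * v_y) < 1e-9
```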
 
  • #3
Thanks a lot man. I kept trying to use the moment-generating function for W. This is a big help.
 
  • #4
Are you sure you were not supposed to use the mgf?

[tex]
\begin{align*}
m_W(t) & = e^{tb} m_Y(at) \\
m'_W(t) & = be^{tb} m_Y(at) + ae^{tb} m'_Y(at) \\
\mu_W & = m'_W(0) = b + a\mu_Y
\end{align*}
[/tex]

so the mean of W is [tex] a\mu_Y + b [/tex].

Remember that the second derivative, evaluated at t = 0, is [itex] \sigma^2 + \mu^2 [/itex].

[tex]
\begin{align*}
m'_W(t) & = (bm_Y(at) + am'_Y(at)) e^{tb} \\
m''_W(t) & = b(bm_Y(at) + am'_Y(at)) e^{tb}+ (abm'_Y(at) + a^2m''_Y(at))e^{tb} \\
m''_W(0) & = b(b + a\mu_Y) + (ab\mu_Y + a^2 (\sigma^2_Y + \mu^2_Y)) \\
& = a^2 \sigma_Y^2 + a^2\mu_Y^2 + 2ab\mu_Y + b^2 \\
& = a^2 \sigma_Y^2 + (a\mu_Y + b)^2
\end{align*}
[/tex]

The final line in the second bit is [tex] E[W^2][/tex], so the variance of W is

[tex]
a^2 \sigma^2_Y + (a\mu_Y+b)^2 - (a\mu_Y + b)^2 = a^2 \sigma^2_Y
[/tex]
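As a sanity check on the derivative computation above, one can plug in a concrete mgf and differentiate numerically. Here Y is taken to be Normal(mu, sigma^2) purely because its mgf m(t) = exp(mu t + sigma^2 t^2 / 2) is available in closed form; the parameter values are arbitrary:

```python
# Check E[W] = a*mu + b and V(W) = a^2 * sigma^2 by numerically
# differentiating the mgf of W = aY + b, i.e. m_W(t) = e^{tb} m(at).
import math

mu, sigma = 1.5, 0.8     # mean and sd of Y (assumed example values)
a, b = 2.0, 3.0

def m_Y(t):
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

def m_W(t):
    # mgf of W = aY + b, as stated in the thread: e^{tb} * m(at)
    return math.exp(t * b) * m_Y(a * t)

h = 1e-4
EW  = (m_W(h) - m_W(-h)) / (2 * h)              # central difference ~ m'_W(0)
EW2 = (m_W(h) - 2 * m_W(0) + m_W(-h)) / h**2    # ~ m''_W(0)
var_W = EW2 - EW**2

assert abs(EW - (a * mu + b)) < 1e-4
assert abs(var_W - a**2 * sigma**2) < 1e-3
```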
 
  • #5


There are a few different approaches you can take to prove the variance of W is equal to the variance of Y multiplied by a^2. One approach is to use the definition of variance, which states that V(X) = E[(X - μ)^2], where μ is the mean or expected value of X.

Using this definition, we can start by finding the mean of W. Since W = aY + b, we can express the mean of W as E[W] = E[aY + b]. Using the linearity of expectation, we can rewrite this as E[W] = aE[Y] + b.

Next, we can find the variance of W by plugging in the definition of variance and the expression for the mean of W:
V(W) = E[(W - E[W])^2]
= E[(aY + b - (aE[Y] + b))^2]
= E[(aY - aE[Y])^2]
= a^2E[(Y - E[Y])^2]
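The definitional derivation above can also be checked empirically: for any fixed set of draws, the sample variance transforms in exactly the same way. A minimal sketch in Python (the distribution and the constants a, b are arbitrary choices, not from the thread):

```python
# Empirical check: for any fixed draws y_1..y_n, the sample variance of the
# transformed values a*y_i + b is (up to rounding) a^2 times the sample
# variance of the y_i, mirroring the identity V(aY + b) = a^2 V(Y).
import random

random.seed(0)
ys = [random.gauss(0.0, 1.0) for _ in range(10_000)]

def sample_var(xs):
    """Population-style sample variance: mean squared deviation from the mean."""
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / n

a, b = -2.5, 4.0
ws = [a * y + b for y in ys]
assert abs(sample_var(ws) - a**2 * sample_var(ys)) < 1e-8
```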

At this point we are essentially done, because E[(Y - E[Y])^2] is, by definition, V(Y):

V(W) = a^2 E[(Y - E[Y])^2] = a^2 V(Y)

If you want to use the moment-generating functions directly instead, differentiate m_W(t) = m(at) * e^(tb) twice. Evaluating the first derivative at t = 0 gives E[W] = aE[Y] + b, and evaluating the second derivative at t = 0 gives E[W^2] = a^2 E[Y^2] + 2abE[Y] + b^2, so

V(W) = E[W^2] - (E[W])^2 = a^2 (E[Y^2] - (E[Y])^2) = a^2 V(Y)

Either way, V(W) = V(Y) * a^2, as desired. Notice that the additive constant b shifts the mean of W but drops out of the variance entirely.
 

Related to Proving Variance of a Random Variable with Moment-Generating Functions

1. What is variance?

Variance is a statistical measure of the variability or dispersion in a set of data. It measures how spread out the data are from the average or mean value.

2. How is variance calculated?

The population variance is the sum of the squared differences between each data point and the mean, divided by the total number of data points. (The sample variance divides by n - 1 instead of n.)
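The formula above can be sketched in a few lines of Python; the data set here is just an illustrative example:

```python
# Population variance: mean squared deviation from the mean.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # example values

mean = sum(data) / len(data)
variance = sum((x - mean) ** 2 for x in data) / len(data)
assert abs(variance - 4.0) < 1e-12  # this data set has mean 5 and variance 4
```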

3. Why is variance important?

Variance is important because it quantifies the variability of data and how much confidence we can have in our results. It also appears in many statistical tests used to judge the significance of results.

4. What is the difference between variance and standard deviation?

Variance and standard deviation are both measures of variability in a set of data. The standard deviation is the square root of the variance, which makes it easier to interpret because it is expressed in the same units as the data. Since each determines the other, neither is more nor less sensitive to extreme values; both are strongly affected by outliers.

5. How do you interpret the variance?

The variance is reported in squared units, which can make it hard to interpret directly; taking its square root gives the standard deviation, in the original units of the data. A larger variance indicates a wider spread of data points, while a smaller variance indicates data clustered more tightly around the mean.
