# Mean of a square of a random variable

1. Nov 7, 2011

### stukbv

1. The problem statement, all variables and given/known data

If Z has a standard Gaussian distribution, then what is the distribution of
Z² and what is its mean?

3. The attempt at a solution
Let T = Z². Then we can get that

$$f_T(t) = e^{-t/2}\left(1 + \frac{1}{t}\right) \cdot \frac{1}{\sqrt{2\pi t}}$$

I am not sure if this is correct, and I don't know how to find the mean; working out E[T] directly looks very difficult. Can I just assume that the mean is 0, or not?

2. Nov 7, 2011

### I like Serena

Hi stukbv!

Did you know that the square of a standard Gaussian random variable has the chi-square distribution with 1 degree of freedom?

Your pdf does not seem to be quite right though.
How did you get the factor (1 + 1/t)?

Last edited: Nov 7, 2011
3. Nov 7, 2011

### WantToBeSmart

Hey, it is going to be a chi-squared distribution with 1 degree of freedom, by definition of chi-squared. The mean follows from there as well.

4. Nov 7, 2011

### stukbv

I know that it's chi-squared, but my question requires me to work it out directly and then conclude that it is a chi-squared distribution (i.e. not use known chi-squared facts, such as its mean, at any point).

To get the pdf, I did F_T(t) = P[T ≤ t] = P[Z² ≤ t] = P[−√t ≤ Z ≤ √t].
Then I did this integral with −√t and √t as limits, so I got −e^(−z²/2) × 1/(z√(2π)).
Then I differentiated my answer (F_T) to get f_T.

5. Nov 7, 2011

### I like Serena

Good!

But let's see if I can reproduce that...

$$P(-\sqrt t \le Z \le \sqrt t) = \Phi(\sqrt t) - \Phi(-\sqrt t)$$

Differentiate and find:
$$pdf(t) = {\phi(\sqrt t) \over 2\sqrt t} - {\phi(-\sqrt t) \over -2\sqrt t}$$

$$pdf(t) = {e^{-t/2} \over \sqrt{2\pi t}}$$
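As a sanity check (my own sketch, not part of the thread), the derived pdf can be compared against simulated values of Z²: sample a standard normal, square it, and check that probabilities computed from the samples match the integral of e^(−t/2)/√(2πt).

```python
# Monte Carlo check that T = Z^2 has pdf e^{-t/2} / sqrt(2*pi*t).
import math
import random

random.seed(0)
samples = [random.gauss(0.0, 1.0) ** 2 for _ in range(200_000)]

def claimed_pdf(t):
    return math.exp(-t / 2) / math.sqrt(2 * math.pi * t)

# Compare P(a < T < b) estimated from samples with the integral of the
# claimed pdf over (a, b), approximated by the midpoint rule.
a, b = 0.5, 1.5
empirical = sum(a < t < b for t in samples) / len(samples)
n = 1000
h = (b - a) / n
theoretical = sum(claimed_pdf(a + (i + 0.5) * h) for i in range(n)) * h
print(abs(empirical - theoretical) < 0.01)  # True
```

The two probabilities agree to within Monte Carlo noise, which supports the pdf above (and hence the chi-square(1) identification).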

6. Nov 7, 2011

### stukbv

OK, so when you do the first step and integrate, do you do it between −√t and √t, or do you do it between −∞ and √t and then subtract the part from −∞ to −√t? I'm not sure I understand what you have done.

7. Nov 7, 2011

### I like Serena

I've left a few steps out.
But you can do it either way.

If you have an integral from -√t to +√t, the result is the difference of the anti-derivative evaluated at the two limits.
$\Phi$ represents the anti-derivative, which is also the cdf of the standard Gaussian distribution.

When you take the derivative of $\Phi$, you get $\phi$, which represents the pdf of the standard Gaussian distribution.

8. Nov 8, 2011

### stukbv

Sorry to be a pain, but I still don't know how you get the pdf.
I'll show you what I did so you can see where I'm going wrong.

T = Z²

$$P[T \le t] = \int_{-\sqrt t}^{\sqrt t} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz$$

So I got $\left[-e^{-z^2/2} \cdot \frac{1}{z\sqrt{2\pi}}\right]$ between −√t and √t,

which then gives me $-\sqrt{2}\, e^{-t/2} \cdot \frac{1}{\sqrt{t\pi}}$. This is my CDF for T.
So to get the pdf for T, I differentiated the CDF with respect to t as below.

Quotient rule:
Let $u = -\sqrt{2}\, e^{-t/2}$, so $du = e^{-t/2}/\sqrt{2}$.
Let $v = \sqrt{t\pi}$, so $dv = \sqrt{\pi}/(2\sqrt{t})$.

$$(v\, du - u\, dv)/v^2$$

gives $e^{-t/2}/\sqrt{2t\pi} + e^{-t/2}/(t\sqrt{2t\pi})$.

Where have I gone wrong??

9. Nov 8, 2011

### I like Serena

This is not the anti-derivative.
Try to take the derivative and you should see that it does not match.

More specifically, the integrand does not have an anti-derivative expressible in elementary functions.
It can only be approximated numerically.
However, they have invented the symbol $\Phi$ to represent the anti-derivative.
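In practice, numerical libraries expose that anti-derivative through the error function, via the identity Φ(x) = (1 + erf(x/√2))/2. A small sketch of evaluating Φ this way (my addition, not from the thread):

```python
# Evaluate the standard normal CDF Phi via the error function:
# Phi(x) = (1 + erf(x / sqrt(2))) / 2
import math

def Phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

print(Phi(0.0))   # 0.5 by symmetry
print(Phi(1.96))  # ~0.975, the familiar normal quantile
```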

10. Nov 8, 2011

### stukbv

OK, I see. Basically my problem is that the whole question I have to answer is:

Z₁ and Z₂ are standard normal variables and are iid.
(Y₁, Y₂) = (Z₁² + Z₂², Z₁² − Z₂²)

Compute the mgf of the bivariate RV (Y₁, Y₂) and the marginal mgfs. Can you recognise any of the distributions?

So I thought that by working out the distributions of Y₁ and Y₂ (hence why I asked about working out Z²), I could maybe work out the joint mgf from there, but I'm beginning to think that's not the right way? I think once I can do the mgf I can do the rest, but obviously I can't get started until I have the joint mgf!

11. Nov 8, 2011

### I like Serena

I don't like TLAs*!

What is: iid, mgf, RV?
(I don't feel like looking everything up and English is not my native language.)

*TLA = three-letter abbreviation.

12. Nov 8, 2011

### stukbv

Oh, OK: iid means independent and identically distributed, mgf is moment generating function, and RV is random variable.

13. Nov 8, 2011

### I like Serena

It seems to me you have the right idea.
So it would be a matter of correctly defining the probabilities and properly integrating.
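One way to check such an mgf calculation (my own sketch, not part of the thread) is by simulation: since Z₁ and Z₂ are independent, s₁Y₁ + s₂Y₂ = (s₁+s₂)Z₁² + (s₁−s₂)Z₂², and each factor can be evaluated with the standard identity E[exp(a Z²)] = 1/√(1 − 2a) for a < 1/2. A Monte Carlo estimate of the joint mgf should then match the closed form:

```python
# Monte Carlo check of the joint mgf of (Y1, Y2) = (Z1^2 + Z2^2, Z1^2 - Z2^2).
import math
import random

random.seed(1)
N = 200_000
s1, s2 = 0.1, 0.05  # small arguments so the expectation exists

acc = 0.0
for _ in range(N):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    y1 = z1**2 + z2**2
    y2 = z1**2 - z2**2
    acc += math.exp(s1 * y1 + s2 * y2)
mc = acc / N

# Closed form via independence of Z1 and Z2:
#   M(s1, s2) = E[e^{(s1+s2) Z1^2}] * E[e^{(s1-s2) Z2^2}]
#             = 1 / sqrt((1 - 2(s1+s2)) * (1 - 2(s1-s2)))
closed = 1.0 / math.sqrt((1 - 2 * (s1 + s2)) * (1 - 2 * (s1 - s2)))
print(abs(mc - closed) < 0.02)  # True, to within Monte Carlo noise
```

Setting s₂ = 0 (or s₁ = 0) in the closed form gives the marginal mgfs, which is where the distributions become recognisable.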

14. Nov 8, 2011

### Ray Vickson

Your integral is wrong. The function exp(-t^2/2) is one of those functions that do NOT have an integral expressible in finite terms of elementary functions. That is why people invented the "erf" function, or the "Phi" function of probability theory. By the way: the statement about the non-existence of a finite elementary expression is a rigorously provable _theorem_: one can show that it is IMPOSSIBLE to have such a formula. This goes back to the work of Liouville in the 19th century.

Anyway, you don't need to do the integral. You have F(x) = int[f(t), t=a(x)..b(x)], where f(t) = exp(-t^2/2)/sqrt(2*pi) and a(x) = -sqrt(x), b(x) = sqrt(x). From standard calculus results we have dF/dx = (db/dx)*f(b(x)) - (da/dx)*f(a(x)); that is, we just evaluate the integrand at the endpoint functions and multiply by the derivatives of the endpoint functions; the integration goes away.
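The endpoint rule described above can be verified numerically (my own sketch): compute F(x) by integrating f between −√x and √x, differentiate it by a central finite difference, and compare against the endpoint formula, which here reduces to e^(−x/2)/√(2πx).

```python
# Numerical check of the Leibniz/endpoint rule:
#   F(x) = int f(t) dt from -sqrt(x) to sqrt(x)
#   dF/dx = f(sqrt(x))/(2 sqrt(x)) + f(-sqrt(x))/(2 sqrt(x))
import math

def f(t):  # standard normal pdf
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def F(x, n=20_000):  # midpoint-rule integral of f from -sqrt(x) to sqrt(x)
    a, b = -math.sqrt(x), math.sqrt(x)
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

x, eps = 1.0, 1e-4
finite_diff = (F(x + eps) - F(x - eps)) / (2 * eps)
leibniz = f(math.sqrt(x)) / (2 * math.sqrt(x)) + f(-math.sqrt(x)) / (2 * math.sqrt(x))
print(abs(finite_diff - leibniz) < 1e-5)  # True
```

Both agree, and the endpoint value is exactly the chi-square(1) pdf derived earlier in the thread, with no symbolic integration needed.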

RGV