Mean of a square of a random variable

  • #1

Homework Statement



If Z has a standard Gaussian distribution, then what is the distribution of Z² and what is its mean?

The Attempt at a Solution


Let T = Z². Then we can get that

pdf_T = e^(−T/2)(1 + 1/T) × 1/√(2πT)

I am not sure if this is correct, and I don't know how to find the mean; working out E[T] seems very difficult. Can I just assume that the mean is 0, or not?
 

Answers and Replies

  • #2
Hi stukbv! :smile:

Did you know that the square of a standard Gaussian random variable has the chi-square distribution with 1 degree of freedom?

Your pdf does not seem to be quite right, though.
How did you get the factor (1 + 1/T)?
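As a quick numerical sanity check (a minimal sketch using only Python's standard library; the sample size and seed are arbitrary choices), squaring standard normal draws gives a sample mean close to the chi-square(1) mean of 1:

```python
import random

random.seed(42)  # arbitrary seed, for reproducibility

# Draw standard normal samples and square them: each Z**2 is one
# realization of T, which should follow chi-square with 1 df.
n = 200_000
sample_mean = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n)) / n

print(f"sample mean of Z^2: {sample_mean:.3f}")  # close to 1
```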
 
  • #3


Hey, it is going to be a chi-squared distribution with 1 degree of freedom, by the definition of chi-squared. The mean follows from there as well.
 
  • #4
I know that it's chi-squared, but my question requires me to work it through directly and then conclude that it is a chi-squared distribution (i.e. without using known chi-squared results such as its mean at any point).

To get the pdf, I did F_T = P[T ≤ t] = P[Z² ≤ t] = P[−√t ≤ Z ≤ √t].
Then I did this integral with −√t and √t as limits, so I got −e^(−Z²/2) × 1/(Z√(2π)).
Then I differentiated my answer (F_T) to get f_T.
 
  • #5
Good!

But let's see if I can reproduce that...

[tex]P(-\sqrt t \le Z \le \sqrt t) = \Phi(\sqrt t) - \Phi(-\sqrt t)[/tex]

Differentiate and find:
[tex]pdf(t) = {\phi(\sqrt t) \over 2\sqrt t} - {\phi(-\sqrt t) \over -2\sqrt t}[/tex]
[tex]pdf(t) = {e^{-t/2} \over \sqrt{2\pi t}}[/tex]
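Incidentally, the mean asked about in the original post follows without integrating this pdf at all, since Z has mean 0 and variance 1:

[tex]E[T] = E[Z^2] = \text{Var}(Z) + (E[Z])^2 = 1 + 0 = 1[/tex]

So the mean is 1, not 0, which agrees with the mean of a chi-square distribution with 1 degree of freedom.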
 
  • #6
OK, so when you do the first step and integrate, do you integrate between −√t and √t, or do you integrate from −∞ to √t and then subtract the integral from −∞ to −√t? I am not sure I understand what you have done.
 
  • #7
I've left a few steps out.
But you can do it either way.

If you have an integral from −√t to +√t, the result is the difference of the anti-derivative evaluated at the two limits.
[itex]\Phi[/itex] represents the anti-derivative, which is also the cdf of the standard Gaussian distribution.

When you take the derivative of [itex]\Phi[/itex], you get [itex]\phi[/itex], which represents the pdf of the standard Gaussian distribution.
 
  • #8
Sorry to be a pain, but I still don't see how you get that pdf.
I'll show you what I did so you can see where I am going wrong.
T = Z²

P[T ≤ t] = ∫ (1/√(2π)) exp(−Z²/2) dZ between −√t and √t

So I got [ −exp(−Z²/2) × 1/(Z√(2π)) ] between −√t and √t,

which then gives me −√2 exp(−t/2) × 1/√(tπ). This is my CDF for T.
So to get the pdf for T, I differentiated the CDF with respect to t as below.

Quotient rule:
Let u = −√2 exp(−t/2), so du = exp(−t/2)/√2.
Let v = √(tπ), so dv = √π/(2√t).

(v du − u dv)/v²

This gives exp(−t/2)/√(2tπ) + exp(−t/2)/(t√(2tπ)).

Where have I gone wrong??
 
  • #9
Sorry to be a pain, but I still don't see how you get that pdf.
I'll show you what I did so you can see where I am going wrong.
T = Z²

P[T ≤ t] = ∫ (1/√(2π)) exp(−Z²/2) dZ between −√t and √t

So I got [ −exp(−Z²/2) × 1/(Z√(2π)) ] between −√t and √t

This is not the anti-derivative.
Try to take the derivative and you should see that it does not match.

More specifically, the integrand does not have an anti-derivative expressible in elementary functions.
It can only be evaluated numerically.
However, the symbol [itex]\Phi[/itex] was invented to represent this anti-derivative.
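To make that concrete (a sketch using the standard library's `math.erf`, since numerically [itex]\Phi(x) = \tfrac{1}{2}(1 + \text{erf}(x/\sqrt 2))[/itex]; the test point and step size are arbitrary), a finite-difference derivative of the CDF [itex]F(t) = \Phi(\sqrt t) - \Phi(-\sqrt t)[/itex] matches the closed-form pdf derived earlier in the thread:

```python
import math

def Phi(x):
    # Standard normal CDF, expressed through the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def F(t):
    # CDF of T = Z^2:  P(T <= t) = Phi(sqrt(t)) - Phi(-sqrt(t)).
    return Phi(math.sqrt(t)) - Phi(-math.sqrt(t))

def pdf(t):
    # Closed-form density e^{-t/2} / sqrt(2*pi*t) (chi-square, 1 df).
    return math.exp(-t / 2.0) / math.sqrt(2.0 * math.pi * t)

t, h = 1.5, 1e-6  # arbitrary test point and finite-difference step
numerical = (F(t + h) - F(t - h)) / (2.0 * h)
print(numerical, pdf(t))  # the two values agree
```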
 
  • #10
OK, I see. Basically my problem is that the whole question I have to answer is:
Z₁ and Z₂ are standard normal variables and are iid.
(Y₁, Y₂) = (Z₁² + Z₂², Z₁² − Z₂²)

Compute the mgf of the bivariate RV (Y₁, Y₂) and the marginal mgfs. Can you recognise any of the distributions?

So I thought that by working out the distributions of Y₁ and Y₂ (hence why I asked about working out Z²) I could maybe work out the joint mgf from there, but I'm beginning to think that's not the right way. I think once I have the joint mgf I can do the rest, but obviously I can't get started until I have it!
 
  • #11
I don't like tla's*! :frown:

What is: iid, mgf, RV?
(I don't feel like looking everything up and English is not my native language.)






*tla=three letter abbreviation :wink:.
 
  • #12
Oh OK: iid means independent and identically distributed, mgf is moment generating function, and RV is random variable.
 
  • #13
It seems to me you have the right idea.
So it would be a matter of correctly defining the probabilities and properly integrating.
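For instance, the marginal mgf of a single Z² comes out of one Gaussian integral: for s < 1/2, absorb e^{sz²} into the exponent and recognise a rescaled Gaussian,

[tex]E[e^{sZ^2}] = \int_{-\infty}^{\infty} e^{sz^2} \, {e^{-z^2/2} \over \sqrt{2\pi}} \, dz = \int_{-\infty}^{\infty} {e^{-(1-2s)z^2/2} \over \sqrt{2\pi}} \, dz = {1 \over \sqrt{1-2s}}[/tex]

Since Y₁ = Z₁² + Z₂² is a sum of two independent copies, its marginal mgf is the square of this, 1/(1 − 2s), which you should recognise as the mgf of a chi-square distribution with 2 degrees of freedom.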
 
  • #14
Sorry to be a pain, but I still don't see how you get that pdf.
I'll show you what I did so you can see where I am going wrong.
T = Z²

P[T ≤ t] = ∫ (1/√(2π)) exp(−Z²/2) dZ between −√t and √t

So I got [ −exp(−Z²/2) × 1/(Z√(2π)) ] between −√t and √t,

which then gives me −√2 exp(−t/2) × 1/√(tπ). This is my CDF for T.
So to get the pdf for T, I differentiated the CDF with respect to t as below.

Quotient rule:
Let u = −√2 exp(−t/2), so du = exp(−t/2)/√2.
Let v = √(tπ), so dv = √π/(2√t).

(v du − u dv)/v²

This gives exp(−t/2)/√(2tπ) + exp(−t/2)/(t√(2tπ)).

Where have I gone wrong??

Your integral is wrong. The function exp(-t^2/2) is one of those functions that do NOT have an integral expressible in finite terms of elementary functions. That is why people invented the "erf" function, or the "Phi" function of probability theory. By the way, the statement about there being no finite elementary expression is a rigorously provable _theorem_: one can show that it is IMPOSSIBLE to have such a formula. This goes back to the work of Liouville in the 19th century.

Anyway, you don't need to do the integral. You have F(x) = int[f(t), t=a(x)..b(x)], where f(t) = exp(-t^2/2)/sqrt(2*pi), a(x) = -sqrt(x), and b(x) = sqrt(x). From standard calculus results (the Leibniz integral rule) we have dF/dx = (db/dx)*f(b(x)) - (da/dx)*f(a(x)); that is, we just evaluate the integrand at the endpoint functions and multiply by the derivatives of those endpoint functions, and the integration goes away.
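That rule is easy to verify numerically (a sketch using only the standard library; the trapezoid step count and test point are arbitrary choices):

```python
import math

def f(t):
    # Standard normal pdf.
    return math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)

def F(x, n=100_000):
    # Integrate f from a(x) = -sqrt(x) to b(x) = sqrt(x) by the
    # trapezoid rule, since f has no elementary anti-derivative.
    a, b = -math.sqrt(x), math.sqrt(x)
    step = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * step) for i in range(1, n))
    return total * step

x, h = 2.0, 1e-4  # arbitrary test point and finite-difference step

# dF/dx by central finite differences...
numerical = (F(x + h) - F(x - h)) / (2.0 * h)

# ...versus the Leibniz rule: b'(x) f(b(x)) - a'(x) f(a(x))
#    = f(sqrt(x)) / (2 sqrt(x)) + f(-sqrt(x)) / (2 sqrt(x)).
leibniz = (f(math.sqrt(x)) + f(-math.sqrt(x))) / (2.0 * math.sqrt(x))
print(numerical, leibniz)  # the two values agree
```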

RGV
 
