Mean of a square of a random variable


Homework Help Overview

The discussion revolves around the distribution of the square of a standard Gaussian random variable, specifically Z², and the challenge of determining its mean. Participants explore the relationship between Z² and the chi-square distribution, as well as the difficulties in deriving the probability density function (pdf) and the mean of Z².

Discussion Character

  • Exploratory, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss the pdf of Z² and its relation to the chi-square distribution with 1 degree of freedom. There are attempts to derive the pdf through integration and differentiation, with some questioning the correctness of their methods. Others express uncertainty about the assumptions regarding the mean of Z².

Discussion Status

There is an ongoing exploration of the pdf and mean of Z², with some participants providing insights into the relationship with the chi-square distribution. Multiple interpretations and methods are being discussed, but no consensus has been reached on the correct approach or solution.

Contextual Notes

Participants note the requirement to derive the distribution without using established results about the chi-square distribution, which adds complexity to the problem. There is also mention of the need to compute moment generating functions for related random variables, indicating a broader context for the discussion.

stukbv

Homework Statement



If Z has a standard Gaussian distribution then what is the distribution of Z² and what is its mean?

The Attempt at a Solution


Let T = Z². Then we can get that

f_T(T) = e^(-T/2) (1 + 1/T) × 1/√(2πT)

I am not sure if this is correct and don't know how to find the mean; working out E[T] is very difficult. Can I just assume that the mean is 0 or not?
 
Hi stukbv! :smile:

Did you know that the square of the standard gaussian distribution is the chi-square distribution with 1 degree of freedom?

Your pdf does not seem to be quite right though.
How did you get the factor (1+1/T)?
 
stukbv said:

Homework Statement



If Z has a standard Gaussian distribution then what is the distribution of Z² and what is its mean?

The Attempt at a Solution


Let T = Z². Then we can get that

f_T(T) = e^(-T/2) (1 + 1/T) × 1/√(2πT)

I am not sure if this is correct and don't know how to find the mean; working out E[T] is very difficult. Can I just assume that the mean is 0 or not?

Hey, it is going to be a chi-squared distribution with 1 degree of freedom, by the definition of chi-squared. The mean follows from there as well.
 
I know that it's chi-squared, but my question requires me to work it through from first principles and then conclude that it is a chi-squared distribution (i.e. without using the chi-squared mean etc. at any point).

To get the pdf, I did F_T(t) = P[T ≤ t] = P[Z² ≤ t] = P[-√t ≤ Z ≤ √t].
Then I did this integral with -√t and √t as limits, so I got -e^(-Z²/2) × 1/(Z√(2π)).
Then I differentiated my answer (F_T) to get f_T.
 
Good!

But let's see if I can reproduce that...

[tex]P(-\sqrt t \le Z \le \sqrt t) = \Phi(\sqrt t) - \Phi(-\sqrt t)[/tex]

Differentiate and find:
[tex]pdf(t) = {\phi(\sqrt t) \over 2\sqrt t} - {\phi(-\sqrt t) \over -2\sqrt t} = {e^{-t/2} \over \sqrt{2\pi t}}[/tex]
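This pdf (and the mean question from the original post) is easy to sanity-check numerically. A minimal sketch in Python, assuming NumPy; the sample size and window width are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)
t = z**2  # T = Z^2, which should be chi-square with 1 degree of freedom

# E[T] = E[Z^2] = Var(Z) = 1 for a standard Gaussian, so the mean is 1, not 0
print(t.mean())

# Compare an empirical density estimate to pdf(t) = e^{-t/2} / sqrt(2*pi*t)
for point in (0.5, 1.0, 2.0):
    pdf = np.exp(-point / 2) / np.sqrt(2 * np.pi * point)
    # fraction of samples within +-0.05 of `point`, divided by the window width
    empirical = np.mean(np.abs(t - point) < 0.05) / 0.1
    print(point, pdf, empirical)
```

The printed empirical densities should track the formula closely, and the sample mean settles near 1.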
 
Ok, so when you do the first step and integrate, do you do it between -√t and √t, or between -∞ and √t and then subtract the piece from -∞ to -√t? I am not sure I understand what you have done.
 
I've left a few steps out.
But you can do it either way.

If you have an integral from -√t to +√t, the result is the difference of the anti-derivative of both values.
[itex]\Phi[/itex] represents the anti-derivative, which is also the cdf of the standard Gaussian distribution.

When you take the derivative of [itex]\Phi[/itex], you get [itex]\phi[/itex], which represents the pdf of the standard Gaussian distribution.
 
Sorry to be a pain, but I still don't see how you get that pdf.
I'll show you what I did so you can see where I am going wrong:
T = Z²

P[T ≤ t] = ∫ (1/√(2π)) exp(-Z²/2) dZ between -√t and √t

So I got [ -exp(-Z²/2) × 1/(Z√(2π)) ] between -√t and √t

Which then gives me -√2 exp(-t/2) × 1/√(tπ); this is my CDF for T.
So to get the pdf for T I differentiated the CDF with respect to t as below:

Quotient rule:
Let u = -√2 exp(-t/2), so du = exp(-t/2)/√2
Let v = √(tπ), so dv = √π/(2√t)

(v du - u dv) / v²

This gives exp(-t/2)/√(2tπ) + exp(-t/2)/(t√(2tπ))

Where have I gone wrong??
 
stukbv said:
Sorry to be a pain, but I still don't see how you get that pdf.
I'll show you what I did so you can see where I am going wrong:
T = Z²

P[T ≤ t] = ∫ (1/√(2π)) exp(-Z²/2) dZ between -√t and √t

So I got [ -exp(-Z²/2) × 1/(Z√(2π)) ] between -√t and √t

This is not the anti-derivative.
Try to take the derivative and you should see that it does not match.

More specifically, the function does not have an anti-derivative expressible in standard functions.
It can only be evaluated numerically.
However, the symbol [itex]\Phi[/itex] was invented to represent the anti-derivative.
 
  • #10
Ok, I see. Basically my problem is that the whole question I have to answer is:
Z1 and Z2 are standard normal variables and are iid.
(Y1, Y2) = (Z1² + Z2², Z1² − Z2²)

Compute the mgf of the bivariate RV (Y1, Y2) and the marginal mgfs; can you recognise any of the distributions?

So I thought that by working out the distributions of Y1 and Y2 (hence why I asked about working out Z²) I could maybe work out the joint mgf from there, but I'm beginning to think that's not the right way? I think once I can do the mgf I can do the rest, but obviously I can't get started until I have the joint mgf!?
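Whatever closed form you end up deriving, you can check it by Monte Carlo. A sketch in Python, assuming NumPy; it also uses the result E[e^(aZ²)] = (1 − 2a)^(-1/2) for a < 1/2, which is exactly what the exercise asks you to derive, so treat it purely as a cross-check:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
y1 = z1**2 + z2**2  # Y1
y2 = z1**2 - z2**2  # Y2

s, u = 0.1, 0.05  # chosen so that 2*(s + |u|) < 1, where the mgf exists
mc = np.mean(np.exp(s * y1 + u * y2))  # Monte Carlo estimate of E[e^{s Y1 + u Y2}]

# s*Y1 + u*Y2 = (s+u)*Z1^2 + (s-u)*Z2^2, and Z1, Z2 are independent, so the
# joint mgf factors: E[e^{(s+u)Z1^2}] * E[e^{(s-u)Z2^2}] = ((1-2(s+u))(1-2(s-u)))^{-1/2}
closed = ((1 - 2 * (s + u)) * (1 - 2 * (s - u))) ** -0.5
print(mc, closed)
```

The independence factorisation in the comment is the key step: the joint mgf splits into a product of one-dimensional mgfs of Z1² and Z2².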
 
  • #11
I don't like TLAs*! :frown:

What are iid, mgf, and RV?
(I don't feel like looking everything up, and English is not my native language.)






*TLA = three-letter abbreviation :wink:.
 
  • #12
Oh ok, iid means independent and identically distributed, mgf is moment generating function, and RV is random variable.
 
  • #13
It seems to me you have the right idea.
So it would be a matter of correctly defining the probabilities and properly integrating.
 
  • #14
stukbv said:
Sorry to be a pain, but I still don't see how you get that pdf.
I'll show you what I did so you can see where I am going wrong:
T = Z²

P[T ≤ t] = ∫ (1/√(2π)) exp(-Z²/2) dZ between -√t and √t

So I got [ -exp(-Z²/2) × 1/(Z√(2π)) ] between -√t and √t

Which then gives me -√2 exp(-t/2) × 1/√(tπ); this is my CDF for T.
So to get the pdf for T I differentiated the CDF with respect to t as below:

Quotient rule:
Let u = -√2 exp(-t/2), so du = exp(-t/2)/√2
Let v = √(tπ), so dv = √π/(2√t)

(v du - u dv) / v²

This gives exp(-t/2)/√(2tπ) + exp(-t/2)/(t√(2tπ))

Where have I gone wrong??

Your integral is wrong. The function exp(-t^2/2) is one of those functions that do NOT have an integral expressible in finite terms of elementary functions. That is why people invented the "erf" function, or the "Phi" function of probability theory. By the way: the statement about no finite elementary expression is a rigorously provable _theorem_: one can show that it is IMPOSSIBLE to have such a formula. This goes back to work of Liouville in the 19th century.

Anyway, you don't need to do the integral. You have F(x) = int[f(t), t=a(x)..b(x)], where f(t) = exp(-t^2/2)/sqrt(2*pi) and a(x) = -sqrt(x), b(x) = sqrt(x). From standard calculus results we have dF/dx = (db/dx)*f(b(x)) - (da/dx)*f(a(x)); that is, we just evaluate the integrand at the endpoint functions and multiply by the derivatives of the endpoint functions; the integration goes away.
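That differentiation rule (Leibniz's rule for an integral with variable limits) is easy to verify numerically. A minimal sketch in Python, using the standard library's erf to get Φ:

```python
import math

def Phi(z):
    # standard normal cdf, written via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def F(x):
    # F(x) = P(-sqrt(x) <= Z <= sqrt(x)), the cdf of T = Z^2
    return Phi(math.sqrt(x)) - Phi(-math.sqrt(x))

def pdf(x):
    # what the Leibniz rule gives: e^{-x/2} / sqrt(2*pi*x)
    return math.exp(-x / 2) / math.sqrt(2 * math.pi * x)

h = 1e-6
for x in (0.5, 1.0, 3.0):
    numeric = (F(x + h) - F(x - h)) / (2 * h)  # central-difference derivative
    print(x, numeric, pdf(x))
```

The central-difference derivative of F matches the pdf formula to several decimal places, with no anti-derivative of the integrand ever needed.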

RGV
 
