Squaring uniform/normal distribution and expectation

AI Thread Summary
The discussion focuses on the distribution of Y=X^2 when X is uniformly distributed on the interval [-a,a] and when X follows a standard normal distribution N(0,1). For the uniform case, the cumulative distribution function (CDF) of Y can be derived by calculating the probability P{Y≤y}, leading to a method for finding the probability density function (PDF) through differentiation. The expectation E[Y] can be calculated using the law of the lazy statistician, which states that E(g(X)) can be determined from the distribution of X without needing Y's distribution. In both cases, the expectation E[Y] relates to E[X^2], and for the normal distribution, Y follows a chi-squared distribution with one degree of freedom. Understanding these relationships clarifies the computations of expectations and variances for squared random variables.
rukawakaede
Suppose X is a uniformly distributed random variable on an interval [-a,a] for some real a.
Let Y=X^2. What can you say about the distribution of Y? I have no idea how to think about this distribution.
Also, how can we compute the expectation of Y? I know that E[X]=0, but what can I conclude about E[Y]=E[X^2] and E[XY]=E[X^3]?
Is E[Y]=Var[X], since E[X]=0?

Similarly, suppose X~N(0,1) is a standard normal random variable. What can we say about the distribution of Y=X^2?

I hope someone can help clear up my confusion.
 
Hi rukawakaede, :smile:

The distribution of a square can easily be calculated as follows:

$$F_Y(y)=P\{Y\leq y\}=P\{X^2\leq y\}=P\{-\sqrt{y}\leq X\leq \sqrt{y}\}=P\{X\leq \sqrt{y}\}-P\{X<-\sqrt{y}\}=F_X(\sqrt{y})-F_X(-\sqrt{y})$$

where in the last step we've used that the distribution is continuous. Now, to obtain the pdf, just differentiate both sides.
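
For instance, in your uniform case on [-a,a] (taking a>0), we have F_X(x)=(x+a)/(2a) for -a≤x≤a, so for 0<y≤a^2

$$F_Y(y)=\frac{\sqrt{y}+a}{2a}-\frac{-\sqrt{y}+a}{2a}=\frac{\sqrt{y}}{a},\qquad f_Y(y)=F_Y'(y)=\frac{1}{2a\sqrt{y}},$$

and f_Y(y)=0 outside (0,a^2]. The density blows up near 0, but it is still integrable.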

Now, to obtain the expectation, you can calculate this with the distribution function obtained above. But there's a simpler way. The so-called "law of the lazy statistician" gives us that

$$E(g(X))=\int_{-\infty}^{+\infty}{g(x)f_X(x)dx}$$

So, in particular

$$E(X^2)=\int_{-\infty}^{+\infty}{x^2f_X(x)dx}$$
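
For your uniform example, f_X(x)=1/(2a) on [-a,a], so

$$E(X^2)=\int_{-a}^{a}x^2\,\frac{1}{2a}\,dx=\frac{a^2}{3},$$

and since E(X)=0 this equals Var(X), so indeed E[Y]=Var[X]. By the same formula E[X^3]=0 by symmetry, so E[XY]=0. For X~N(0,1) you get E(X^2)=Var(X)=1, and in that case Y=X^2 has a chi-squared distribution with one degree of freedom.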

So, to obtain the expectation of X^2, there is no need to know the distribution of X^2. Knowing only the distribution of X is enough!
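
If you want to double-check these values numerically, here is a minimal Monte Carlo sketch (assuming NumPy is available; the choice a = 2.0 and the sample size are arbitrary):

```python
# A quick Monte Carlo sanity check of the results above (a sketch; assumes
# NumPy is installed, and a = 2.0 is just an arbitrary example value).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Uniform case: X ~ Uniform[-a, a], Y = X^2
a = 2.0
x = rng.uniform(-a, a, size=n)
y = x**2
print(y.mean(), a**2 / 3)   # empirical E[Y] vs. the exact value a^2/3

# Standard normal case: X ~ N(0,1), Y = X^2
z = rng.standard_normal(n)
w = z**2
print(w.mean(), w.var())    # should be close to 1 and 2, the mean and
                            # variance of a chi-squared(1) distribution
```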
 