Is Integrating x(x^2 + 1) from 0 to 1 the Correct Method to Compute E[Y]?

jetoso
Let X be a random variable with Uniform(0,1) distribution. Let Y = X^2 + 1. Compute E[Y].
The question here is: should I compute the expected value of Y by integrating x(x^2 + 1) dx from 0 to 1?
 
Two ways to do it:

1. Calculate the expected value of x^2 + 1, integrating over x from 0 to 1.

2. Calculate the expected value of y, integrating over y from 1 to 2.
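To fill in the first route (a worked step added here, not part of the original reply), with f(x) = 1 on (0,1):

$$E[Y] = \int_0^1 (x^2 + 1)\,dx = \left[\frac{x^3}{3} + x\right]_0^1 = \frac{1}{3} + 1 = \frac{4}{3}.$$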
 
jetoso said:
integrating x(x^2 + 1) dx from 0 to 1?

Your integrand is wrong - leave out the x outside the parentheses.
 

juvenal said:
Two ways to do it:

1. Calculate the expected value of x^2 + 1, integrating over x from 0 to 1.

2. Calculate the expected value of y, integrating over y from 1 to 2.

I am a little confused here; for a nonnegative r.v., say X, it is supposed that E[X] = integral from 0 to infinity of x F(dx) or x f(x) dx; now, in this case, which one is the pdf (probability density function)?

For the second case, could you please explain why the integration over y goes from 1 to 2?

Thanks.
 
jetoso said:
I am a little confused here; for a nonnegative r.v., say X, it is supposed that E[X] = integral from 0 to infinity of x F(dx) or x f(x) dx; now, in this case, which one is the pdf (probability density function)?

For the second case, could you please explain why the integration over y goes from 1 to 2?

Thanks.

First case - pdf (=f(x)) is uniform and takes on the value of 1 from 0 to 1, 0 outside.

Second case, change of variables. y = x^2 + 1. y(0) = 1. y(1) = 2.
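To spell out the second route (a sketch added here, not part of the original reply): since Y = X^2 + 1 is increasing on (0,1), P(Y <= y) = P(X <= sqrt(y-1)) = sqrt(y-1) for 1 < y < 2, so the density of Y is f_Y(y) = 1/(2 sqrt(y-1)) on (1,2). Substituting u = y - 1,

$$E[Y] = \int_1^2 \frac{y}{2\sqrt{y-1}}\,dy = \int_0^1 \frac{u+1}{2\sqrt{u}}\,du = \left[\frac{u^{3/2}}{3} + \sqrt{u}\right]_0^1 = \frac{4}{3},$$

which matches the first route.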
 

juvenal said:
First case - pdf (=f(x)) is uniform and takes on the value of 1 from 0 to 1, 0 outside.

Second case, change of variables. y = x^2 + 1. y(0) = 1. y(1) = 2.


Thank you!
 

mathman said:
Your integrand is wrong - leave out the x outside the parentheses.

How was that? Do you mean: Integration from 0 to 1 of (x^2+1)dx?
 
jetoso said:
How was that? Do you mean: Integration from 0 to 1 of (x^2+1)dx?

yes

In general, if X is a random variable, g(X) is any function of X, and f(x) is the probability density function of X, then E[g(X)] = integral of g(x)f(x)dx.

In your case, f(x)=1 for 0<x<1, and f(x)=0 otherwise, while g(X)=X^2+1.
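A quick numerical check of that formula (a Python sketch added here, not part of the thread; it assumes NumPy is available):

import numpy as np

# X ~ Uniform(0,1): f(x) = 1 on (0,1), so E[g(X)] is just the integral of g over (0,1).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)

y = x**2 + 1              # g(X) = X^2 + 1 = Y
print(y.mean())           # Monte Carlo estimate of E[Y]; should be close to 4/3 ≈ 1.3333

The same estimate comes out of numerical quadrature of (x^2 + 1) over (0,1).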
 

mathman said:
yes

In general, if X is a random variable, g(X) is any function of X, and f(x) is the probability density function of X, then E[g(X)] = integral of g(x)f(x)dx.

In your case, f(x)=1 for 0<x<1, and f(x)=0 otherwise, while g(X)=X^2+1.

If we were also interested in finding Var[Y] and Cov[X,Y], how could I compute, for instance, E[Y^2] and E[XY]?
 
Y^2 = X^4 + 2X^2 + 1
XY = X^3 + X
Thus you simply integrate the above X expressions between 0 and 1.
 

mathman said:
Y^2 = X^4 + 2X^2 + 1
XY = X^3 + X
Thus you simply integrate the above X expressions between 0 and 1.
I just realized the following:
E[Y] = E[X^2 + 1] = E[X^2] + E[1] = 1/3 + 1 = 4/3
E[Y^2] = E[(X^2 + 1)^2] = E[(X^4 + 2X^2 + 1)] = 1/5 + 2/3 + 1 = 28/15
Var[Y] = E[Y^2] - (E[Y])^2 = 28/15 - (4/3)^2 = 28/15 - 16/9 = 4/45
E[XY] = E[X(X^2 + 1)] = E[X^3 + X] = E[X^3] + E[X] = 1/4 + 1/2 = 3/4
Cov[X,Y] = E[XY] - E[X]E[Y] = 3/4 - (1/2)(4/3) = 3/4 - 2/3 = 1/12

Am I right? The results also hold when I do the integrals directly.
 
You certainly have the right idea. I haven't checked your arithmetic thoroughly, but it looks ok.
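As a final sanity check on those numbers (a Monte Carlo sketch added here, not part of the thread; it assumes NumPy is available):

import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=2_000_000)   # X ~ Uniform(0,1)
y = x**2 + 1                                # Y = X^2 + 1

print("E[Y]     ", y.mean())                # ~ 4/3  ≈ 1.3333
print("Var[Y]   ", y.var())                 # ~ 4/45 ≈ 0.0889
print("Cov[X,Y] ", np.cov(x, y)[0, 1])      # ~ 1/12 ≈ 0.0833

The printed values should land within a few thousandths of the exact fractions above.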
 