Computing Expected Value of Y

In summary, if X is a random variable with a Uniform(0,1) distribution and Y = X^2 + 1, then E[Y] = E[X^2] + 1 = 4/3, Var[Y] = E[Y^2] - (E[Y])^2 = 4/45, and Cov[X,Y] = E[XY] - E[X]E[Y] = 1/12.
  • #1
jetoso
Let X be a random variable with Uniform(0,1) distribution. Let Y = X^2 + 1. Compute E[Y].
My question is whether I should compute the expected value of Y by integrating x(x^2 + 1)dx from 0 to 1.
 
  • #2
Two ways to do it:

1. Calculate the expected value of x^2 + 1, integrating over x from 0 to 1.

2. Calculate the expected value of y, integrating over y from 1 to 2.
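Here is a minimal numerical sketch (not part of the original thread) checking that both routes give the same answer, assuming X ~ Uniform(0,1), Y = X^2 + 1, and the transformed density f_Y(y) = 1/(2*sqrt(y - 1)) on (1, 2):

```python
# Numerical check of both routes for E[Y], assuming X ~ Uniform(0, 1) and Y = X^2 + 1.
import math
from scipy.integrate import quad

# Method 1: E[Y] = integral over x in (0, 1) of (x^2 + 1) * f_X(x) dx, with f_X(x) = 1 on (0, 1).
e_y_method1, _ = quad(lambda x: (x**2 + 1) * 1.0, 0.0, 1.0)

# Method 2: E[Y] = integral over y in (1, 2) of y * f_Y(y) dy,
# where f_Y(y) = 1 / (2*sqrt(y - 1)) comes from the change of variables y = x^2 + 1.
# The endpoint singularity at y = 1 is integrable, so quad handles it.
e_y_method2, _ = quad(lambda y: y / (2.0 * math.sqrt(y - 1.0)), 1.0, 2.0)

print(e_y_method1, e_y_method2)  # both close to 4/3 ≈ 1.3333
```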
 
  • #3
integrating from 0 to 1 for x(x^2 + 1)dx?

Your integrand is wrong - leave out the x outside the parentheses.
 
  • #4

juvenal said:
Two ways to do it:

1. Calculate the expected value of x^2 + 1, integrating over x from 0 to 1.

2. Calculate the expected value of y, integrating over y from 1 to 2.

I am a little confused here. For a nonnegative r.v., say X, it is supposed that E[X] = integral from 0 to infinity of x F(dx), or of x f(x) dx; in this case, which one is the pdf (probability density function)?

For the second case, could you please explain why the integration over y goes from 1 to 2?

Thanks.
 
  • #5
jetoso said:
I am a little confused here. For a nonnegative r.v., say X, it is supposed that E[X] = integral from 0 to infinity of x F(dx), or of x f(x) dx; in this case, which one is the pdf (probability density function)?

For the second case, could you please explain why the integration over y goes from 1 to 2?

Thanks.

First case - pdf (=f(x)) is uniform and takes on the value of 1 from 0 to 1, 0 outside.

Second case, change of variables. y = x^2 + 1. y(0) = 1. y(1) = 2.
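As a symbolic sketch of this change of variables (not from the thread), the density of Y can be derived from its CDF and then integrated; this assumes sympy and the same setup, Y = X^2 + 1 with X ~ Uniform(0,1):

```python
# Derive f_Y from the CDF of Y and check E[Y] symbolically.
import sympy as sp

y = sp.symbols('y', positive=True)

# CDF of Y on (1, 2): P(Y <= y) = P(X^2 + 1 <= y) = P(X <= sqrt(y - 1)) = sqrt(y - 1)
F_Y = sp.sqrt(y - 1)

# Density of Y by differentiating the CDF
f_Y = sp.diff(F_Y, y)                    # 1 / (2*sqrt(y - 1))

# E[Y] = integral from 1 to 2 of y * f_Y(y) dy
E_Y = sp.integrate(y * f_Y, (y, 1, 2))
print(f_Y, E_Y)                          # 1/(2*sqrt(y - 1)), 4/3
```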
 
  • #6

juvenal said:
First case - pdf (=f(x)) is uniform and takes on the value of 1 from 0 to 1, 0 outside.

Second case, change of variables. y = x^2 + 1. y(0) = 1. y(1) = 2.


Thank you!
 
  • #7

mathman said:
Your integrand is wrong - leave out the x outside the parentheses.

How was that? Do you mean: Integration from 0 to 1 of (x^2+1)dx?
 
  • #8
How was that? Do you mean: Integration from 0 to 1 of (x^2+1)dx?

yes

In general, if X is a random variable, g(X) is any function of X, and f(x) is the probability density function of X, then E(g(X)) = integral of g(x)f(x)dx.

In your case, f(x)=1 for 0<x<1, and f(x)=0 otherwise, while g(X)=X^2+1.
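A small generic sketch of this rule (not from the thread; the helper name expectation_of is hypothetical):

```python
# E[g(X)] = integral of g(x) * f(x) dx, computed numerically.
from scipy.integrate import quad

def expectation_of(g, f, a, b):
    """Numerically compute E[g(X)] when X has density f supported on (a, b)."""
    value, _ = quad(lambda x: g(x) * f(x), a, b)
    return value

uniform_pdf = lambda x: 1.0  # f(x) = 1 on (0, 1), 0 otherwise
print(expectation_of(lambda x: x**2 + 1, uniform_pdf, 0.0, 1.0))  # ~ 4/3
```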
 
  • #9

mathman said:
yes

In general, if X is a random variable, g(X) is any function of X, and f(x) is the probability density function of X, then E(g(X)) = integral of g(x)f(x)dx.

In your case, f(x)=1 for 0<x<1, and f(x)=0 otherwise, while g(X)=X^2+1.

If we were also interested in finding Var[Y] and Cov[X,Y], how can I compute, for instance E[Y^2] and E[XY]?
 
  • #10
Y^2 = X^4 + 2X^2 + 1
XY = X^3 + X
Thus you simply integrate the above X expressions between 0 and 1.
 
  • #11

mathman said:
Y^2 = X^4 + 2X^2 + 1
XY = X^3 + X
Thus you simply integrate the above X expressions between 0 and 1.
I just realized the following:
E[Y] = E[X^2 + 1 ] = E[X^2] + E[1] = 1/3 + 1 = 4/3
E[Y^2] = E[(X^2 + 1)^2] = E[(X^4 + 2X^2 + 1)] = 1/5 + 2/3 + 1 = 28/15
Var[Y] = E[Y^2] - (E[Y])^2 = 28/15 - (4/3)^2 = 28/15 - 16/9 = 4/45
E[XY] = E[X(X^2 + 1)] = E[X^3 + X] = E[X^3] + E[X] = 1/4 + 1/2 = 3/4
Cov[X,Y] = E[XY] - E[X]E[Y] = 3/4 - (1/2)(4/3) = 3/4 - 2/3 = 1/12

Am I right? Doing the integrations directly gives the same results.
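A quick Monte Carlo sanity check of these values (not from the thread), assuming X ~ Uniform(0,1) and Y = X^2 + 1:

```python
# Simulate X ~ Uniform(0, 1), form Y = X^2 + 1, and compare sample moments to the exact answers.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)
y = x**2 + 1

print(y.mean())            # ~ 4/3   (E[Y])
print(y.var())             # ~ 4/45  (Var[Y])
print((x * y).mean())      # ~ 3/4   (E[XY])
print(np.cov(x, y)[0, 1])  # ~ 1/12  (Cov[X,Y])
```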
 
  • #12
You certainly have the right idea. I haven't checked your arithmetic thoroughly, but it looks ok.
 

1. What is the definition of expected value?

The expected value of a random variable Y is the sum of the possible values of Y multiplied by their respective probabilities. In other words, it is the average value that we would expect to get if we were to repeat an experiment multiple times.

2. How is the expected value of Y calculated?

To calculate the expected value of Y, we multiply each possible value of Y by its probability and then add all of these products together. This can be represented mathematically as E(Y) = Σ(y * P(y)), where y represents each possible value of Y and P(y) represents the probability of getting that value.
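For example, here is a minimal sketch (not from the thread) for a fair six-sided die, where each face has probability 1/6:

```python
# Expected value of a discrete random variable: E(Y) = sum over y of y * P(y).
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

expected = sum(y * p for y, p in zip(values, probs))
print(expected)  # 3.5
```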

3. What does the expected value represent?

The expected value represents the long-term average value that we would expect to get if we were to repeat an experiment multiple times. It is not necessarily the value that we will get on any single trial, but rather the average value over many trials.

4. Can the expected value of Y be negative?

Yes, the expected value of Y can be negative. This means that, on average, we would expect to get a negative value if we were to repeat the experiment multiple times. However, this does not necessarily mean that we will always get a negative value on any single trial.

5. How is the concept of expected value used in computing?

In computing, the concept of expected value is often used in decision making and optimization. By calculating the expected value of different outcomes, we can determine the best course of action to take in a given situation. It is also used in various statistical and machine learning algorithms to make predictions and estimate probabilities.
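As a toy illustration (not from the thread), two hypothetical options can be compared by their expected payoffs:

```python
# Compare two hypothetical options by expected payoff; the data below is made up for illustration.
option_a = [(0.9, 10.0), (0.1, -50.0)]  # (probability, payoff) pairs
option_b = [(0.5, 4.0), (0.5, 2.0)]

ev = lambda outcomes: sum(p * payoff for p, payoff in outcomes)
print(ev(option_a), ev(option_b))       # 4.0 vs 3.0 -> option A has the higher expected payoff
```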
