Expectation operator - linearity

The discussion focuses on proving the linearity of the expectation operator, specifically that E(aX + bY) = aE(X) + bE(Y). Participants explore the relationship between the probability density functions (pdfs) of random variables and the convolution theorem, noting that while convolution applies to independent variables, the linearity of expectation holds generally. A key point raised is the need to ensure that the expected values are finite to avoid contradictions in cases like Y = -X. The conversation also touches on the "theorem of the unconscious statistician," which provides a framework for calculating expectations in terms of joint distributions. Overall, the proof requires understanding both the properties of expectation and the relationship between random variables' distributions.
Pietair

Homework Statement


Show that the expectation operator E() is a linear operator, i.e. show that:
E(a\bar{x}+b\bar{y})=aE(\bar{x})+bE(\bar{y})

Homework Equations


E(\bar{x})=\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx

with f_{\bar{x}} the probability density function of the random variable \bar{x}.

The Attempt at a Solution


aE(\bar{x})=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx and:
bE(\bar{y})=b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy

Introducing a new random variable:
\bar{v}=a\bar{x}+b\bar{y}

Then:

E(\bar{v})=E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}vf_{\bar{v}}(v)dv=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)dv

And accordingly:
E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{v}}(v)dv+b\int_{-\infty}^{+\infty}yf_{\bar{v}}(v)dv

So what remains to prove is that:

a\int_{-\infty}^{+\infty}xf_{\bar{v}}(v)dv+b\int_{-\infty}^{+\infty}yf_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx+b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy

And now I am stuck... I don't know how to relate the p.d.f. of the random variable \bar{v} to the p.d.f.s of the random variables \bar{x} and \bar{y}.

Thank you in advance!
 
Haven't you seen some kind of result that gives the pdf of X+Y in terms of the pdfs of X and Y? Hint: it has to do with convolution.
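For reference, the standard result (valid when ##X## and ##Y## are independent and have densities) is that the sum has density

f_{X+Y}(s)=\int_{-\infty}^{+\infty}f_X(x)\,f_Y(s-x)\,dx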
 
Thank you for your reply.

I know that the probability distribution of the sum of two or more random variables is the convolution of their individual pdfs, but as far as I know this is only valid for independent random variables. ##E(aX+bY)=aE(X)+bE(Y)##, on the other hand, is true in general, right?
 
Pietair said:
I know that the probability distribution of the sum of two or more random variables is the convolution of their individual pdfs, but as far as I know this is only valid for independent random variables. ##E(aX+bY)=aE(X)+bE(Y)##, on the other hand, is true in general, right?
Yes to both.
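As a quick numerical sanity check (not a proof), here is a short simulation sketch with deliberately dependent random variables; the distributions and the constants a, b below are arbitrary choices for illustration:

[code]
import numpy as np

# Check E(aX + bY) = aE(X) + bE(Y) numerically, with Y strongly dependent on X.
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.normal(1.0, 2.0, size=n)           # E(X) = 1
y = x**2 + rng.uniform(-1.0, 1.0, size=n)  # Y depends on X; E(Y) = E(X^2) = 4 + 1 = 5

a, b = 3.0, -0.5
estimate = np.mean(a * x + b * y)  # Monte Carlo estimate of E(aX + bY)
theory = a * 1.0 + b * 5.0         # aE(X) + bE(Y) = 0.5

print(f"estimate = {estimate:.4f}, theory = {theory:.4f}")
[/code]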
 

Depending on the level of rigor required, you may have to qualify things a bit. For example, if Y = -X, then E(X+Y) = EX + EY is false if EX = EY = ∞, because we would be trying to equate 0 to ∞ - ∞, which is not allowed.

So, the easiest way is to assume that both EX and EY are finite. Now you really have a 2-stage task:
(1) Prove the "theorem of the unconscious statistician", which says that if Z = g(X,Y), then
EZ \equiv \int z \: dF_Z(z)
can be written as
\int g(x,y) \: d^2F_{XY}(x,y),
which becomes
\sum_{x,y} g(x,y)\: P_{XY}(x,y)
in the discrete case where there are no densities, and becomes
\int g(x,y) f_{XY}(x,y) \, dx \, dy
in the continuous case where there are densities. Of course, the result is also true in a mixed continuous-discrete case where there are densities and point-mass probabilities, but then we need to write Stieltjes integrals, etc. (2) Then you need to do the much easier task of proving that
\int (ax + by) f_{XY}(x,y) \, dx \, dy = a \int x f_X(x) \, dx + b \int y f_Y(y) \, dy = a\, EX + b\, EY.
The last form holds in general, whether the random variables are continuous, discrete or mixed. As you say, it holds even when X and Y are dependent.
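For completeness, here is a sketch of step (2) in the continuous case, assuming ##E|X|## and ##E|Y|## are finite so that the double integral may be split and the order of integration exchanged (Fubini). The marginal densities come from integrating out the other variable:

f_X(x)=\int_{-\infty}^{+\infty}f_{XY}(x,y)\,dy, \qquad f_Y(y)=\int_{-\infty}^{+\infty}f_{XY}(x,y)\,dx

and therefore

\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}(ax+by)\,f_{XY}(x,y)\,dx\,dy = a\int_{-\infty}^{+\infty}x\,f_X(x)\,dx + b\int_{-\infty}^{+\infty}y\,f_Y(y)\,dy = a\,EX + b\,EY.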
 

Hmm, so there really is no elementary proof? (I guess it depends on what you call elementary, though.) This kind of makes me happy that I know measure theory; it makes the proof of this result so much simpler.
 