Expectation operator - linearity

SUMMARY

The expectation operator ##E(\cdot)## is a linear operator: ##E(a\bar{x}+b\bar{y})=aE(\bar{x})+bE(\bar{y})##. This follows from the integral representation of the expectation, ##E(\bar{x})=\int_{-\infty}^{+\infty}x f_{\bar{x}}(x)\,dx##, where ##f_{\bar{x}}## is the probability density function of the random variable ##\bar{x}##. The discussion turns on how the density of a combination of random variables relates to the individual densities, in particular via convolution for independent variables, while noting that ##E(aX+bY)=aE(X)+bE(Y)## holds in general, even for dependent variables.

PREREQUISITES
  • Understanding of linear operators in probability theory
  • Familiarity with probability density functions (pdfs)
  • Knowledge of convolution in the context of random variables
  • Basic concepts of measure theory (for advanced understanding)
NEXT STEPS
  • Study the properties of linear operators in probability theory
  • Learn about convolution and its applications in probability distributions
  • Explore the "theorem of the unconscious statistician" and its implications
  • Investigate measure theory and its relevance to probability and statistics
USEFUL FOR

Students of statistics, mathematicians, and data scientists who are looking to deepen their understanding of expectation operators and their properties in probability theory.

Pietair

Homework Statement


Show that the expectation operator ##E(\cdot)## is a linear operator, that is:
$$E(a\bar{x}+b\bar{y})=aE(\bar{x})+bE(\bar{y})$$

Homework Equations


$$E(\bar{x})=\int_{-\infty}^{+\infty}x f_{\bar{x}}(x)\,dx$$

where ##f_{\bar{x}}## is the probability density function of the random variable ##\bar{x}##.

The Attempt at a Solution


$$aE(\bar{x})=a\int_{-\infty}^{+\infty}x f_{\bar{x}}(x)\,dx \quad\text{and}\quad bE(\bar{y})=b\int_{-\infty}^{+\infty}y f_{\bar{y}}(y)\,dy$$

Introducing a new random variable:
$$\bar{v}=a\bar{x}+b\bar{y}$$

Then:

$$E(\bar{v})=E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}v f_{\bar{v}}(v)\,dv=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)\,dv$$

And accordingly:
$$E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)\,dv=a\int_{-\infty}^{+\infty}x f_{\bar{v}}(v)\,dv+b\int_{-\infty}^{+\infty}y f_{\bar{v}}(v)\,dv$$

So what remains to prove is that:

$$a\int_{-\infty}^{+\infty}x f_{\bar{v}}(v)\,dv+b\int_{-\infty}^{+\infty}y f_{\bar{v}}(v)\,dv=a\int_{-\infty}^{+\infty}x f_{\bar{x}}(x)\,dx+b\int_{-\infty}^{+\infty}y f_{\bar{y}}(y)\,dy$$

And now I am stuck: I don't know how to relate the p.d.f. of ##\bar{v}## to the p.d.f.s of ##\bar{x}## and ##\bar{y}##.

Thank you in advance!
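
As a quick numerical sanity check of the identity being asked about, here is a minimal Python sketch; the distributions and the coefficients a = 2, b = -3 are arbitrary illustrative choices, not part of the original problem. Note that the sample mean is itself linear in the data, which mirrors exactly the property to be proved for ##E(\cdot)##.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
a, b = 2.0, -3.0
x = rng.exponential(scale=1.5, size=n)      # E(X) = 1.5
y = rng.normal(loc=0.7, scale=2.0, size=n)  # E(Y) = 0.7

# Estimate E(aX + bY) directly, and via aE(X) + bE(Y):
print(np.mean(a * x + b * y))               # the two agree (up to floating-point
print(a * np.mean(x) + b * np.mean(y))      # rounding), since the sample mean is linear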
 
Haven't you seen some kind of result that gives the pdf of X+Y in terms of the pdfs of X and Y? Hint: it has to do with convolution.
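
For the independent case, here is a minimal numerical sketch of that hint; the grid and the choice of Exp(1) and Uniform(0,1) densities are illustrative assumptions. The density of ##V = X + Y## is the convolution ##f_V = f_X * f_Y##, and the mean computed from ##f_V## comes out to ##E(X)+E(Y)##:

import numpy as np

dx = 0.001
x = np.arange(0.0, 20.0, dx)
f_x = np.exp(-x)                   # Exp(1) density on [0, 20); E(X) = 1
f_y = np.where(x < 1.0, 1.0, 0.0)  # Uniform(0, 1) density on the same grid; E(Y) = 0.5

f_v = np.convolve(f_x, f_y) * dx   # f_V = f_X * f_Y for independent X, Y
v = np.arange(f_v.size) * dx       # both grids start at 0, so the convolved grid does too

print((f_v * dx).sum())            # ~ 1.0: f_V integrates to one
print((v * f_v * dx).sum())        # ~ 1.5 = E(X) + E(Y)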
 
Thank you for your reply.

I know that the probability distribution of the sum of two or more random variables is the convolution of their individual pdfs, but as far as I know this is only valid for independent random variables. Meanwhile, ##E(aX+bY)=aE(X)+bE(Y)## is true in general, right?
 
Yes to both.
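
To see the second point concretely, here is a minimal sketch with strongly dependent variables; the construction Y = X² + noise is an arbitrary illustrative choice. The convolution formula for the density of X + Y no longer applies here, yet ##E(aX+bY)=aE(X)+bE(Y)## still holds:

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(size=n)
y = x**2 + rng.normal(size=n)           # Y is built from X, so they are dependent; E(Y) = 1

a, b = 5.0, 0.5
print(np.mean(a * x + b * y))           # ~ a*0 + b*1 = 0.5
print(a * np.mean(x) + b * np.mean(y))  # the same, by linearity of the (sample) mean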
 

Depending on the level of rigor required, you may have to qualify things a bit. For example, if Y = -X and EX = +∞ (so EY = -∞), then E(X+Y) = EX + EY breaks down, because we would be trying to equate 0 to ∞ - ∞, which is not allowed.

So, the easiest way is to assume that both EX and EY are finite. Now you really have a two-stage task:
(1) Prove the "theorem of the unconscious statistician", which says that if Z = g(X,Y), then
$$EZ \equiv \int z \, dF_Z(z)$$
can be written as
$$\int g(x,y) \, d^2F_{XY}(x,y),$$
which becomes
$$\sum_{x,y} g(x,y)\, P_{XY}(x,y)$$
in the discrete case where there are no densities, and becomes
$$\int g(x,y)\, f_{XY}(x,y) \, dx \, dy$$
in the continuous case where there are densities. Of course, the result is also true in a mixed continuous-discrete case where there are densities and point-mass probabilities, but then we need to write Stieltjes integrals, etc.
(2) Then do the much easier task of proving that
$$\int (ax + by)\, f_{XY}(x,y) \, dx \, dy = a \int x f_X(x) \, dx + b \int y f_Y(y) \, dy = a\,EX + b\,EY.$$
The last form holds in general, whether the random variables are continuous, discrete or mixed. As you say, it holds even when X and Y are dependent.
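
Spelled out, stage (2) is just marginalization, with ##f_X(x)=\int f_{XY}(x,y)\,dy## and ##f_Y(y)=\int f_{XY}(x,y)\,dx##, and Fubini justifying the interchange of integrals when EX and EY are finite:
$$\int\!\!\int (ax+by)\,f_{XY}(x,y)\,dx\,dy = a\int x\left(\int f_{XY}(x,y)\,dy\right)dx + b\int y\left(\int f_{XY}(x,y)\,dx\right)dy = a\int x f_X(x)\,dx + b\int y f_Y(y)\,dy.$$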
 
Ray Vickson said:
Now you really have a two-stage task: (1) Prove the "theorem of the unconscious statistician" ...

Hmm, so there really is no elementary proof? (I guess it depends on what you call elementary, though.) This kind of makes me happy that I know measure theory; it makes the proof of this result so much simpler.
 
