Expectation operator - linearity

In summary, the expectation operator E() is a linear operator: E(a\bar{x}+b\bar{y}) = aE(\bar{x}) + bE(\bar{y}) for any constants a and b, regardless of any dependence between the random variables \bar{x} and \bar{y}. This can be proven using the "theorem of the unconscious statistician", which states that the expectation of a function of random variables can be written as an integral or sum over the joint probability distribution of those variables.
  • #1
Pietair

Homework Statement


Show that the expectation operator E() is a linear operator, i.e. that:
[tex]E(a\bar{x}+b\bar{y})=aE(\bar{x})+bE(\bar{y})[/tex]

Homework Equations


[tex]E(\bar{x})=\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx[/tex]

With [tex]f_{\bar{x}}[/tex] the probability density function of the random variable [tex]\bar{x}[/tex].

The Attempt at a Solution


[tex]aE(\bar{x})=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx[/tex] and:
[tex]bE(\bar{y})=b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy[/tex]

Introducing a new random variable:
[tex]\bar{v}=a\bar{x}+b\bar{y}[/tex]

Then:

[tex]E(\bar{v})=E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}vf_{\bar{v}}(v)dv=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)dv[/tex]

And accordingly:
[tex]E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{v}}(v)dv+b\int_{-\infty}^{+\infty}yf_{\bar{v}}(v)dv[/tex]

So what remains to prove is that:

[tex]a\int_{-\infty}^{+\infty}xf_{\bar{v}}(v)dv+b\int_{-\infty}^{+\infty}yf_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx+b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy[/tex]

And now I am stuck... I don't know how I can relate the p.d.f. of random variable v to the p.d.f.'s of random variables x and y.

Thank you in advance!
 
  • #2
Pietair said:

Homework Statement


Show that the expectation operator E() is a linear operator, or, implying:
[tex]E(a\bar{x}+b\bar{y})=aE(\bar{x})+bE(\bar{y})[/tex]

Homework Equations


[tex]E(\bar{x})=\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx[/tex]

With [tex]f_{\bar{x}}[/tex] the probability density function of random variable x.

The Attempt at a Solution


[tex]aE(\bar{x})=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx[/tex] and:
[tex]bE(\bar{y})=b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy[/tex]

Introducing a new random variable:
[tex]\bar{v}=a\bar{x}+b\bar{y}[/tex]

Then:

[tex]E(\bar{v})=E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}vf_{\bar{v}}(v)dv=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)dv[/tex]

And accordingly:
[tex]E(a\bar{x}+b\bar{y})=\int_{-\infty}^{+\infty}(ax+by)f_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{v}}(v)dv+b\int_{-\infty}^{+\infty}yf_{\bar{v}}(v)dv[/tex]

So what remains to proof is that:

[tex]a\int_{-\infty}^{+\infty}xf_{\bar{v}}(v)dv+b\int_{-\infty}^{+\infty}yf_{\bar{v}}(v)dv=a\int_{-\infty}^{+\infty}xf_{\bar{x}}(x)dx+b\int_{-\infty}^{+\infty}yf_{\bar{y}}(y)dy[/tex]

And now I am stuck... I don't know how I can relate the p.d.f. of random variable v to the p.d.f.'s of random variables x and y.

Thank you in advance!

Haven't you seen some kind of result that gives the pdf of X+Y in terms of the pdfs of X and Y? Hint: it has to do with convolution.
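That convolution can be checked numerically for independent discrete variables. Here is a minimal sketch (not from the thread; the two fair six-sided dice are an assumed example):

```python
# A minimal sketch: for *independent* discrete random variables, the pmf of
# X + Y is the convolution of the individual pmfs.
# Assumed example: X and Y are fair six-sided dice.
px = {k: 1 / 6 for k in range(1, 7)}
py = {k: 1 / 6 for k in range(1, 7)}

# Convolution: P(X + Y = s) = sum over k of P(X = k) * P(Y = s - k)
psum = {}
for i, pi in px.items():
    for j, pj in py.items():
        psum[i + j] = psum.get(i + j, 0.0) + pi * pj

print(round(psum[7], 6))  # → 0.166667  (6/36, the classic result for two dice)
```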
 
  • #3
Thank you for your reply.

I know that the probability distribution of the sum of two or more random variables is the convolution of their individual p.d.f.'s, but as far as I know this is only valid for independent random variables. ##E(aX+bY)=aE(X)+bE(Y)## is true in general, though, right?
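As a numerical sanity check (an illustration, not a proof), one can estimate both sides of ##E(aX+bY)=aE(X)+bE(Y)## by Monte Carlo for strongly dependent variables. The example Y = X² below is my own assumption, chosen so that Y is a deterministic function of X:

```python
import math
import random

# Illustrative check, not a proof: estimate E(aX + bY) and aE(X) + bE(Y)
# for strongly *dependent* variables. Assumed example: X uniform on [0, 1]
# and Y = X**2, so Y is a deterministic function of X.
random.seed(0)
n = 200_000
a, b = 2.0, 3.0
xs = [random.random() for _ in range(n)]
ys = [x * x for x in xs]

lhs = math.fsum(a * x + b * y for x, y in zip(xs, ys)) / n  # E(aX + bY)
rhs = a * math.fsum(xs) / n + b * math.fsum(ys) / n         # aE(X) + bE(Y)
print(abs(lhs - rhs) < 1e-9)  # → True: equal up to floating-point rounding
```

The two sample averages agree to rounding error even though X and Y are as dependent as possible, which is exactly what linearity of expectation predicts.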
 
  • #4
Pietair said:
I know that the probability distribution of the sum of two or more random variables is the convolution of their individual p.d.f.'s, but as far as I know this is only valid for independent random variables. ##E(aX+bY)=aE(X)+bE(Y)## is true in general, though, right?
Yes to both.
 
  • #5
Pietair said:
I know that the probability distribution of the sum of two or more random variables is the convolution of their individual p.d.f.'s, but as far as I know this is only valid for independent random variables. ##E(aX+bY)=aE(X)+bE(Y)## is true in general, though, right?

Depending on the level of rigor required, you may have to qualify things a bit. For example, if Y = -X, then E(X+Y) = EX + EY is false if EX = EY = ∞, because we would be trying to equate 0 to ∞ - ∞, which is not allowed.

So, the easiest way is to assume that both EX and EY are finite. Now you really have a 2-stage task:
(1) Prove the "theorem of the unconscious statistician", which says that if Z = g(X,Y), then
[tex] EZ \equiv \int z \: dF_Z(z) [/tex]
can be written as
[tex] \int g(x,y) \: d^2F_{XY}(x,y),[/tex]
which becomes
[tex] \sum_{x,y} g(x,y)\: P_{XY}(x,y)[/tex]
in the discrete case where there are no densities, and becomes
[tex] \int g(x,y) f_{XY}(x,y) \, dx \, dy [/tex]
in the continuous case where there are densities. Of course, the result is also true in a mixed continuous-discrete case where there are densities and point-mass probabilities, but then we need to write Stieltjes integrals, etc. (2) Then you need to do the much easier task of proving that
[tex] \int (ax + by) f_{XY}(x,y) \, dx \, dy
= a \int x f_X(x) \, dx + b \int y f_Y(y) \, dy = a EX + b EY .[/tex]
The last form holds in general, whether the random variables are continuous, discrete or mixed. As you say, it holds even when X and Y are dependent.
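The discrete form of the "unconscious statistician" identity above can be verified on a toy joint pmf. The following sketch (my own example, with a deliberately dependent joint distribution) computes E[g(X,Y)] both ways:

```python
from collections import defaultdict

# Toy check of the discrete "unconscious statistician" identity:
# E[g(X, Y)] = sum over (x, y) of g(x, y) * P(X=x, Y=y) agrees with E[Z]
# computed from the pmf of Z = g(X, Y). The joint pmf below is deliberately
# *dependent* (it is not a product of marginals).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
a, b = 2.0, 3.0
g = lambda x, y: a * x + b * y

# Direct sum over the joint pmf
lhs = sum(g(x, y) * p for (x, y), p in joint.items())

# Push the joint pmf forward to the pmf of Z = g(X, Y), then take E[Z]
pz = defaultdict(float)
for (x, y), p in joint.items():
    pz[g(x, y)] += p
rhs = sum(z * p for z, p in pz.items())

# Both equal a*E[X] + b*E[Y] = 2*0.5 + 3*0.5 = 2.5 here
print(round(lhs, 9), round(rhs, 9))  # → 2.5 2.5
```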
 
  • #6
Ray Vickson said:
The last form holds in general, whether the random variables are continuous, discrete or mixed. As you say, it holds even when X and Y are dependent.

Hmm, so there really is no elementary proof? (I guess it depends on what you call elementary, though.) This kind of makes me happy that I know measure theory; it makes the proof of this result so much simpler.
 

1. What is the definition of the expectation operator?

The expectation operator, denoted as E[ ], is a mathematical function that calculates the weighted average value of a random variable. It represents the mean or average outcome that is expected to occur from a given experiment or process.

2. What does it mean for the expectation operator to be linear?

A linear expectation operator means that it follows the principle of superposition, which states that the expected value of a sum of two random variables is equal to the sum of their individual expected values. In other words, E[aX + bY] = aE[X] + bE[Y], where a and b are constants and X and Y are random variables.

3. How is the expectation operator used in probability and statistics?

In probability and statistics, the expectation operator is used to calculate the expected value of a random variable, which is a key concept in understanding the behavior and outcomes of various experiments and processes. It is also used in the calculation of other important measures such as variance, covariance, and correlation.
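For instance, the variance can be written entirely in terms of the expectation operator, Var(X) = E[X²] − (E[X])². The following sketch estimates it from a sample; the Normal(0, σ=2) distribution is an assumed example:

```python
import random

# Hedged sketch: the variance expressed through the expectation operator,
# Var(X) = E[X**2] - (E[X])**2, estimated by sample averages.
# Assumed example: X ~ Normal(0, sigma=2), so Var(X) = 4.
random.seed(1)
xs = [random.gauss(0.0, 2.0) for _ in range(100_000)]

mean = sum(xs) / len(xs)                     # estimate of E[X]
mean_sq = sum(x * x for x in xs) / len(xs)   # estimate of E[X**2]
var = mean_sq - mean ** 2
print(abs(var - 4.0) < 0.3)  # → True: close to the true variance of 4
```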

4. Can the expectation operator be applied to any type of random variable?

Yes, the expectation operator can be applied to any type of random variable, including discrete and continuous variables. However, the method of calculation may differ depending on the type of variable and its probability distribution.

5. How is the linearity of the expectation operator useful in real-world applications?

The linearity of the expectation operator is useful in many real-world applications, particularly in finance and economics. It allows for the simplification of complex calculations and the prediction of outcomes based on the expected values of different variables. It is also used in decision-making processes, such as risk analysis and portfolio management.
