The expectation value of superimposed probability functions


Discussion Overview

The discussion revolves around calculating the expectation value when two probability functions are superimposed. Participants explore methods for determining the resulting expectation value, including convolution and the linearity of expectations. The context is statistical rather than strictly quantum mechanical.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant asks how to find the resulting expectation value when a second probability function is superimposed on an initial one, questioning whether to average the expectation values or use convolution.
  • Another participant seeks clarification on the term "superimposed," suggesting that the expectation values could be added or averaged based on the total integral of the functions.
  • A participant presents a mathematical formulation involving independent variables and convolution, expressing uncertainty about its correctness.
  • Another participant notes that the expectation of the sum of two random variables can be expressed as the sum of their individual expectations, emphasizing that this holds without the need for independence.
  • One participant suggests that using convolution is a valid approach but may be more complex than necessary, advocating for the use of linearity of expectations instead.
  • There is a discussion about whether to perform a weighted average of two expectation values or simply average them, with one participant proposing a specific formulation.
  • A later reply emphasizes the simplicity of defining the new variable Z and taking the expectation of both sides, reinforcing the need for clarity in the definitions used.
  • Concerns are raised about the notation used in a mathematical equation presented, with a participant pointing out issues related to the application of differential notation to random variables.
  • One participant acknowledges a notational correction regarding the equation presented earlier in the thread.

Areas of Agreement / Disagreement

Participants express differing views on the methods for calculating the expectation value, with no consensus reached on the best approach. Some participants advocate for linearity of expectations, while others explore convolution and averaging methods.

Contextual Notes

There are unresolved issues regarding the notation and definitions used in the mathematical formulations, as well as the clarity of the terms employed in the discussion.

redtree
I apologize for the simplicity of the question (NOT homework). This is a statistical question (not necessarily a quantum mechanical one).

If I have an initial probability function with an associated expected value, and a second probability function is then superimposed on the first, how do I find the resulting expectation value? Do I simply average the two expectation values (and if so, how do I weight them)? Do I take the convolution of the two probability functions to get the combined probability density function, and then use that to calculate the resulting expected value? Or something else?
 
What exactly do you mean by "superimposed"?
Added? Average them, with the weight coming from the total integral of the functions.
Convolved? Add the expectation values.
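The "added, then averaged" reading can be checked numerically. A minimal sketch (my own example, not from the thread): with two assumed normal densities and a mixture weight w, the expectation of the weighted mixture is the same weighted average of the two expectations.

```python
import math

# Two example densities (assumed: normals with means 1 and 2) mixed
# with weight w; the mixture's mean should be w*1 + (1-w)*2 = 1.7.
def normal_pdf(t, mu, sigma):
    return math.exp(-(t - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

dx = 0.01
grid = [i * dx for i in range(-2000, 2000)]  # covers [-20, 20)

w = 0.3
f_mix = [w * normal_pdf(t, 1.0, 1.0) + (1 - w) * normal_pdf(t, 2.0, 1.5) for t in grid]

# Riemann-sum approximation of E = integral of t * f_mix(t) dt
E_mix = sum(t * f for t, f in zip(grid, f_mix)) * dx
print(E_mix)  # close to 0.3*1 + 0.7*2 = 1.7
```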
 
This is what I mean, though I am not sure if it is correct:

Given, where ##\vec{X}## and ##\vec{Y}## are independent variables:

\begin{equation}
\begin{split}
\vec{Z}&=\vec{X}+\vec{Y}
\end{split}
\end{equation}

Where the probability density function of ##\vec{X}## is given by ##f_{\vec{X}}(\vec{X})## and the probability density function of ##\vec{Y}## is given by ##f_{\vec{Y}}(\vec{Y})##, such that:

\begin{equation}
\begin{split}
f_{\vec{Z}}(\vec{Z})&=\int_{-\infty}^{\infty} d\vec{X} \int_{-\infty}^{\vec{Y}} d\vec{Y} f(\vec{X},\vec{Y})
\end{split}
\end{equation}

Where:

\begin{equation}
\begin{split}
\int_{-\infty}^{\infty} d\vec{X} \int_{-\infty}^{\vec{Y}} d\vec{Y} f(\vec{X},\vec{Y})&=\int_{-\infty}^{\infty} d\vec{X} f_{\vec{Y}}(\vec{Y})f_{\vec{X}}(\vec{X})
\end{split}
\end{equation}

Given ##\vec{Y}=\vec{Z}-\vec{X}##:

\begin{equation}
\begin{split}
\int_{-\infty}^{\infty} d\vec{X} f_{\vec{Y}}(\vec{Y})f_{\vec{X}}(\vec{X})&=\int_{-\infty}^{\infty} d\vec{X} f_{\vec{Y}}(\vec{Z}-\vec{X})f_{\vec{X}}(\vec{X})
\end{split}
\end{equation}

Which is the convolution ##(f_{\vec{X}}*f_{\vec{Y}})##, such that:

\begin{equation}
\begin{split}
f_{\vec{Z}}(\vec{Z})&=(f_{\vec{X}}*f_{\vec{Y}})(\vec{Z})
\end{split}
\end{equation}
Given:

\begin{equation}
\begin{split}
E_{\vec{Z}}(\vec{Z})=\int_{-\infty}^{\infty} d\vec{Z} f_{\vec{Z}}(\vec{Z}) \vec{Z}
\end{split}
\end{equation}

Such that:

\begin{equation}
\begin{split}
E_{\vec{Z}}(\vec{Z})=\int_{-\infty}^{\infty} d\vec{Z} (f_{\vec{X}}*f_{\vec{Y}})(\vec{Z}) \vec{Z}
\end{split}
\end{equation}
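As a numerical sanity check of the convolution route (my own sketch, scalar case, with two assumed normal densities, none of which appear in the thread): convolving the discretized densities and then integrating ##z f_Z(z)## recovers ##E[X]+E[Y]##.

```python
import math

def normal_pdf(t, mu, sigma):
    return math.exp(-(t - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Discretize two assumed example densities on a common grid.
dx = 0.05
grid = [i * dx for i in range(-400, 400)]  # covers [-20, 20)
f_X = [normal_pdf(t, 1.0, 1.0) for t in grid]  # E[X] = 1
f_Y = [normal_pdf(t, 2.0, 1.5) for t in grid]  # E[Y] = 2

# Discrete convolution: f_Z(z) ~ sum over x of f_X(x) f_Y(z - x) dx,
# where index k = i + j corresponds to z = 2*grid[0] + k*dx.
n = len(grid)
f_Z = [0.0] * (2 * n - 1)
for i in range(n):
    for j in range(n):
        f_Z[i + j] += f_X[i] * f_Y[j] * dx

z0 = 2 * grid[0]
E_Z = sum((z0 + k * dx) * f_Z[k] for k in range(len(f_Z))) * dx
print(E_Z)  # close to E[X] + E[Y] = 3
```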
 
redtree said:
This is what I mean, though I am not sure if it is correct:

Given, where ##\vec{X}## and ##\vec{Y}## are independent variables:

\begin{equation}
\begin{split}
\vec{Z}&=\vec{X}+\vec{Y}
\end{split}
\end{equation}
Then E(Z)=E(X)+E(Y). This is a well-known result (and it doesn't even need independence). What is new?
 
redtree said:
Would I use the convolution of the two probability functions to calculate the combined probability density function and then use that to calculate the resulting expected value?

This is a technically valid approach, but it involves more work than needed.

redtree said:
Given, where ##\vec{X}## and ##\vec{Y}## are independent variables:

\begin{equation}
\begin{split}
\vec{Z}&=\vec{X}+\vec{Y}
\end{split}
\end{equation}

Note: this right here is a convolution. But rather than going into the weeds of the underlying calculations, you can make use of the linearity of expectations and find

##E[Z] = E[ X + Y] = E[X] + E[Y]##

redtree said:
This is what I mean, though I am not sure if it is correct:

Where the probability density function of ##\vec{X}## is given by ##f_{\vec{X}}(\vec{X})## and the probability density function of ##\vec{Y}## is given by ##f_{\vec{Y}}(\vec{Y})##, such that:

\begin{equation}
\begin{split}
f_{\vec{Z}}(\vec{Z})&=\int_{-\infty}^{\infty} d\vec{X} \int_{-\infty}^{\vec{Y}} d\vec{Y} f(\vec{X},\vec{Y})
\end{split}
\end{equation}

This isn't a convolution... I'm not really sure what it is. The below is the convolution of the two densities written out in integral form. But again, using linearity of expectations makes your life a lot easier.

##f_Z(z) = \int_{-\infty}^{\infty} f_X(x) f_Y(z-x) dx##

## \int_{-\infty}^{\infty} z f_Z(z)\, dz = E[Z] = E[ X + Y] = E[X] + E[Y]##
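The "doesn't even need independence" point is easy to check by simulation. A sketch with made-up distributions (my assumptions, not from the thread), where Y is deliberately constructed to depend on X:

```python
import random

random.seed(0)

# X ~ N(1, 1); Y is built from X, so they are NOT independent.
xs = [random.gauss(1.0, 1.0) for _ in range(200_000)]
ys = [x + random.gauss(2.0, 0.5) for x in xs]  # E[Y] = E[X] + 2 = 3
zs = [x + y for x, y in zip(xs, ys)]

def mean(v):
    return sum(v) / len(v)

# Linearity of expectation holds regardless of dependence:
print(mean(zs), mean(xs) + mean(ys))  # the two values agree
```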
 
Nothing new. I just want to make sure I'm understanding correctly.

In this context, am I correct in the following:

If I want to find the average of two expectation values, such that:

\begin{equation}
\begin{split}
E_{\vec{Z}}(\vec{Z})&=\text{Avg}\left(E_{\vec{X}}(\vec{X}),E_{\vec{Y}}(\vec{Y}) \right)
\end{split}
\end{equation}

Do I need to perform a weighted average, or can I just assume the following?

\begin{equation}
\begin{split}
E_{\vec{Z}}(\vec{Z})&=\frac{E_{\vec{X}}(\vec{X})+E_{\vec{Y}}(\vec{Y})}{2}
\end{split}
\end{equation}

Such that:

\begin{equation}
\begin{split}
\vec{Z}&=\frac{\vec{X}+\vec{Y}}{2}
\end{split}
\end{equation}
 
If your goal is to define ##Z## such that ##Z := \frac{1}{2} \big(X + Y\big) = \frac{1}{2} X + \frac{1}{2} Y##,

then take the expectation of both sides and see

##E[Z] =E[\frac{1}{2} X + \frac{1}{2} Y] =\frac{1}{2} E[X] + \frac{1}{2} E[Y]##

It really is that simple. But you need to be very clear on what your goal is, and define ##Z## accordingly. (The wording in this thread has not been so clear.)
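A quick simulated check of this definition (the example distributions here are my assumptions, not from the thread):

```python
import random

random.seed(1)

# Two assumed independent samples: X ~ N(4, 2), Y ~ N(6, 1).
xs = [random.gauss(4.0, 2.0) for _ in range(100_000)]
ys = [random.gauss(6.0, 1.0) for _ in range(100_000)]

# Z := (X + Y) / 2, so E[Z] = (E[X] + E[Y]) / 2 = 5.
zs = [(x + y) / 2 for x, y in zip(xs, ys)]

def mean(v):
    return sum(v) / len(v)

print(mean(zs), (mean(xs) + mean(ys)) / 2)  # both close to 5
```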
 
I want to make sure I understand correctly how to find the average of two expectation values. My sense was exactly what you just stated, but I wasn't sure.
 
By the way, you had a question about the equation:

\begin{equation}
\begin{split}
f_{\vec{Z}}(\vec{Z})&=\int_{-\infty}^{\infty} d\vec{X} \int_{-\infty}^{\vec{Y}} d\vec{Y} f(\vec{X},\vec{Y})
\end{split}
\end{equation}

I got that from the following source: http://statweb.stanford.edu/~susan/courses/s116/node114.html
 
Probably should have written it as follows:

\begin{equation}
\begin{split}
f_{\vec{Z}}(\vec{Z})&=\int_{-\infty}^{\infty} d\vec{Y} \int_{-\infty}^{\vec{Y}} d\vec{X} f(\vec{X},\vec{Y})
\end{split}
\end{equation}
 
There are still a lot of problems with that equation as you've written it. (Notice that line 1 of the Stanford link is basically how I wrote it.)

You have your ##d## being applied to a vector-valued random variable. Ignoring the vector notation: you are using capital letters, which denote the random variable itself, you've applied the ##d## to those random variables, and you are using a random variable as a limit of your inner integral. Lower-case letters denote specific values the random variable can take on, as in ##X(\omega) = x##, or in the more common shorthand, ##X = x##.

I.e. what you wrote is materially different from what's in that link, and from what I can tell it's not well defined.
 
Thanks for the notational correction.
 
