Solving for Expected Value: Independent Variables and Nonnegative Functions


Homework Help Overview

The discussion revolves around expected values of functions of independent random variables, specifically the relationship between E[g(X)], where g(x) = E[f(x,Y)], and the joint expectation E[f(X,Y)]. The original poster seeks to show that these two expectations are equal, and more generally that the corresponding integrals over the event {X in A} agree, with particular attention to extending the result to functions f that may take negative values.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants explore the definition of expected value and the implications of independence between random variables. There are questions about the interpretation of the function g(x) and whether it relates to marginal distributions. Some participants express confusion regarding the notation and the role of non-negativity in the context of probability distributions.

Discussion Status

The discussion is ongoing, with participants providing insights into the mathematical framework and questioning the assumptions made in the original post. Some guidance has been offered regarding the interpretation of expectations and the notation used, but there is no clear consensus on the extension to negative functions or the specific integral formulation requested by the original poster.

Contextual Notes

There is a noted ambiguity regarding the notation used for expectations and the differential term dP. Participants are also considering the implications of non-negativity in the context of probability measures and the definitions being used.

empyreandance
Hello everyone,

I have the following question. Suppose that X and Y are independent and f(x,y) is nonnegative. Put g(x) = E[f(x,Y)] and show E[g(X)] = E[f(X,Y)]. Show more generally that \int_{\{X \in A\}} g(X) \, dP = \int_{\{X \in A\}} f(X,Y) \, dP. Extend to f that may be negative. I've had no issues, except with the extension to negative f. Any suggestions?
 

Hello empyreandance and welcome to the forums.

I'm a little confused by your question. You mention that you have X and Y which are independent.

When you take an expectation, it has to be with respect to a particular variable, especially in situations where you have more than one variable.

In this situation you could take the expectation with respect to Y, integrating (or summing) out the Y terms to get a function of X, and then calculate the expectation of that to get an actual value, but you are not doing this.
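
To illustrate that two-step procedure, writing \rho_X and \rho_Y for the densities of X and Y (assuming, purely for illustration, that densities exist, which the problem doesn't actually state):

g(x) = \int f(x,y) \, \rho_Y(y) \, dy, \qquad \mathbb{E}[g(X)] = \int g(x) \, \rho_X(x) \, dx.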

Also your differential term is dP. Does dP refer to dYdX?

Also, with regard to f(X,Y) being non-negative: this must be the case for any valid probability distribution (since probabilities are always between 0 and 1, whether for a discrete value or for an interval of a continuous distribution).

So based on the above comments, your question does not make sense.
 
Hi chiro,

Thanks for the reply.

I am working with the following definition:

E[X] = \int X \, dP = \int X(\omega) \, P(d\omega), where P is the probability measure on the underlying space.

f(X,Y) is not the density function; it is simply a function applied to the random variables, giving a composition of functions when one regards a random variable as a real-valued function on the sample space.
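
Spelling the quantities in the problem out in that notation (this is just the definitions restated, nothing beyond them):

g(x) = E[f(x,Y)] = \int f(x, Y(\omega)) \, P(d\omega), \qquad E[f(X,Y)] = \int f(X(\omega), Y(\omega)) \, P(d\omega),

and the more general claim is that, for A a Borel set of reals,

\int_{\{X \in A\}} g(X) \, dP = \int_{\{X \in A\}} f(X,Y) \, dP.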

Does this clear up the confusion?
 
When you write g(x)=E[f(x,Y)], do you mean g(x)=E[f(y|x)], where f(y|x) is the marginal distribution of y as a function of x?
 
skwey said:
When you write g(x)=E[f(x,Y)], do you mean g(x)=E[f(y|x)], where f(y|x) is the marginal distribution of y as a function of x?

As the OP stated in the post above yours, f(x,y) is an arbitrary function of the random variables. It is not a density function whatsoever.

With regard to the question, I'll admit I am not really familiar with the measure-theoretic notation. In the usual notation, I would say that the random variable Y has a probability density \rho_Y(y) and the random variable X has density \rho_X(x). Then,

\mathbb{E}_Y[f(x,y)] = \int_{-\infty}^\infty dy~f(x,y)\rho_Y(y) \equiv g(x),

and

\mathbb{E}_X[g(x)] = \int_{-\infty}^\infty dx~\rho_X(x) g(x) = \int_{-\infty}^\infty dx\int_{-\infty}^\infty dy~\rho_X(x)\rho_Y(y)f(x,y) = \mathbb{E}_{XY}[f(x,y)]
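
(Just to flag where the independence assumption enters: writing \rho_{XY} for the joint density of X and Y, independence is exactly the statement that

\rho_{XY}(x,y) = \rho_X(x)\,\rho_Y(y),

which is why the double integral against \rho_X\rho_Y in the line above really is the joint expectation \mathbb{E}_{XY}[f(x,y)].)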

So, this is what I would have done to show E[g(x)] = E[f(x,y)], where I added subscripts to make it clear what variables were being averaged over. For the statement,

"Show more generally that Integral over X in A of g(X) dP = Integral over X in A of f(X,Y) dP"

I am afraid I don't know what this is trying to ask. It seems to me like it is the same as the first part.

Lastly, with regard to f(x,y) having negative values, I never really used the fact that f(x,y) was non-negative above. I don't know how different a measure-theoretic argument would be, so I can't guess at where the non-negativity of f(x,y) came into play.
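
For what it's worth, the standard measure-theoretic route for that extension (a sketch only, not necessarily the intended argument) is to split f into its positive and negative parts,

f = f^+ - f^-, \qquad f^+ = \max(f, 0), \quad f^- = \max(-f, 0),

both of which are nonnegative, apply the result already proved to each of them, and then subtract:

\mathbb{E}[g(X)] = \mathbb{E}[g^+(X)] - \mathbb{E}[g^-(X)] = \mathbb{E}[f^+(X,Y)] - \mathbb{E}[f^-(X,Y)] = \mathbb{E}[f(X,Y)],

where g^{\pm}(x) = \mathbb{E}[f^{\pm}(x,Y)], so that g = g^+ - g^- wherever the difference makes sense. The subtraction is legitimate provided \mathbb{E}|f(X,Y)| < \infty (or at least one of the two pieces is finite), and that is where the nonnegativity assumption earned its keep: for nonnegative f every expectation involved automatically exists, possibly as +\infty, with no integrability hypothesis needed. The same decomposition handles the versions of the identity restricted to \{X \in A\}.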
 
