How Does Changing Variables Affect the Expected Value in Probability Theory?

SUMMARY

This discussion focuses on the application of the Law of the Unconscious Statistician in probability theory, specifically how changing variables affects expected values. Key formulas for expectation are provided from Hoel's "An Introduction to Mathematical Statistics," including $E[X] = \int_a^b x f(x)\, dx$ and $E[g(X)] = \int_a^b g(x) f(x)\, dx$. The conversation explores the relationship between the probability density functions of random variables and their transformations, emphasizing the importance of integration techniques. Sheldon Ross's "A First Course in Probability" is referenced for a straightforward proof of expected value calculations.

PREREQUISITES
  • Understanding of probability density functions (pdf) and cumulative distribution functions (cdf).
  • Familiarity with integration techniques, particularly integration by substitution.
  • Knowledge of the Law of the Unconscious Statistician.
  • Basic concepts of random variables and their transformations.
NEXT STEPS
  • Study the Law of the Unconscious Statistician in detail.
  • Learn about integration techniques in calculus, focusing on substitution methods.
  • Explore the proofs and applications of expected values in Sheldon Ross's "A First Course in Probability."
  • Investigate the implications of changing variables in probability distributions.
USEFUL FOR

Students and professionals in statistics, mathematicians, and anyone interested in deepening their understanding of probability theory and expected value calculations.

Rasalhague
Hoel: An Introduction to Mathematical Statistics introduces the following formulas for expectation, where the density is zero outside of the interval [a,b].

$$E\left[ X \right] = \int_{a}^{b} x f(x) \; dx$$

$$E\left[ g(X) \right] = \int_{a}^{b} g(x) f(x) \; dx$$

He says, "Let the random variable $g(X)$ be denoted by $Y$. Then, knowing the density $f(x)$ of $X$, it is theoretically possible to find the density $h(y)$ of $Y$. The expected value of $g(X)$ is the same as the expected value of $Y$; therefore if $h(y)$ is available, the latter expected value can be expressed in the form

$$E\left[ Y \right] = \int_{-\infty}^{\infty} y h(y) \; dy.$$

"By using the change of variable techniques of calculus, it can be shown that this value is the same as the value given by (22) [the 2nd formula I've quoted in this post]."
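Hoel's claim that the two integrals agree can be checked numerically. The following sketch (my own illustration, not from the thread) takes $X \sim \mathrm{Uniform}(0,1)$ and $g(x) = x^2$, so that $Y = g(X)$ has density $h(y) = 1/(2\sqrt{y})$ on $(0,1)$ by the usual cdf method, and compares $\int g(x) f(x)\,dx$ with $\int y\, h(y)\,dy$:

```python
# Numerical check that E[g(X)] = ∫ g(x) f(x) dx equals E[Y] = ∫ y h(y) dy,
# for X ~ Uniform(0, 1) and g(x) = x^2 (example choices for illustration).
# Then Y = X^2 has density h(y) = 1 / (2*sqrt(y)) on (0, 1).
import math

def integrate(fn, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of fn over [a, b]."""
    h = (b - a) / n
    return sum(fn(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 1.0                                   # density of X on [0, 1]
g = lambda x: x * x
h_density = lambda y: 1.0 / (2.0 * math.sqrt(y))    # density of Y = g(X)

e_g_of_x = integrate(lambda x: g(x) * f(x), 0.0, 1.0)     # LOTUS side
e_y      = integrate(lambda y: y * h_density(y), 0.0, 1.0)  # direct side

print(e_g_of_x, e_y)   # both ≈ 1/3
```

Both integrals come out near $1/3$, as the closed-form calculation predicts.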


I've been trying to do this. Let $I$ denote the identity function on $\mathbb{R}$. Let $f_X$ denote the pdf of the distribution induced by a random variable $X$, and $F_X$ its cdf. I'm guessing that when the expected value of a distribution is expressed like this in terms of a random variable, $E[X]$ is to be understood as $E[P_X]$, and $E[g(X)]$ as $E[P_{g \circ X}]$, where $P_X$ means the distribution induced by the random variable $X$, given some sample space implicit in the context.

Then expectation is defined by

$$E[P_X] = \int_a^b I \cdot f_X,$$

and we must show that

$$\int_a^b I \cdot f_{g \circ X} = \int_a^b g \cdot f_X,$$

or do the limits need to be changed? Using the chain rule (integration by substitution) and the identity

$$F_{g \circ X} = F_X \circ g,$$

leads me to

$$\int_a^b I \cdot f_{g \circ X} = \int_{g(a)}^{g(b)} I \cdot f_X$$

which looks tantalisingly close, but am I going in the right direction?
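One sanity check worth running on the cdf identity above: for a strictly increasing $g$, the standard change-of-variable relation is $F_{g \circ X}(y) = F_X(g^{-1}(y))$, with $g^{-1}$ rather than $g$. This sketch (my own, with the hypothetical choices $X \sim \mathrm{Uniform}(0,1)$ and $g(x) = x^2$) compares an empirical cdf of $Y = g(X)$ against $F_X \circ g^{-1}$ at a test point:

```python
# Sanity check (assumes g strictly increasing on the support):
# the change-of-variable identity is F_Y(y) = F_X(g^{-1}(y)).
# Example choices: X ~ Uniform(0, 1), g(x) = x^2 on [0, 1].
import math
import random

F_X = lambda x: min(max(x, 0.0), 1.0)   # cdf of Uniform(0, 1)
g = lambda x: x * x
g_inv = math.sqrt                       # inverse of g on [0, 1]

random.seed(0)
y0 = 0.3                                # test point
n = 200_000
# empirical P(g(X) <= y0) from simulated draws of X
emp = sum(g(random.random()) <= y0 for _ in range(n)) / n

print(emp, F_X(g_inv(y0)))   # both ≈ sqrt(0.3) ≈ 0.5477
```

The empirical cdf matches $F_X(g^{-1}(y_0)) = \sqrt{0.3}$, which suggests the identity to use in the substitution involves $g^{-1}$, not $g$.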
 
This is sometimes called The Law of the Unconscious Statistician, so you might try looking for sources. I'm not sure how to use your approach, so I'll give a slightly different one. In Sheldon Ross's A First Course in Probability, he shows this by first proving the lemma
$$\mathbf{E}[Y] = \int_0^\infty \mathbf{P}\{Y > y\} \, dy - \int_0^\infty \mathbf{P}\{Y < -y\} \, dy$$
for any random variable $Y$. (This is a pretty straightforward proof: just switch the order of integration using the pdf for $Y$.) After that, he sets $Y = g(X)$, and by switching the order of integration once more, the result falls out. I can go into more detail if you'd like, but I hope this helps!
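Ross's lemma is easy to verify numerically for a concrete distribution. In this sketch (my own example, not from Ross) I take $Y \sim \mathrm{Uniform}(-1, 2)$, so that $E[Y] = 1/2$, and evaluate both tail integrals directly:

```python
# Numerical check of the tail-integral lemma
#   E[Y] = ∫₀^∞ P(Y > y) dy − ∫₀^∞ P(Y < −y) dy
# for Y ~ Uniform(−1, 2), where E[Y] = 1/2 (example choice).
def integrate(fn, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of fn over [a, b]."""
    h = (b - a) / n
    return sum(fn(a + (i + 0.5) * h) for i in range(n)) * h

def p_greater(y):
    """P(Y > y) for Y ~ Uniform(-1, 2); zero once y >= 2."""
    return min(max((2.0 - y) / 3.0, 0.0), 1.0)

def p_less_neg(y):
    """P(Y < -y) for Y ~ Uniform(-1, 2); zero once y >= 1."""
    return min(max((1.0 - y) / 3.0, 0.0), 1.0)

# Both tails vanish beyond finite bounds, so the improper
# integrals reduce to [0, 2] and [0, 1] respectively.
e_y = integrate(p_greater, 0.0, 2.0) - integrate(p_less_neg, 0.0, 1.0)
print(e_y)   # ≈ 0.5
```

The two tail integrals evaluate to $2/3$ and $1/6$, and their difference recovers $E[Y] = 1/2$.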
 
