Expectation of the ratio of two independent random variables?

SUMMARY

The expectation of the ratio of two independent random variables, E[x/y], does not equal the ratio of their expectations, E[x]/E[y]. Instead, independence gives E[x/y] = E[x] * E[1/y], and in general E[1/y] does not equal 1/E[y], as uniform random variables demonstrate. Specifically, if y is uniform between 1 and 2, then E[1/y] = ln(2) ≈ 0.693, whereas 1/E[y] = 1/1.5 ≈ 0.667. By Jensen's inequality, E[1/y] is always strictly greater than 1/E[y] unless y has no randomness, because t ↦ 1/t is strictly convex on (0, ∞).

PREREQUISITES
  • Understanding of expectation operators in probability theory
  • Knowledge of Jensen's inequality and its implications
  • Familiarity with properties of independent random variables
  • Basic calculus for evaluating integrals
NEXT STEPS
  • Study the properties of expectation operators in probability theory
  • Explore Jensen's inequality and its applications in statistics
  • Learn about the behavior of uniform random variables and their expectations
  • Investigate the implications of convex functions in probability distributions
USEFUL FOR

Mathematicians, statisticians, and data scientists who are involved in probability theory and wish to deepen their understanding of expectation relationships among random variables.

nikozm
Hi,

I was wondering if the following is valid:

E[x/y] = E[x] / E[y], given that x and y are non-negative, independent random variables and E[.] denotes the expectation operator.

Thanks
 
No, this is not true. By independence, E[x/y] = E[x]*E[1/y], but in general E[1/y] ≠ 1/E[y]. For example, if y is a uniform random variable taking values between 0 and 1,
[tex]E[1/y] = \int_{0}^{1} \frac{1}{y} dy = \infty.[/tex]
Even if you bound y away from zero to avoid division problems, the identity still fails: if y is a uniform random variable between 1 and 2,
[tex]E[1/y] = \int_{1}^{2} \frac{1}{y} dy = \ln(2) \approx 0.693 \neq \frac{1}{E[y]} = \frac{1}{1.5} \approx 0.667.[/tex]
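The uniform example is easy to verify numerically. The sketch below (a Monte Carlo check, not part of the original thread) estimates E[1/y] for y uniform on (1, 2) and compares it with 1/E[y]:

```python
import math
import random

random.seed(0)
N = 1_000_000

# Draw y ~ Uniform(1, 2) and estimate E[1/y] by averaging 1/y over the samples
samples = [random.uniform(1.0, 2.0) for _ in range(N)]
e_inv_y = sum(1.0 / y for y in samples) / N

print(e_inv_y)       # Monte Carlo estimate, close to ln(2)
print(math.log(2))   # exact value of E[1/y] ≈ 0.693
print(1 / 1.5)       # 1/E[y] ≈ 0.667 — strictly smaller
```

With a million samples the estimate lands within a few thousandths of ln(2), visibly above 1/E[y] = 2/3.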
 
More can be said. Given [itex]y[/itex] is [itex]>0[/itex]-valued with finite expectation, you will always have [itex]\mathbb E\left[\dfrac1y\right] > \dfrac1{\mathbb E[y]}[/itex], except in the extreme case that [itex]y[/itex] exhibits no randomness. This follows from Jensen's inequality, since the function [itex]t\mapsto \dfrac1t[/itex] is strictly convex on [itex](0,\infty)[/itex].
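Both facts, the factorization E[x/y] = E[x]*E[1/y] under independence and the strict Jensen gap, can be checked together by simulation. This sketch (my addition, with x ~ Uniform(0, 1) and y ~ Uniform(1, 2) chosen purely for illustration) compares the three quantities:

```python
import random

random.seed(1)
N = 1_000_000

# Independent positive random variables
xs = [random.uniform(0.0, 1.0) for _ in range(N)]  # E[x] = 0.5
ys = [random.uniform(1.0, 2.0) for _ in range(N)]  # E[y] = 1.5

e_ratio = sum(x / y for x, y in zip(xs, ys)) / N   # E[x/y]
e_x = sum(xs) / N
e_y = sum(ys) / N
e_inv_y = sum(1.0 / y for y in ys) / N             # E[1/y]

print(e_ratio)        # ≈ E[x] * E[1/y] = 0.5 * ln 2 ≈ 0.347
print(e_x * e_inv_y)  # agrees with e_ratio, by independence
print(e_x / e_y)      # ≈ 0.5 / 1.5 ≈ 0.333 — strictly smaller
```

The gap between E[x/y] and E[x]/E[y] is exactly E[x] * (E[1/y] - 1/E[y]), which Jensen's inequality makes strictly positive here.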
 
