Expected value of a function of a random variable

The discussion revolves around deriving the inequality P(X≥x) ≤ Eg(X)/g(x) for a random variable X, where g(x) is positive and strictly increasing. Participants suggest using Markov's Inequality and indicator random variables to simplify the proof, emphasizing that the problem can be approached without distinguishing between discrete and continuous cases. One participant proposes defining a new random variable Y = g(X) to apply Markov's Inequality effectively. The conversation highlights the utility of concrete functions, like the exponential function, to clarify the proof process. The consensus is that the proof can be streamlined using basic properties of expectation applicable to all types of random variables.
AllRelative

Homework Statement


Let X be a random variable. It is not specified whether it is continuous or discrete. Let g(x) be always positive and strictly increasing. Deduce this inequality:
$$P(X\geqslant x) \leqslant \frac{Eg(X)}{g(x)} \: $$
where x is real.

Homework Equations


I know that if X is discrete,
$$E\,g(X) = \sum_{i=1}^{\infty} g(x_i)\,p(x_i),$$
where ##p## is the probability mass function of ##X##.

And if X is continuous,
$$E\,g(X) = \int_{-\infty}^{\infty} g(x)f(x)\, dx,$$
where ##f## is the density of ##X##.

The Attempt at a Solution


Is there a way to answer the question without proving the two cases (continuous and discrete) separately? Thanks!
 
AllRelative said:

The Attempt at a Solution


Is there a way to answer the question without proving the two cases (continuous and discrete) separately? Thanks!

Yes -- use indicator random variables and recognize that your problem is actually asking for a standard proof of Markov's Inequality which takes just one or two lines.

Note: if you want something more concrete, let ##g## be the exponential function.
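
For reference, a sketch of that standard indicator argument, for a nonnegative random variable ##Y## and any ##t > 0##: since ##t\,1\{Y \geq t\} \leq Y## pointwise, taking expectations gives
$$t\,P(Y \geq t) = E\big(t\,1\{Y \geq t\}\big) \leq E(Y), \qquad\text{i.e.}\qquad P(Y \geq t) \leq \frac{E(Y)}{t}.$$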
 
StoneTemplePython said:
Yes -- use indicator random variables and recognize that your problem is actually asking for a standard proof of Markov's Inequality which takes just one or two lines.

Note: if you want something more concrete, let ##g## be the exponential function.
I just read up on Markov's inequality. I see that it is the same problem except for the function. I'm just unsure about what to do with the function g.

$$g(x) P(X)\geq E(g(I_{X\geq x}))$$

How does an indicator function behave inside a function? That's what confuses me. If g wasn't there I could finish the proof...
 
AllRelative said:
I just read up on Markov's inequality. I see that it is the same problem except for the function. I'm just unsure about what to do with the function g.

$$g(x) P(X)\geq E(g(I_{X\geq x}))$$

How does an indicator function behave inside a function? That's what confuses me. If g wasn't there I could finish the proof...

Some of the things look backwards here?
- - - -
Suggestion: break it into two parts.

(part 1) Let the random variable ##Y := g(X)##. Now prove Markov's inequality for ##Y##.

(part 2) After you have done the above, reason through how you can relate ##Y## and ##X##. I.e., if I say an experiment occurs in the sample space and I know the outcome satisfies ##Y(\omega) \geq g(c)##, that immediately tells you something of interest about ##X(\omega)##. Again, de-abstracting this and letting ##g## be the exponential function may be useful for now...
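
Spelled out, a sketch of how the two parts combine (using only that ##g## is positive and strictly increasing): Markov's inequality for ##Y = g(X)## at level ##g(x) > 0## gives ##P\big(g(X) \geq g(x)\big) \leq E\,g(X)/g(x)##, and strict monotonicity means the events coincide,
$$\{X \geq x\} = \{g(X) \geq g(x)\}, \qquad\text{so}\qquad P(X \geq x) = P\big(g(X) \geq g(x)\big) \leq \frac{E\,g(X)}{g(x)}.$$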
 
Oh right... Thanks man!
 
AllRelative said:

Homework Statement


Let X be a random variable. It is not specified whether it is continuous or discrete. Let g(x) be always positive and strictly increasing. Deduce this inequality:
$$P(X\geqslant x) \leqslant \frac{Eg(X)}{g(x)} \: $$
where x is real.

Homework Equations


I know that if X is discrete,
$$E\,g(X) = \sum_{i=1}^{\infty} g(x_i)\,p(x_i),$$
where ##p## is the probability mass function of ##X##.

And if X is continuous,
$$E\,g(X) = \int_{-\infty}^{\infty} g(x)f(x)\, dx,$$
where ##f## is the density of ##X##.

The Attempt at a Solution


Is there a way to answer the question without proving the two cases (continuous and discrete) separately? Thanks!

Now that you have done the question I can show you my favorite quick way of doing it. For any real ##a## let ##v_a(x) = g(x)/g(a)## and let
$$u_a(x) = 1\{ x \geq a \} = \begin{cases} 0 & \text{if} \; x < a \\
1 & \text{if} \; x \geq a
\end{cases}$$
For all real ##x## we have ##0 \leq u_a(x) \leq v_a(x)## (since ##g## is positive and increasing), with ##u_a(a) = v_a(a) = 1.## Thus
$$P(X \geq a) = E u_a(X) \leq E v_a(X) = E g(X)/g(a).$$

This just uses elementary properties of expectation, and works the same way whether ##X## is discrete, continuous, or mixed discrete-continuous.
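
A quick numerical sanity check of the bound is also easy to run. The sketch below assumes a standard normal ##X## and ##g(x) = e^x## purely for illustration (any positive, strictly increasing ##g## and any distribution for ##X## would do):

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo check of  P(X >= a) <= E g(X) / g(a)
# with X standard normal and g = exp; both choices are just for the demo.
X = rng.standard_normal(1_000_000)
g = np.exp
a = 1.0

lhs = np.mean(X >= a)          # estimate of P(X >= a)
rhs = np.mean(g(X)) / g(a)     # estimate of E g(X) / g(a)

print(f"P(X >= {a})   ~ {lhs:.4f}")   # roughly 0.159
print(f"E g(X)/g({a}) ~ {rhs:.4f}")   # roughly 0.607
assert lhs <= rhs                     # the bound holds with plenty of room
```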

Here is a drawing that shows the situation.
 

