Statistics: E(X) = Integral(0 to infinity) of (1-F(x))dx


by kingwinner
kingwinner
#1
Jan9-09, 03:59 PM
"If X is non-negative, then E(X) = Integral(0 to infinity) of (1-F(x))dx, where F(x) is the cumulative distribution function of X."

============================

First of all, does X have to be a continuous random variable here? Or does the above result hold whether X is continuous or discrete?

Secondly, the source that states this result gives no proof of it. I searched the internet but was unable to find a proof of it. I know that by definition, since X is non-negative, we have E(X) = Integral(0 to infinity) of x f(x)dx where f(x) is the density function of X. What's next?
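For what it's worth, the formula does check out in at least one concrete case. If X ~ Exponential(λ), then F(x) = 1 - e^(-λx), and

[tex]\int_0^\infty (1 - F(x))\,dx = \int_0^\infty e^{-\lambda x}\,dx = \frac{1}{\lambda} = E(X)[/tex]

so the result seems plausible, but I don't see how to prove it in general.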

Thanks for any help!
Hurkyl
#2
Jan9-09, 04:12 PM
Quote by kingwinner:
"If X is non-negative, then E(X) = Integral(0 to infinity) of (1-F(x))dx, where F(x) is the cumulative distribution function of X."
...
E(X) = Integral(0 to infinity) of x f(x)dx where f(x) is the density function of X. What's next?
Well, the thing you know has an x in it, but the thing you're trying to get to doesn't... and the thing you're trying to get to has an F in it, but the thing you know has the derivative of F in it....
NoMoreExams
#3
Jan9-09, 05:07 PM
This works for discrete, continuous, and mixed random variables, though the density statement E(X) = Integral(0 to infinity) of x f(x)dx applies only in the continuous case. For the continuous case, use integration by parts.

kingwinner
#4
Jan9-09, 05:46 PM


But for a discrete random variable X, would it still make sense to talk about "integration" (i.e. Integral(0 to infinity) of (1-F(x))dx)? Or should the integral be replaced by a (sigma) sum?

Do you mean using integration by parts for the expression of the definition of E(X)? What should I let u and dv be?

Thanks!
NoMoreExams
#5
Jan9-09, 06:18 PM
No, in the discrete case you would use sums instead of integrals, since the expectation is then defined as a sum rather than an integral. For the continuous case: let u = 1 - F(x), so du = -f(x) dx, and let dv = dx, so v = x. Writing S(x) = 1 - F(x) for the survival function, you get x*S(x) (evaluated between your limits) + integral(x*f(x) dx). The first term vanishes because x*S(x) -> 0 as x -> infinity and x*S(x) = 0 at x = 0. That leaves exactly the usual definition of E(X).

Note: this is a very handwavy proof; you would really want to be rigorous about the limit that makes the first term vanish.
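
If you want to convince yourself numerically before worrying about rigor, here is a quick sanity check (a sketch assuming numpy and scipy are available; the exponential is just an example distribution):

[code]
import numpy as np
from scipy import integrate, stats

# Example: X ~ Exponential with mean 2, so E(X) should come out to 2.
dist = stats.expon(scale=2)

# Integrate the survival function S(x) = 1 - F(x) from 0 to infinity.
tail_integral, _ = integrate.quad(dist.sf, 0, np.inf)

print(tail_integral)  # ~2.0
print(dist.mean())    # 2.0
[/code]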
Hurkyl
#6
Jan9-09, 06:31 PM
Quote by kingwinner:
But for a discrete random variable X, would it still make sense to talk about "integration" (i.e. Integral(0 to infinity) of (1-F(x))dx)? Or should the integral be replaced by a (sigma) sum?
It does when you learn measure theory. Until then, just replace it with a sum without thinking about it.
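
If you want it explicitly: for a nonnegative integer-valued X, the sum version is

[tex]E(X) = \sum_{k=0}^{\infty} P(X > k) = \sum_{k=0}^{\infty} \left(1 - F(k)\right)[/tex]

which is the same identity with the integral replaced by a sum.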
Hurkyl
#7
Jan9-09, 06:33 PM
Quote by kingwinner:
What should I let u and dv be?
Did you think about that question at all? I practically told you what u and dv should be in post #2....
kingwinner
#8
Jan17-09, 12:41 AM
Quote by NoMoreExams:
No, in the discrete case you would use sums instead of integrals, since the expectation is then defined as a sum rather than an integral. For the continuous case: let u = 1 - F(x), so du = -f(x) dx, and let dv = dx, so v = x. Writing S(x) = 1 - F(x) for the survival function, you get x*S(x) (evaluated between your limits) + integral(x*f(x) dx). The first term vanishes because x*S(x) -> 0 as x -> infinity and x*S(x) = 0 at x = 0. That leaves exactly the usual definition of E(X).

Note: this is a very handwavy proof; you would really want to be rigorous about the limit that makes the first term vanish.
Just one point I am having trouble with: the claim that the first term vanishes at infinity, i.e. that

[tex]\lim_{x\to\infty} x\,(1 - F(x)) = 0[/tex]

This has the indeterminate form "infinity times 0", which suggests L'Hopital's Rule, but I tried many different ways and was still unable to figure out what the limit is. How can we prove that it equals 0?

Thanks!
HallsofIvy
#9
Jan17-09, 05:45 AM
Hurkyl, you don't "need" measure theory to write a sum as an integral. The Riemann-Stieltjes integral will do that.
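
In that notation the expectation is

[tex]E(X) = \int_0^\infty x\,dF(x)[/tex]

which covers the discrete, continuous, and mixed cases in one formula.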
Adeimantus
#10
Jan19-09, 10:48 AM
Another way to do it is to write the expected value as

[tex]E[X]=\int_{0}^{\infty}sf(s)ds = \int_{s=0}^{\infty}\int_{x=0}^{s}f(s)dxds[/tex]

and then change the order of the integrals to get your formula. To see what the new bounds on the integrals would be, draw a picture of the region of integration. You can use this same approach to find that

[tex]E[X^2] = \int_{0}^{\infty}s^2f(s)ds = \int_{s=0}^{\infty}\int_{x=0}^{s}2xf(s)dxds=
\int_{0}^{\infty}2x(1-F(x))dx
[/tex]

which is also valid for X nonnegative.
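
As a quick numerical check of the second formula (a sketch assuming scipy; the exponential is again just a convenient example, for which E[X^2] = 2):

[code]
import numpy as np
from scipy import integrate, stats

# X ~ Exponential(1): the second moment E[X^2] equals 2.
dist = stats.expon()

second_moment, _ = integrate.quad(lambda x: 2 * x * dist.sf(x), 0, np.inf)

print(second_moment)   # ~2.0
print(dist.moment(2))  # 2.0
[/code]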
Preno
#11
Jan20-09, 09:15 AM
Quote by NoMoreExams:
No, in the discrete case you would use sums instead of integrals, since the expectation is then defined as a sum rather than an integral.
Or, equivalently, write the probability distribution as a sum of delta functions.
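
For instance, if X takes the values [tex]x_k \ge 0[/tex] with probabilities [tex]p_k[/tex], then formally

[tex]f(x) = \sum_k p_k\,\delta(x - x_k), \qquad \int_0^\infty x f(x)\,dx = \sum_k p_k x_k = E(X)[/tex]

and the continuous argument goes through.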
lorta230
#12
Apr29-09, 09:45 PM
Integration by parts:

[tex]m_X=\int^{\infty}_{0} x f_X(x) dx = -\int^{\infty}_{0} x (-f_X(x) dx)[/tex] eq(1)

Let [tex] u=x [/tex] and [tex]dv = -f_X(x) dx [/tex]

Thus [tex] du=dx [/tex] and [tex] v = 1-F_X(x) [/tex]

Check that [tex] dv/dx = d/dx (1-F_X(x)) = -f_X(x) [/tex], as required.

Then substitute in (1)

[tex]m_X=-[uv|^{\infty}_{0}-\int^{\infty}_{0}vdu] [/tex]

[tex]m_X=-[x[1-F_X(x)]|^{\infty}_{0}]+\int^{\infty}_{0}[1-F_X(x)]dx [/tex]

The first term is zero at x = 0. As [tex] x\rightarrow\infty [/tex], provided E(X) is finite, [tex] 1-F_X(x) [/tex] tends to zero faster than [tex] x [/tex] grows, and thus [tex] x[1-F_X(x)]\rightarrow0 [/tex].
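
(One way to make this step rigorous, assuming E(X) is finite: for any x > 0,

[tex]x\,[1-F_X(x)] = x\int_x^\infty f_X(t)\,dt \le \int_x^\infty t\,f_X(t)\,dt \rightarrow 0[/tex]

since the tail of the convergent integral defining E(X) must go to zero.)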

Therefore

[tex]m_X=\int^{\infty}_{0}[1-F_X(x)]dx [/tex]

QED

Enjoy!
jason1995
#13
Apr29-09, 11:25 PM
[tex]
\begin{align*}
E[X] &= E\bigg[\int_0^X 1\,dx\bigg]\\
&= E\bigg[\int_0^\infty 1_{\{X>x\}}\,dx\bigg]\\
&= \int_0^\infty E[1_{\{X>x\}}]\,dx\\
&= \int_0^\infty P(X > x)\,dx\\
&= \int_0^\infty (1 - F(x))\,dx
\end{align*}
[/tex]

By the way, this formula is true no matter what kind of random variable X is, and we do not need anything more than freshman calculus to understand the integral on the right-hand side. (We need neither measure theory nor Stieltjes integrals.) Even when X is discrete, the function 1 - F(x) is still at least piecewise continuous, so the integral makes perfectly good sense, even when understood as a good old-fashioned Riemann integral.
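
To see this concretely for a discrete X (a minimal sketch; a fair six-sided die stands in for any nonnegative integer-valued random variable):

[code]
# For a fair die, 1 - F(x) is a step function dropping at x = 1, ..., 6,
# so the integral from 0 to infinity reduces to sum_{k=0}^{5} P(X > k).
p_greater = [sum(1 for face in range(1, 7) if face > k) / 6.0 for k in range(6)]

print(sum(p_greater))                           # 3.5, via the tail formula
print(sum(face / 6.0 for face in range(1, 7)))  # 3.5, the usual E(X)
[/code]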

