Statistics: E(X) = Integral(0 to infinity) of (1 - F(x))dx
by kingwinner

#1
Jan 9, 2009, 03:59 PM

P: 1,270

"If X is nonnegative, then E(X) = Integral(0 to infinity) of (1 - F(x))dx, where F(x) is the cumulative distribution function of X."

First of all, does X have to be a continuous random variable here, or does the above result hold for both continuous and discrete X?

Secondly, the source that states this result gives no proof, and I searched the internet but was unable to find one. I know that by definition, since X is nonnegative, E(X) = Integral(0 to infinity) of x f(x)dx, where f(x) is the density function of X. What's next? Thanks for any help!



#2
Jan 9, 2009, 04:12 PM

Emeritus
Sci Advisor
PF Gold
P: 16,101





#3
Jan 9, 2009, 05:07 PM

P: 626

This works for discrete, continuous, and mixed distributions, though the derivative statement f(x) = F'(x) applies in the continuous case only. Use integration by parts for the continuous case.
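To see the mixed case concretely, here is a sketch using only Python's standard library; the distribution below is an illustrative choice of my own, not from the thread. Take X = 0 with probability 1/2 and X ~ Exponential(1) otherwise, so E(X) = 1/2 even though F jumps at 0.

```python
# Illustrative mixed distribution (hypothetical example):
# X = 0 with probability 1/2, otherwise X ~ Exponential(1).
# Then E(X) = 1/2, and 1 - F(x) = 0.5 * exp(-x) for x >= 0
# (F has a jump of size 1/2 at x = 0, but the integral is unaffected).
import math

def survival(x):
    """1 - F(x) for the mixed distribution above."""
    return 0.5 * math.exp(-x)

# Midpoint Riemann sum on [0, 50]; the tail beyond 50 is negligible.
upper, n = 50.0, 100_000
h = upper / n
integral = sum(survival((i + 0.5) * h) for i in range(n)) * h

print(abs(integral - 0.5) < 1e-4)  # True: matches E(X) = 1/2
```

The jump in F at 0 causes no trouble: 1 - F(x) is still piecewise continuous, so the ordinary Riemann sum converges.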




#4
Jan 9, 2009, 05:46 PM

P: 1,270

But for a discrete random variable X, does it still make sense to talk about "integration" (i.e. INTEGRAL(0 to infinity) of (1 - F(x))dx)? Or should it be replaced by a (sigma) sum?
Do you mean using integration by parts on the defining expression for E(X)? What should I let u and dv be? Thanks!



#5
Jan 9, 2009, 06:18 PM

P: 626

No, in the discrete case you would use sums instead of integrals, since expectation is then defined in terms of a sum, not an integral. For the continuous case: let u = 1 - F(x), so du = -f(x) dx, and let dv = dx, so v = x. Writing S(x) = 1 - F(x) for the survival function, integration by parts gives x*S(x) (evaluated between your limits) + integral(x*f(x) dx). The first term vanishes: at x = 0 we have x*S(x) = 0, and as x -> infinity, S(x) -> 0 fast enough that x*S(x) -> 0. So you are left with the usual definition of E(X).
Note: this is a very hand-wavy proof, as you would really want to be rigorous about the limit that makes the boundary term vanish.
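A quick numerical sanity check of the identity in the continuous case (a sketch of my own; the Exponential(2) example and helper names are illustrative, stdlib only):

```python
# Check E(X) = integral from 0 to infinity of (1 - F(x)) dx for a
# continuous example: X ~ Exponential(rate=2), so E(X) = 1/2 and
# 1 - F(x) = exp(-2x).
import math

def survival(x, rate=2.0):
    """S(x) = 1 - F(x) for Exponential(rate)."""
    return math.exp(-rate * x)

def integrate_survival(rate=2.0, upper=50.0, n=100_000):
    """Midpoint Riemann sum of S(x) on [0, upper]; the tail beyond is negligible."""
    h = upper / n
    return sum(survival((i + 0.5) * h, rate) for i in range(n)) * h

print(abs(integrate_survival() - 0.5) < 1e-4)  # True: E(X) = 1/rate = 0.5
```

The same check works for any nonnegative distribution whose survival function you can write down.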



#6
Jan 9, 2009, 06:31 PM

Emeritus
Sci Advisor
PF Gold
P: 16,101





#7
Jan 9, 2009, 06:33 PM

Emeritus
Sci Advisor
PF Gold
P: 16,101





#8
Jan 17, 2009, 12:41 AM

P: 1,270

lim_{x -> infinity} x(1 - F(x)): this gives "infinity times 0", which is an indeterminate form and seems to require L'Hopital's Rule. I tried many different ways but was still unable to figure out what the limit is going to be... how can we prove that the limit is equal to 0? Thanks!
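For what it's worth, one standard argument (assuming E(X) is finite) avoids L'Hopital entirely: since t >= x on the region of integration,

[tex] x\,(1 - F(x)) = x\,P(X > x) = x\int_x^{\infty} f(t)\,dt \le \int_x^{\infty} t\,f(t)\,dt \rightarrow 0 \quad\text{as } x\rightarrow\infty, [/tex]

because the tail of the convergent integral [tex]E(X) = \int_0^{\infty} t\,f(t)\,dt < \infty[/tex] must go to zero.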



#9
Jan 17, 2009, 05:45 AM

Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,879

Hurkyl, you don't "need" measure theory to write a sum as an integral. The Riemann-Stieltjes integral will do that.




#10
Jan 19, 2009, 10:48 AM

P: 100

Another way to do it is to write the expected value as
[tex]E[X]=\int_{0}^{\infty}sf(s)\,ds = \int_{s=0}^{\infty}\int_{x=0}^{s}f(s)\,dx\,ds[/tex]
and then change the order of the integrals to get your formula. To see what the new bounds on the integrals would be, draw a picture of the region of integration.

You can use this same approach to find that
[tex]E[X^2] = \int_{0}^{\infty}s^2f(s)\,ds = \int_{s=0}^{\infty}\int_{x=0}^{s}2xf(s)\,dx\,ds = \int_{0}^{\infty}2x(1-F(x))\,dx[/tex]
which is also valid for X nonnegative.



#11
Jan 20, 2009, 09:15 AM

P: 159





#12
Apr 29, 2009, 09:45 PM

P: 1

Integration by parts:
[tex]m_X=\int^{\infty}_{0} x f_X(x)\, dx[/tex] eq(1)

Let [tex] u=x [/tex] and [tex]dv = f_X(x)\, dx [/tex]. Thus [tex] du=dx [/tex] and [tex] v = -(1-F_X(x)) [/tex].

Check that [tex] dv/dx = \frac{d}{dx}\left[-(1-F_X(x))\right] = \frac{d}{dx}F_X(x) = f_X(x) [/tex], o.k.

Then substitute in (1):
[tex]m_X=\left[uv\right]^{\infty}_{0}-\int^{\infty}_{0}v\,du [/tex]
[tex]m_X=\left[-x[1-F_X(x)]\right]^{\infty}_{0}+\int^{\infty}_{0}[1-F_X(x)]\,dx [/tex]

The first term is zero at x = 0. As [tex] x\rightarrow\infty[/tex], [tex]1-F_X(x)[/tex] tends to zero faster than x grows, and thus [tex] x[1-F_X(x)]\rightarrow 0 [/tex]. Therefore
[tex]m_X=\int^{\infty}_{0}[1-F_X(x)]\,dx [/tex]
QED. Enjoy!



#13
Apr 29, 2009, 11:25 PM

P: 5

[tex]
\begin{align*} E[X] &= E\bigg[\int_0^X 1\,dx\bigg]\\ &= E\bigg[\int_0^\infty 1_{\{X>x\}}\,dx\bigg]\\ &= \int_0^\infty E[1_{\{X>x\}}]\,dx\\ &= \int_0^\infty P(X > x)\,dx\\ &= \int_0^\infty (1 - F(x))\,dx \end{align*}
[/tex]

By the way, this formula is true no matter what kind of random variable X is, and we do not need anything more than freshman calculus to understand the integral on the right-hand side. (We need neither measure theory nor Stieltjes integrals.) Even when X is discrete, the function 1 - F(x) is still at least piecewise continuous, so the integral makes perfectly good sense, even when understood as a good old-fashioned Riemann integral.
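To see the discrete case in action (a small sketch with a made-up pmf of my own): for an integer-valued X, the function 1 - F(x) is a step function with steps of width 1, so its Riemann integral collapses to the sum of P(X > k) over k = 0, 1, 2, ...:

```python
# For a nonnegative integer-valued X, 1 - F(x) is a step function, so
# the integral of (1 - F(x)) over [0, infinity) equals the sum of
# P(X > k) for k = 0, 1, 2, ... (each rectangle has width 1).
# Checking with X uniform on {1, 2, 3, 4}, where E[X] = 2.5.
pmf = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}

expectation = sum(k * p for k, p in pmf.items())

tail_integral = sum(
    sum(p for j, p in pmf.items() if j > k)   # P(X > k)
    for k in range(max(pmf))                   # k = 0, ..., 3
)

print(expectation, tail_integral)  # both 2.5
```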

