# Statistics proof

1. Jan 27, 2014

### joshmccraney

1. The problem statement, all variables and given/known data
so I was hoping to prove the following two expressions are equivalent: $$\int_{-\infty}^{\infty}c^m B_x(c)\,dc=\lim_{n \to \infty}\frac{1}{n}\sum_{i=1}^{n}x_i^m$$

2. Relevant equations
$$\int_{-\infty}^{\infty}B_x(c)dc=1$$
$x_i$ is a random variable.
$B_x(c)$ is a pdf.

3. The attempt at a solution
I was going to write the first integral as a Riemann sum and equate the two summations: $$\lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(-n+\frac{n-(-n)}{n}\Big)^m B_x \Big(-n+\frac{n-(-n)}{n}\Big)}_{f(x_i)}\underbrace{\frac{n-(-n)}{n}}_{\Delta x}$$ with $x_i=a+\frac{b-a}{n}i$ if we are using the standard definition of the Riemann integral. From here I'm not really sure what to do next. Any help is appreciated; if I need to provide more info, please let me know.
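As a quick numerical sanity check of the claimed equivalence (my own addition, not part of the thread), here is a minimal sketch assuming a concrete pdf, the standard normal, and $m=2$, so both sides should come out near $E[X^2]=1$:

```python
import math
import random

def B(c):
    # assumed example pdf: standard normal (the thread leaves B_x unspecified)
    return math.exp(-c * c / 2.0) / math.sqrt(2.0 * math.pi)

m = 2  # second moment; for the standard normal, E[X^2] = 1

# left-hand side: Riemann-sum approximation of the integral over [-10, 10]
# (the tails beyond +-10 contribute negligibly for the normal pdf)
N = 100_000
a, b = -10.0, 10.0
dx = (b - a) / N
lhs = sum(((a + i * dx) ** m) * B(a + i * dx) * dx for i in range(N))

# right-hand side: the sample average (1/n) * sum of x_i^m over iid draws
random.seed(0)
n = 200_000
rhs = sum(random.gauss(0.0, 1.0) ** m for _ in range(n)) / n

print(lhs, rhs)  # both should be close to 1
```

Note that the two sides agree numerically even though they are computed in completely different ways: one is a deterministic quadrature, the other a random sample average, which is exactly the point of the identity.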

For the record, this is not an assignment; it's a problem I found in my text and was curious about.

thanks!

Last edited: Jan 27, 2014
2. Jan 27, 2014

### haruspex

You do realise you are labelling two unrelated entities as $x_i$, yes? And what is this "$n-(-n)$" notation? How is that not just $2n$?

3. Jan 28, 2014

### joshmccraney

Yes, I realize this, but I thought it would be somewhat clear, assuming good familiarity with the definition of the Riemann integral (did I assume too much?).

Shoot, I should have included the factor $i$ in the $\frac{n-(-n)}{n}$ term. Thanks for pointing this out:

$$\lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(-n+\frac{n-(-n)}{n}i\Big)^m B_x \Big(-n+\frac{n-(-n)}{n}i\Big)}_{f(x_i)}\underbrace{\frac{n-(-n)}{n}}_{\Delta x}$$

And sure, since it's automatic, let's simplify the $n$'s in the above: $$\lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(2i-n\Big)^m B_x \Big(2i-n\Big)}_{f(x_i)}\underbrace{2}_{\Delta x}$$ But what do you recommend now (the tough part)? I think I'll avoid the binomial theorem (too many sums), and the $B$ function still seems onerous (and arcane). Have I left anything unclear?

4. Jan 28, 2014

### haruspex

I still don't get how the $x_i$ can represent random variables in the first sum.
Don't you need some reference to expected values to match it up with a numerical LHS?
And I fail to see why the limit of a sum of random variables should be related to a Riemann integral, which is a limit of a sum of function values.
The moment generating function should be useful.
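For reference (my addition, not part of haruspex's post), the moment generating function ties into the left-hand side as follows: assuming it exists in a neighbourhood of $0$, its derivatives at $0$ recover the moments,
$$M_X(t)=E\big[e^{tX}\big]=\int_{-\infty}^{\infty}e^{tc}B_x(c)\,dc, \qquad E[X^m]=M_X^{(m)}(0)=\int_{-\infty}^{\infty}c^m B_x(c)\,dc,$$
since differentiating under the integral sign $m$ times brings down a factor of $c^m$ and setting $t=0$ kills the exponential.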

5. Jan 28, 2014

### joshmccraney

6. Jan 28, 2014

### Ray Vickson

I think the material you cite is nonsense, at least as written. It can be rescued, however. If $X_1, X_2, \ldots$ is a sample of independent, identically distributed random variables (basically, random "copies" of one another) with common pdf $B(\cdot)$, then we almost surely have
$$EX^m \equiv \int_{-\infty}^{\infty} x^m B(x) \, dx = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i^m$$
This is just the (strong) Law of Large Numbers applied to the random variables $Z_i = X_i^m$.

Note: in the above, the left-hand side is a non-random number, while the right-hand side is random (a limit of sample averages). The strong law of large numbers says (among other things) that the random quantity on the right equals the number on the left except on an event of probability zero. Note also that the $x$ in the integration is just a dummy variable of integration having nothing to do with the $X_i$ on the right; you could call it $c$ instead, as you did.
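To see the strong law in action for $Z_i = X_i^m$, here is a minimal sketch (my own illustration, not from the thread), assuming $X_i \sim \mathrm{Uniform}(0,1)$ and $m=2$, so the true moment is $E[X^2]=\int_0^1 x^2\,dx = 1/3$:

```python
import random

random.seed(1)

# illustration of the strong law for Z_i = X_i^m:
# X_i ~ Uniform(0, 1) with m = 2, so E[X^2] = 1/3 (assumed example)
m = 2
true_moment = 1.0 / 3.0

total = 0.0
for n in range(1, 100_001):
    total += random.random() ** m
    if n in (10, 1_000, 100_000):
        # the running average (1/n) * sum of X_i^m drifts toward 1/3 as n grows
        print(n, total / n)
```

The running averages printed at $n = 10$, $10^3$, and $10^5$ settle toward $1/3$, which is the almost-sure convergence the strong law guarantees.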

7. Jan 28, 2014

### joshmccraney

I'm glad you're familiar with it (at least, after making corrections). How would I begin proving the statement you've written (not the definition, just the equality)? Also, it bothers me that I'm unfamiliar with and confused by this statement and why it is necessary:

I feel I should understand this the way you do, but I don't. Can you elaborate or direct me to a source?

thanks!

8. Jan 28, 2014