Proving Equivalence of Integrals: A Statistical Approach

  • Thread starter: member 428835
  • Tags: Proof, Statistics
member 428835

Homework Statement


So I was hoping to prove the following are equivalent: \int_{-\infty}^{\infty}c^m B_x(c)dc=\lim_{n \to \infty}\frac{1}{n}\sum_{i=1}^{n}x_i^m

Homework Equations


\int_{-\infty}^{\infty}B_x(c)dc=1
x_i is a random variable.
B_x(c) is its pdf.

The Attempt at a Solution


I was going to write the first integral as a sum and equate the two summations as \lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(-n+\frac{n-(-n)}{n}\Big)^m B_x \Big(-n+\frac{n-(-n)}{n}\Big)}_{f(x_i)}\underbrace{\frac{n-(-n)}{n}}_{\Delta x}, where x_i=a+\frac{b-a}{n}i, if we are using the standard definition of the Riemann integral. From here I'm not really sure what to do next. Any help is appreciated. Also, if I need to provide more info please let me know.

for the record, this is not an assignment but is a problem i found in my text and was curious.

thanks!
 
You do realize you are labelling two unrelated entities as xi, yes? And what is this "n-(-n)" notation? How is that not just 2n?
 
Yes, I realize this, but I thought it would be somewhat clear, assuming good familiarity with the definition of the Riemann integral (did I assume too much?).

Shoot, I should have added the "i" in the n-(-n) term. Thanks for pointing this out:

\lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(-n+\frac{n-(-n)}{n}i\Big)^m B_x \Big(-n+\frac{n-(-n)}{n}i\Big)}_{f(x_i)}\underbrace{\frac{n-(-n)}{n}}_{\Delta x}

And sure, while it's automatic, let's simplify the above with the n's: \lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(2i-n\Big)^m B_x \Big(2i-n\Big)}_{f(x_i)}\underbrace{2}_{\Delta x} But what do you recommend now (the tough part)? I think I'll avoid the binomial theorem (too many sums), and the B function still seems onerous (and arcane). Have I left something unclear?
 
I still don't get how the xi can represent random variables in the first sum.
Don't you need some reference to expected values to match it up with a numerical LHS?
And I fail to see why the limit of a sum of random variables should be related to a Riemann integral being a limit of a sum.
The moment generating function should be useful.
 
joshmccraney said:
The original is here if you want to take a look: http://www.turbulence-online.com/Publications/Lecture_Notes/Turbulence_Lille/TB_16January2013.pdf (top of page 26). It's confusing to me, so I thought I'd run it by you guys, but if you're stumped then I'll call it.

Let me know if you see something I missed, but if not, thanks for trying!

I think the material you cite is nonsense, at least as it is written. It can be rescued, however. If ##X_1, X_2, \ldots## is a sample of independent, identically-distributed random variables (basically, random "copies") and their common pdf is ##B(.)##, then we almost surely have
EX^m \equiv \int_{-\infty}^{\infty} x^m B(x) \, dx = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i^m
This is just the (strong) Law of Large Numbers applied to the random variables ##Z_i = X_i^m##.

Note: in the above, the left-hand side is a non-random number, while the right-hand side is a random variable (a limit of sample averages). The strong law of large numbers says (among other things) that the random variable on the right equals the number on the left except on some event having probability zero. Note also that the x in the integration is just a dummy variable of integration, having nothing to do with the X on the right; you could call it c instead, as you did.
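For what it's worth, the identity is easy to check numerically. The sketch below is illustrative only, not from the thread: it assumes a hypothetical example pdf (the standard normal, for which E[X^2] = 1), an arbitrary truncation of the integral to [-8, 8], and arbitrary sample sizes. It compares a plain Riemann sum for the left-hand integral against a sample average for the right-hand side.

```python
import math
import random

# Example pdf (an assumption for this sketch): the standard normal,
# B(x) = exp(-x^2/2) / sqrt(2*pi), whose second moment is E[X^2] = 1.
def B(x):
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

m = 2  # which moment to check

# Left-hand side: int x^m B(x) dx, via a left Riemann sum on [-8, 8]
# (the tails beyond +/-8 are negligible for this pdf).
a, b, N = -8.0, 8.0, 200_000
dx = (b - a) / N
lhs = sum(((a + i * dx) ** m) * B(a + i * dx) for i in range(N)) * dx

# Right-hand side: (1/n) * sum of X_i^m over i.i.d. draws X_i ~ B
# (the strong law of large numbers says this converges to the integral).
random.seed(0)  # fixed seed so the run is reproducible
n = 200_000
rhs = sum(random.gauss(0.0, 1.0) ** m for _ in range(n)) / n

print(lhs, rhs)  # both should be close to E[X^2] = 1
```

The Riemann sum converges deterministically, while the sample average converges only almost surely and fluctuates on the order of 1/sqrt(n); with these sizes both sides land near 1.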
 
I'm glad you're familiar with it (at least, after you make corrections). How would I begin proving the statement you've written (not the definition, just the equality)? Also, it bothers me that I am unfamiliar with and confused by this statement and why it is necessary:

Ray Vickson said:
If ##X_1, X_2, \ldots## is a sample of independent, identically-distributed random variables (basically, random "copies") and their common pdf is ##B(.)##, then we almost surely have
EX^m \equiv \int_{-\infty}^{\infty} x^m B(x) \, dx = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i^m

I feel I should understand this the way you do, but I don't. Can you elaborate or direct me to a source?

thanks!
 
joshmccraney said:
I'm glad you're familiar with it (at least, after you make corrections). How would I begin proving the statement you've written (not the definition, just the equality)? Also, it bothers me that I am unfamiliar with and confused by this statement and why it is necessary:

I feel I should understand this the way you do, but I don't. Can you elaborate or direct me to a source?

thanks!

Google "Law of Large Numbers".
 
