Proving Equivalence of Integrals: A Statistical Approach

  • Thread starter: member 428835
  • Tags: Proof, Statistics
Homework Help Overview

The discussion revolves around proving the equivalence of two integrals related to probability density functions and random variables. The original poster seeks to establish a relationship between an integral involving a probability density function \( B_x(c) \) and a limit of a sum of random variables raised to a power \( m \). The context is situated within statistics and probability theory, particularly focusing on the Law of Large Numbers.

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • The original poster attempts to rewrite the integral as a sum and equate it with a limit of a summation. Some participants question the notation used and the assumptions made about the random variables involved. Others suggest that a reference to expected values may be necessary to connect the two sides of the equation. There is also a discussion about the applicability of the Law of Large Numbers in this context.

Discussion Status

The discussion is ongoing, with participants providing feedback on the original poster's approach and notation. Some guidance has been offered regarding the Law of Large Numbers, but there is no explicit consensus on the next steps or the clarity of the original statement. Participants are exploring different interpretations and seeking further clarification on the concepts involved.

Contextual Notes

There are indications that the original poster may have made assumptions about familiarity with the Riemann integral and the definitions involved. Additionally, some participants express confusion about the relationship between the random variables and the integral, suggesting that more context or information may be needed to clarify the problem.

member 428835

Homework Statement


so i was hoping to prove the following are equivalent: [tex]\int_{-\infty}^{\infty}c^m B_x(c)dc=\lim_{n \to \infty}\frac{1}{n}\sum_{i=1}^{n}x_i^m[/tex]

Homework Equations


[tex]\int_{-\infty}^{\infty}B_x(c)dc=1[/tex]
[itex]x_i[/itex] is a random variable.
[itex]B_x(c)[/itex] is a pdf

The Attempt at a Solution


i was going to write the first integral as a sum and equate the two summations as: [tex]\lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(-n+\frac{n-(-n)}{n}\Big)^m B_x \Big(-n+\frac{n-(-n)}{n}\Big)}_{f(x_i)}\underbrace{\frac{n-(-n)}{n}}_{\Delta x}[/tex] [itex]x_i=a+\frac{b-a}{n}i[/itex] if we are using the standard definition of the Riemann integral. from here I'm not really sure what to do next. any help is appreciated. also, if i need to provide more info please let me know.
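for a quick sanity check on the Riemann-sum idea, here is a minimal numerical sketch. it assumes, purely for illustration, that [itex]B_x[/itex] is a standard normal pdf (not given in the problem); the point is that a truncated Riemann sum approximates [itex]\int c^m B_x(c)dc[/itex] only when the partition width [itex](b-a)/n[/itex] shrinks as [itex]n[/itex] grows:

```python
import math

def B(c):
    # standard normal pdf, chosen only as a concrete stand-in for B_x
    return math.exp(-c * c / 2.0) / math.sqrt(2.0 * math.pi)

def riemann_moment(m, a=-10.0, b=10.0, n=100_000):
    # left Riemann sum for the truncated integral of c^m B(c) over [a, b];
    # note dx = (b - a)/n -> 0 as n grows, unlike a fixed-width partition
    dx = (b - a) / n
    return sum(((a + i * dx) ** m) * B(a + i * dx) for i in range(n)) * dx

# second moment of a standard normal is exactly 1
print(riemann_moment(2))  # ≈ 1.0
```

the truncation endpoints a, b and the pdf here are illustrative choices, not part of the original problem.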

for the record, this is not an assignment but a problem i found in my text and was curious about.

thanks!
 
You do realize you are labelling two unrelated entities as [itex]x_i[/itex], yes? And what is this "n-(-n)" notation? How is that not just 2n?
 
yes, i realize this but i thought it would be somewhat clear, assuming a good familiarity with the definition of the Riemann integral (did i assume too much?)

shoot, i should have added the "i" in the n-(-n) term. thanks for pointing this out:

[tex]\lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(-n+\frac{n-(-n)}{n}i\Big)^m B_x \Big(-n+\frac{n-(-n)}{n}i\Big)}_{f(x_i)}\underbrace{\frac{n-(-n)}{n}}_{\Delta x}[/tex]

and sure, since it's automatic, let's simplify the n's in the above: [tex]\lim_{n \to \infty}\sum_{i=1}^{n}\underbrace{\Big(2i-n\Big)^m B_x \Big(2i-n\Big)}_{f(x_i)}\underbrace{2}_{\Delta x}[/tex] but what do you recommend now (the tough part)? i think i'll avoid the binomial theorem (too many sums), and the [itex]B[/itex] function still seems onerous (and arcane). have i left something unclear?
 
I still don't get how the [itex]x_i[/itex] can represent random variables in the first sum.
Don't you need some reference to expected values to match it up with a numerical LHS?
And I fail to see why the limit of a sum of random variables should be related to a Riemann integral being a limit of a sum.
The moment generating function should be useful.
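(to illustrate why the moment generating function helps: if [itex]M(t)=E[e^{tX}][/itex] exists near t = 0, then [itex]E[X^m]=M^{(m)}(0)[/itex]. a minimal sketch, again assuming a standard normal for concreteness, so that [itex]M(t)=e^{t^2/2}[/itex]:

```python
import math

def M(t):
    # MGF of a standard normal, E[exp(t*X)] = exp(t^2 / 2) -- an assumed example
    return math.exp(t * t / 2.0)

def second_moment_from_mgf(h=1e-4):
    # E[X^2] = M''(0), approximated here by a central second difference
    return (M(h) - 2.0 * M(0.0) + M(-h)) / (h * h)

print(second_moment_from_mgf())  # ≈ 1.0 for the standard normal
```

the finite difference is just a cheap stand-in for taking the derivative symbolically.)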
 
joshmccraney said:
the original is here if you want to take a look: http://www.turbulence-online.com/Publications/Lecture_Notes/Turbulence_Lille/TB_16January2013.pdf top of page 26. it's confusing to me so i thought id run it by you guys but if you're stumped then i'll call it.

let me know if you see something i missed but if not, thanks for trying!

I think the material you cite is nonsense, at least as it is written. It can be rescued, however. If ##X_1, X_2, \ldots## is a sample of independent, identically-distributed random variables (basically, random "copies") and their common pdf is ##B(.)##, then we almost-surely have
[tex]EX^m \equiv \int_{-\infty}^{\infty} x^m B(x) \, dx = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i^m[/tex]
This is just the (strong) Law of Large Numbers applied to the random variables ##Z_i = X_i^m##.

Note: in the above, the left-hand-side is a non-random number, while the right-hand-side is a sample of some random variable. The strong law of large numbers says (among other things) that the sample random variable on the right is equal to the number on the left except for some event having probability zero. Note also that the x in the integration is just a dummy variable of integration having nothing to do with the X on the right; you could call it c instead, as you did.
 
i'm glad you're familiar with it (at least, after you make corrections). how would i begin proving the statement you've written (not the definition, just the equality)? also, it bothers me that I am unfamiliar and confused with this statement and how it is necessary:

Ray Vickson said:
If ##X_1, X_2, \ldots## is a sample of independent, identically-distributed random variables (basically, random "copies") and their common pdf is ##B(.)##, then we almost-surely have
[tex]EX^m \equiv \int_{-\infty}^{\infty} x^m B(x) \, dx = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i^m[/tex]

i feel i should understand this how you do, but i don't. can you elaborate or direct me to a source?

thanks!
 
joshmccraney said:
i'm glad you're familiar with it (at least, after you make corrections). how would i begin proving the statement you've written (not the definition, just the equality)? also, it bothers me that I am unfamiliar and confused with this statement and how it is necessary:



i feel i should understand this how you do, but i don't. can you elaborate or direct me to a source?

thanks!

Google "Law of Large Numbers".
 
