Are max and min of n iid r.v.s. independent?

  • Thread starter: maverick280857
  • Tags: Independent, Max
SUMMARY

The maximum and minimum of a sequence of i.i.d. random variables, denoted X_{(n)} and X_{(1)} respectively, are not independent. This conclusion is established by analyzing the joint distribution and showing that the joint cumulative distribution function (cdf) does not factorize. Specifically, P(m < x, M < x) = P(M < x), which can equal the product P(m < x)P(M < x) only if P(m < x) = 1 for all real x.

PREREQUISITES

  • Understanding of i.i.d. random variables
  • Knowledge of joint probability distribution functions (pdf)
  • Familiarity with cumulative distribution functions (cdf)
  • Concept of order statistics in probability theory
NEXT STEPS
  • Study the properties of order statistics in probability theory
  • Learn about joint and marginal probability distribution functions
  • Explore examples of non-independent random variables
  • Review statistical texts that cover distributions of order statistics
USEFUL FOR

Statisticians, mathematicians, and students studying probability theory, particularly those interested in the behavior of order statistics and their implications in statistical analysis.

maverick280857
Hi

Suppose X_{1}, \ldots, X_{n} is a sequence of i.i.d. random variables. We define

X_{(n)} = \max(X_{1}, \ldots, X_{n})
X_{(1)} = \min(X_{1}, \ldots, X_{n})

Are X_{(n)} and X_{(1)} independent?

What's the best/easiest way to verify this?

Thanks
Vivek
 
They are not independent. The maximum is always at least as large as the minimum, so knowing one constrains the other ...
 
Yeah, nice observation. Thanks :smile:
 
Suppose I wanted to show it using the factorization of the joint pdf or joint pmf, how would I do that?
 
You just have to find one example where the cdf does not factorize.

Let m be the minimum, M the maximum, and x some real number.

What about P(m < x, M < x)?

This is equal to P(M < x), because M < x automatically implies m < x as well.

So P(m < x, M < x) = P(M < x).

For this to equal the factorized probability P(m < x)P(M < x), you would need P(m < x) = 1 for all real x ... which is not true :smile:
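The factorization failure is easy to see numerically. Here is a minimal Monte Carlo sketch (my own illustration, not from the thread) using Uniform(0,1) samples with n = 2 and x = 1/2; the parameters are arbitrary choices for the demonstration:

```python
import numpy as np

# Monte Carlo check that X_(1) = min and X_(n) = max of i.i.d.
# samples are not independent. n, x, and the trial count are
# illustrative choices, not anything prescribed by the argument.
rng = np.random.default_rng(0)
n, trials, x = 2, 200_000, 0.5

samples = rng.uniform(size=(trials, n))
m = samples.min(axis=1)   # X_(1), the minimum of each row
M = samples.max(axis=1)   # X_(n), the maximum of each row

# P(m < x, M < x): since M < x implies m < x, this equals P(M < x).
p_joint = np.mean((m < x) & (M < x))
# The product P(m < x) * P(M < x) that independence would require.
p_prod = np.mean(m < x) * np.mean(M < x)

print(p_joint, p_prod)  # roughly 0.25 vs roughly 0.19
```

For Uniform(0,1) with n = 2 the exact values are P(M < 1/2) = 1/4 while P(m < 1/2)P(M < 1/2) = 3/16, so the gap is well outside simulation noise.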
 
Thanks Pere :smile:
 
Look for a statistics text that discusses the distributions of order statistics and sets of order statistics. You will be able to find a general formula for the p.d.f.s of the \min \text{ and } \max in terms of the marginal pdfs and joint pdf of the sample. Once you see that form, you will see that they need not be independent.
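For reference, the standard order-statistics computation makes the dependence explicit. If F is the common cdf and x < y, then

P(X_{(1)} \le x, X_{(n)} \le y) = P(X_{(n)} \le y) - P(X_{(1)} > x, X_{(n)} \le y) = F(y)^n - \bigl(F(y) - F(x)\bigr)^n

whereas independence would require this to equal

F_{X_{(1)}}(x) \, F_{X_{(n)}}(y) = \bigl(1 - (1 - F(x))^n\bigr) F(y)^n

and the two expressions differ in general (for Uniform(0,1) with n = 2 and x = y = 1/2, the joint cdf is 1/4 while the product is 3/16).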
 
