How to show these random variables are independent?

psie
TL;DR Summary
I am studying order statistics in An Intermediate Course in Probability by Gut. First the author treats only continuous distributions. In a section on the joint distribution of the extreme order variables ##X_{(n)}=\max\{X_1,\ldots,X_n\}## and ##X_{(1)}=\min\{X_1,\ldots,X_n\}##, the author derives the density of the range, that is, of ##R_n=X_{(n)}-X_{(1)}##. Then there's an exercise, and I simply do not understand why it's in that section.
The exercise that appears in the text is:

Exercise 2.5 The geometric distribution is a discrete analog of the exponential distribution in the sense of lack of memory. More precisely, show that if ##X_1## and ##X_2## are independent ##\text{Ge}(p)##-distributed random variables, then ##X_{(1)}## and ##X_{(2)}-X_{(1)}## are independent.
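(For reference, lack of memory for the geometric distribution is the identity ##P(X\ge m+n\mid X\ge m)=P(X\ge n)## for all integers ##m,n\ge 0##, the discrete analog of the corresponding identity for the exponential distribution.)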

What I find confusing about this exercise is that the author has, up until now, not derived any results for order statistics of discrete distributions. I know the formulas for the density of ##X_{(1)}## and of the range when the underlying distribution is continuous, but these do not apply to discrete distributions. I was thinking of going back to an earlier chapter where the author derives distributions of transformations of random variables: I could assume ##X_2## to be greater than ##X_1## and then compute the pmf of their difference. But this doesn't feel like a sensible assumption, since, after all, ##\max\{X_1,\ldots,X_n\}## is understood pointwise.

How would you go about solving this exercise?
 
This has been solved. It's a bit of work to write it all down, but basically you want to compute the probabilities ##P\left(X_{(1)}=u\right)##, ##P\left(X_{(1)}=u, X_{(2)}=u+d\right)## and ##P\left(X_{(2)}-X_{(1)}=d\right)##. The first two are fairly straightforward, splitting up the probability using indicator functions. The third probability is a bit trickier, and conditional expectation comes in handy (in addition to indicator functions), in particular the identity $$P(A)=E[\mathbf1_A]=E[E[\mathbf1_A\mid X]]=E[P(A\mid X)].$$
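For completeness, here is a sketch of where that computation leads, assuming Gut's convention that ##\text{Ge}(p)## has pmf ##P(X=k)=pq^k## for ##k=0,1,2,\ldots##, where ##q=1-p## (if your edition starts the support at ##1## instead, the exponents shift but the factorization argument is the same). Since ##P(X_1\ge u)=q^u##, $$P\left(X_{(1)}=u\right)=P(X_1\ge u)^2-P(X_1\ge u+1)^2=q^{2u}\left(1-q^2\right).$$ Accounting for the two orderings of ##X_1## and ##X_2##, $$P\left(X_{(1)}=u,\,X_{(2)}-X_{(1)}=d\right)=2p^2q^{2u+d}\quad(d\ge1),$$ and the factor ##2## is dropped when ##d=0##. Summing over ##u\ge0##, $$P\left(X_{(2)}-X_{(1)}=d\right)=\frac{2p^2q^{d}}{1-q^2}\quad(d\ge1),\qquad P\left(X_{(2)}-X_{(1)}=0\right)=\frac{p^2}{1-q^2}=\frac{p}{1+q}.$$ In every case the joint pmf is the product of the two marginals, which is exactly the claimed independence.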
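And, as a quick numerical sanity check (not a proof), a short simulation can compare the joint pmf of ##\left(X_{(1)},X_{(2)}-X_{(1)}\right)## with the product of its marginals; the parameter ##p=0.3##, the seed, and the sample size below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3          # illustrative parameter
n = 1_000_000    # number of simulated pairs

# Ge(p) on {0, 1, 2, ...}: numpy's geometric counts the trials up to and
# including the first success (support {1, 2, ...}), so subtract 1.
x1 = rng.geometric(p, size=n) - 1
x2 = rng.geometric(p, size=n) - 1

mn = np.minimum(x1, x2)   # X_(1)
rg = np.abs(x1 - x2)      # X_(2) - X_(1)

# Independence means P(X_(1)=u, R=d) = P(X_(1)=u) * P(R=d) for all u, d.
for u in range(3):
    for d in range(3):
        joint = np.mean((mn == u) & (rg == d))
        prod = np.mean(mn == u) * np.mean(rg == d)
        print(f"u={u} d={d}  joint={joint:.4f}  product={prod:.4f}")
```

The printed joint and product columns should agree to within Monte Carlo error for every ##(u,d)## pair.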
 