Relation between exponentially distributed random variables and Poisson(1)

AI Thread Summary
The discussion concerns the relationship between independent Exp(1) random variables and the Poisson distribution. Specifically, it examines the count N_n of the first n variables that exceed the threshold log(n), and the claim that N_n converges in distribution to a Poisson(1) random variable as n approaches infinity. Participants note that N_n can be written as a sum of indicator functions, so it is binomially distributed, and that the key step is identifying the success probability of those indicators.
rukawakaede
Hi,

Suppose X_1, X_2,\cdots be an independent and identically distributed sequence of exponentially distributed random variables with parameter 1.

Now Let N_n:=\#\{1\leq k\leq n:X_k\geq \log(n)\}

I was told that N_n\xrightarrow{\mathcal{D}}Y where Y\sim\text{Poisson}(1).

Could anyone give me some directions for this problem? I am completely at a loss.

Thanks.
 
For fixed n, the condition
X_k \geq \log(n)
gives each of the exponential random variables a fixed probability of being included in the count. So, what distribution does that give N_n, in terms of n and that probability?

The limit of that distribution as n goes to infinity should then be Poisson.
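As a quick sanity check of the claimed limit, here is a small simulation sketch (my own illustration, not from the thread; the seed and sample sizes are arbitrary choices): draw n iid Exp(1) variables, count how many exceed log(n), and repeat. If the claim holds, the sample mean of N_n should be close to 1, the mean of Poisson(1).

```python
import math
import random

# Sketch: count how many of n iid Exp(1) draws are >= log(n).
# For large n this count should be approximately Poisson(1).
def sample_N(n, rng):
    return sum(1 for _ in range(n) if rng.expovariate(1.0) >= math.log(n))

rng = random.Random(0)
n, trials = 1000, 2000
counts = [sample_N(n, rng) for _ in range(trials)]
mean = sum(counts) / trials
print(mean)  # should be near 1, the Poisson(1) mean
```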
 
Hi rukawakaede! :smile:

As sfs01 suggested, you first need to find the distribution of N_n. That is, you'll need to compute probabilities of the form P\{N_n=k\}. Let's get you started on that calculation (assume that 0\leq k\leq n):

P\{N_n=k\}=P\left(\bigcup_{A\subseteq \{1,...,n\},~|A|=k}\{X_i\geq \log(n),i\in A; X_i<\log(n), i\notin A\}\right)

Now try to calculate that using the fact that the X_i are i.i.d.
 
rukawakaede said:
Hi,

Suppose X_1, X_2,\cdots be an independent and identically distributed sequence of exponentially distributed random variables with parameter 1.

Now Let N_n:=\#\{1\leq k\leq n:X_k\geq \log(n)\}

I was told that N_n\xrightarrow{\mathcal{D}}Y where Y\sim\text{Poisson}(1).

Could anyone give me some directions for this problem? I am completely at a loss.

Thanks.

If you write it as N_n=\sum_{k=1}^nI(X_k \geq \log(n)) it should be easy to show that N_n is binomial with parameters n and 1/n. To show convergence I'd use characteristic functions.
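The success probability of each indicator follows directly from the exponential tail: for X ~ Exp(1), P(X >= t) = e^{-t}, so with t = log(n) we get p = e^{-log(n)} = 1/n. A minimal numeric check of this identity (my own sketch, not from the thread):

```python
import math

# Tail probability of Exp(1): P(X >= t) = exp(-t).
# With t = log(n) this is exp(-log(n)) = 1/n, so each indicator
# I(X_k >= log(n)) is Bernoulli(1/n) and N_n ~ Binomial(n, 1/n).
def tail_prob(n):
    return math.exp(-math.log(n))

for n in (10, 100, 1000):
    print(n, tail_prob(n), 1 / n)  # the last two columns agree
```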
 
bpet said:
If you write it as N_n=\sum_{k=1}^nI(X_k \geq \log(n)) it should be easy to show that N_n is binomial with parameters n and 1/n. To show convergence I'd use characteristic functions.

May I know what your I means?
 
micromass said:
Hi rukawakaede! :smile:

As sfs01 suggested, you first need to find the distribution of N_n. That is, you'll need to compute probabilities of the form P\{N_n=k\}. Let's get you started on that calculation (assume that 0\leq k\leq n):

P\{N_n=k\}=P\left(\bigcup_{A\subseteq \{1,...,n\},~|A|=k}\{X_i\geq \log(n),i\in A; X_i<\log(n), i\notin A\}\right)

Now try to calculate that using the fact that the X_i are i.i.d.

Could you please tell me more about N_n?
 
Uh, well, I think bpet's method is a bit easier than what I had in mind. So I suggest following him.

Anyway, the I there means a characteristic function, thus

N_n=\sum_{k=1}^n I_{X_k^{-1}([\log(n),+\infty[)}

where

I_A(\omega)=\left\{\begin{array}{c}0~\text{if}~\omega\notin A\\ 1~\text{if}~\omega\in A\\ \end{array}\right.
 
micromass said:
Uh, well, I think bpet's method is a bit easier than what I had in mind. So I suggest following him.

Anyway, the I there means a characteristic function, thus

N_n=\sum_{k=1}^n I_{X_k^{-1}([\log(n),+\infty[)}

where

I_A(\omega)=\left\{\begin{array}{c}0~\text{if}~\omega\notin A\\ 1~\text{if}~\omega\in A\\ \end{array}\right.

In that case, isn't I an indicator function?

I still can't see why N_n has a binomial distribution.
 
rukawakaede said:
In that case, isn't I an indicator function?

Yes.

rukawakaede said:
I still can't see why N_n has a binomial distribution.

Because every indicator function I_A has a Bernoulli distribution. Indeed, it takes the value 0 with probability P(A^c) and the value 1 with probability P(A). So it is Bernoulli.

Now, the sum of n independent Bernoulli(p) variables is binomial, i.e. Bern(p)+...+Bern(p)=Bin(n,p).
Also note that Bern(p)=Bin(1,p).
All you have to do now is figure out what p is here...
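Once p = 1/n is identified, the classical Poisson limit Bin(n, 1/n) → Poisson(1) can be seen numerically by comparing the two pmfs for large n. A small sketch (my own, not from the thread):

```python
import math

# Binomial(n, p) pmf: C(n, k) p^k (1-p)^(n-k).
def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

# Poisson(1) pmf: e^{-1} / k!.
def poisson1_pmf(k):
    return math.exp(-1) / math.factorial(k)

n = 10_000
for k in range(5):
    print(k, binom_pmf(n, 1 / n, k), poisson1_pmf(k))  # the columns nearly agree
```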
 
micromass said:
Yes. Because every indicator function I_A has a Bernoulli distribution. Indeed, it takes the value 0 with probability P(A^c) and the value 1 with probability P(A). So it is Bernoulli.

Now, the sum of n independent Bernoulli(p) variables is binomial, i.e. Bern(p)+...+Bern(p)=Bin(n,p).
Also note that Bern(p)=Bin(1,p).
All you have to do now is figure out what p is here...

is p = 1/2, since each indicator function can be either 1 or 0?
 
No, the p of I_A is P(A), since I_A has probability P(A) of taking the value 1.
Take for example a weighted coin whose chance of heads is 1/3. Then the outcome is Bernoulli(1/3) distributed. But by your reasoning it would be Bernoulli(1/2), since the coin has two sides. The probability is not always simply 1/(number of outcomes).
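The weighted-coin point can be made concrete with a short simulation (my own sketch; encoding heads as 1 is an arbitrary convention): the long-run frequency of heads tracks p = 1/3, not 1/2, even though there are only two outcomes.

```python
import random

# A weighted coin with P(heads) = 1/3: the outcome is Bernoulli(1/3),
# not Bernoulli(1/2), despite the coin having two sides.
def flip(rng, p=1 / 3):
    return 1 if rng.random() < p else 0

rng = random.Random(0)
n = 100_000
freq = sum(flip(rng) for _ in range(n)) / n
print(freq)  # close to 1/3, not 1/2
```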
 
micromass said:
No, the p of I_A is P(A), since I_A has probability P(A) of taking the value 1.
Take for example a weighted coin whose chance of heads is 1/3. Then the outcome is Bernoulli(1/3) distributed. But by your reasoning it would be Bernoulli(1/2), since the coin has two sides. The probability is not always simply 1/(number of outcomes).

Hi, I guess I misunderstood the meaning of p. Could you please explain again what p actually is?
 
Well, a Bernoulli(p) variable is a random variable X that takes only the values 0 and 1, with P(X=1)=p and P(X=0)=1-p.
 