Relation between exponentially distributed random variables and Poisson(1)

Discussion Overview

The discussion centers on the relationship between a sequence of independent and identically distributed exponential random variables and the Poisson distribution, specifically the behavior of the random variable \( N_n \), defined as the count of the first \( n \) variables exceeding the threshold \( \log(n) \). The participants investigate the convergence in distribution of \( N_n \) to Poisson(1) as \( n \) approaches infinity.

Discussion Character

  • Exploratory
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • Some participants propose that \( N_n \) can be expressed as a sum of indicator functions, leading to a binomial distribution with a certain parameter.
  • Others argue that the convergence of \( N_n \) to a Poisson distribution requires understanding the distribution of \( N_n \) and calculating probabilities like \( P\{N_n=k\} \).
  • A participant mentions that the indicator function \( I_A \) represents a Bernoulli distribution, suggesting that the sum of these functions results in a binomial distribution.
  • There is a discussion about the parameter \( p \) in the Bernoulli distribution, with some confusion regarding its definition and calculation.
  • One participant questions the reasoning behind the binomial distribution characterization of \( N_n \) and seeks clarification on the parameter \( p \).

Areas of Agreement / Disagreement

Participants express differing views on the characterization of \( N_n \) as a binomial distribution and the definition of the parameter \( p \). The discussion remains unresolved, with multiple competing interpretations and approaches presented.

Contextual Notes

Limitations in understanding the distribution of \( N_n \) and the calculation of probabilities are evident, as participants grapple with the implications of the indicator functions and their associated probabilities.

rukawakaede
Hi,

Let X_1, X_2,\cdots be an independent and identically distributed sequence of exponentially distributed random variables with parameter 1.

Now let N_n:=\#\{1\leq k\leq n:X_k\geq \log(n)\}

I was told that N_n\xrightarrow{\mathcal{D}}Y where Y\sim \text{Poisson}(1).

Could anyone give me some directions for this problem? I am completely at a loss.

Thanks.
 
For fixed n, the condition
X_k \geq \log(n)
gives each of the exponential random variables a probability of being included in the count. So, what distribution does that mean N_n has in terms of n, k and that probability?

The limit of that distribution as n goes to infinity should then be Poisson.
 
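To make this concrete, here is a minimal sketch (plain Python, hypothetical function name) checking that for an Exp(1) variable the inclusion probability at threshold \log(n) is exactly 1/n:

```python
import math

# Survival function of an Exp(1) variable: P(X >= x) = exp(-x) for x >= 0.
def exp1_survival(x):
    return math.exp(-x)

# At the threshold log(n), each X_k is counted with probability exp(-log n) = 1/n.
for n in [2, 10, 100, 1000]:
    p = exp1_survival(math.log(n))
    print(n, p, 1 / n)  # p matches 1/n up to floating-point error
```

This is the probability that drives everything that follows in the thread.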
Hi rukawakaede! :smile:

As sfs01 suggested, you first need to calculate the distribution of N_n. That is, you'll need to know probabilities like P\{N_n=k\}. Let's get you started on that calculation (assume that 0\leq k\leq n):

P\{N_n=k\}=P\left(\bigcup_{A\subseteq \{1,...,n\},~|A|=k}\{X_i\geq \log(n),i\in A; X_i<\log(n), i\notin A\}\right)

Now try to calculate that using that the Xi are iid.
 
rukawakaede said:
Hi,

Let X_1, X_2,\cdots be an independent and identically distributed sequence of exponentially distributed random variables with parameter 1.

Now Let N_n:=\#\{1\leq k\leq n:X_k\geq \log(n)\}

I was told that N_n\xrightarrow{\mathcal{D}}Y where Y\sim \text{Poisson}(1).

Could anyone give me some directions for this problem? I am completely at a loss.

Thanks.

If you write it as N_n=\sum_{k=1}^nI(X_k \geq \log(n)) it should be easy to show that N_n is binomial with parameters n and p = 1/n, since P(X_k \geq \log(n)) = e^{-\log(n)} = 1/n. To show convergence I'd use characteristic functions.
 
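The claimed convergence is easy to check empirically. The following sketch (hypothetical helper name `sample_N`) simulates N_n for a large n and compares the empirical frequencies with the Poisson(1) pmf e^{-1}/k!:

```python
import math
import random

random.seed(0)

def sample_N(n):
    """Draw X_1..X_n ~ Exp(1) and count how many are >= log(n)."""
    return sum(1 for _ in range(n) if random.expovariate(1.0) >= math.log(n))

n, trials = 1000, 20000
counts = [sample_N(n) for _ in range(trials)]

# Compare empirical frequencies with the Poisson(1) pmf exp(-1)/k!.
for k in range(5):
    empirical = counts.count(k) / trials
    poisson = math.exp(-1) / math.factorial(k)
    print(k, round(empirical, 4), round(poisson, 4))
```

Both columns should agree to roughly two decimal places for this many trials.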
bpet said:
If you write it as N_n=\sum_{k=1}^nI(X_k \geq \log(n)) it should be easy to show that N_n is binomial with parameters n and p = 1/n, since P(X_k \geq \log(n)) = e^{-\log(n)} = 1/n. To show convergence I'd use characteristic functions.

May I know what your I means?
 
micromass said:
Hi rukawakaede! :smile:

As sfs01 suggested, you first need to calculate the distribution of N_n. That is, you'll need to know probabilities like P\{N_n=k\}. Let's get you started on that calculation (assume that 0\leq k\leq n):

P\{N_n=k\}=P\left(\bigcup_{A\subseteq \{1,...,n\},~|A|=k}\{X_i\geq \log(n),i\in A; X_i<\log(n), i\notin A\}\right)

Now try to calculate that using that the Xi are iid.

Could you please tell me more about N_n?
 
Uh, well, I think bpet's method is a bit easier than what I had in mind. So I suggest following him.

Anyway, the I there means a characteristic function, thus

N_n=\sum_{k=1}^n{I_{X_k^{-1}[\log(n),+\infty[}}

where

I_A(\omega)=\left\{\begin{array}{c}0~\text{if}~\omega\notin A\\ 1~\text{if}~\omega\in A\\ \end{array}\right.
 
micromass said:
Uh, well, I think bpet's method is a bit easier than what I had in mind. So I suggest following him.

Anyway, the I there means a characteristic function, thus

N_n=\sum_{k=1}^n{I_{X_k^{-1}[\log(n),+\infty[}}

where

I_A(\omega)=\left\{\begin{array}{c}0~\text{if}~\omega\notin A\\ 1~\text{if}~\omega\in A\\ \end{array}\right.

If in that case, isn't the I an indicator function?

I still can't see why N_n is a binomial distribution.
 
rukawakaede said:
If in that case, isn't the I an indicator function?

Yes.

I still can't see why N_n is a binomial distribution.

Because every indicator function I_A has a Bernoulli distribution. Indeed, it takes the value 0 with probability P(A^c) and the value 1 with probability P(A). So it is Bernoulli.

Now, the sum of n independent Bernoulli(p) variables is binomial, i.e. Bern(p)+...+Bern(p)=Bin(n,p).
Also note that Bern(p)=Bin(1,p).
All you have to do now is figure out what p is here...
 
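Once p = 1/n is identified, the last step is the classical limit Bin(n, 1/n) → Poisson(1). A short numerical sketch of that limit (hypothetical helper names, using exact pmf formulas):

```python
import math

def binom_pmf(n, p, k):
    # Exact Binomial(n, p) probability mass at k: C(n, k) p^k (1-p)^(n-k).
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson1_pmf(k):
    # Poisson(1) probability mass at k: exp(-1) / k!.
    return math.exp(-1) / math.factorial(k)

# With p = 1/n, the Binomial(n, 1/n) probabilities approach Poisson(1) as n grows.
for n in [10, 100, 10000]:
    print(n, [round(binom_pmf(n, 1 / n, k), 4) for k in range(4)])
print("Poisson(1):", [round(poisson1_pmf(k), 4) for k in range(4)])
```

Each row converges to the Poisson(1) row as n increases, which is exactly the convergence in distribution the thread is after.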
micromass said:
Yes. Because every indicator function I_A has a Bernoulli distribution. Indeed, it takes the value 0 with probability P(A^c) and the value 1 with probability P(A). So it is Bernoulli.

Now, the sum of n independent Bernoulli(p) variables is binomial, i.e. Bern(p)+...+Bern(p)=Bin(n,p).
Also note that Bern(p)=Bin(1,p).
All you have to do now is figure out what p is here...

Is p = 1/2 for each indicator, since each indicator function can be 1 or 0?
 
No, the p of I_A is P(A), since I_A has probability P(A) of taking the value 1.
Take for example tossing a weighted coin whose chance of heads is 1/3. The indicator of heads is then Bernoulli(1/3) distributed. But by your reasoning it would be Bernoulli(1/2), since the coin has two sides. The probability is not always simply 1/(number of outcomes).
 
micromass said:
No, the p of I_A is P(A), since I_A has probability P(A) of taking the value 1.
Take for example tossing a weighted coin whose chance of heads is 1/3. The indicator of heads is then Bernoulli(1/3) distributed. But by your reasoning it would be Bernoulli(1/2), since the coin has two sides. The probability is not always simply 1/(number of outcomes).

Hi, I guess I misunderstood the meaning of p. Could you please explain again what p actually is?
 
Well, a Bernoulli(p) variable is a random variable X that takes the two values 0 and 1, with probability p that X takes the value 1 and probability 1-p that X takes the value 0.
 
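For completeness, the characteristic-function route bpet mentioned can be sketched as follows. Writing p_n = P(X_k \geq \log(n)) = 1/n, so that N_n \sim \text{Bin}(n, p_n), the characteristic function is

\varphi_{N_n}(t) = \left(1 - p_n + p_n e^{it}\right)^n = \left(1 + \frac{e^{it}-1}{n}\right)^n \xrightarrow[n\to\infty]{} \exp\left(e^{it}-1\right),

which is the characteristic function of a Poisson(1) variable; convergence of characteristic functions then gives N_n \xrightarrow{\mathcal{D}} \text{Poisson}(1) by Lévy's continuity theorem.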
