Relation between exponentially distributed random variables and Poisson(1)

In summary: an indicator function I_A takes the value 1 with probability p = P(A) and the value 0 with probability 1-p, so it is Bernoulli(p) distributed.
  • #1
rukawakaede
Hi,

Let [itex]X_1, X_2,\cdots[/itex] be an independent and identically distributed sequence of exponentially distributed random variables with parameter 1.

Now let [itex]N_n:=\#\{1\leq k\leq n:X_k\geq \log(n)\}[/itex].

I was told that [itex]N_n\xrightarrow{\mathcal{D}}Y[/itex] where [itex]Y\sim [/itex]Poisson(1).

Could anyone give me some directions for this problem? I am totally at a loss.

Thanks.
 
  • #2
For fixed n, the condition
[tex]X_k \geq \log ( n )[/tex]
gives you a probability for each of the exponential random variables to be included in the count. So, what distribution does that mean [itex]N_n[/itex] has in terms of n and that probability?

The limit of that distribution as n goes to infinity should then be Poisson.
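For the Exp(1) tail that inclusion probability works out explicitly; here is a quick numeric sketch (standard library only, not from the thread), using the standard survival function P(X ≥ t) = e^{-t}:

```python
import math

# For X ~ Exponential(rate 1), P(X >= t) = exp(-t).
# With threshold t = log(n), the tail probability is exp(-log(n)) = 1/n.
for n in [10, 100, 1000]:
    p = math.exp(-math.log(n))
    print(n, p)  # p equals 1/n up to floating-point error
```

Everything beyond the standard Exp(1) tail formula is arithmetic.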
 
  • #3
Hi rukawakaede! :smile:

Like sfs01 suggested, you first need to calculate the distribution of Nn. That is, you'll need to know probabilities like [itex]P\{N_n=k\}[/itex]. Let's get you started on that calculation: (assume that [itex]0\leq k\leq n[/itex])

[tex]P\{N_n=k\}=P\left(\bigcup_{A\subseteq \{1,...,n\},~|A|=k}\{X_i\geq \log(n),i\in A; X_i<\log(n), i\notin A\}\right)[/tex]

Now try to calculate that using the fact that the [itex]X_i[/itex] are iid.
 
  • #4
rukawakaede said:
Hi,

Let [itex]X_1, X_2,\cdots[/itex] be an independent and identically distributed sequence of exponentially distributed random variables with parameter 1.

Now let [itex]N_n:=\#\{1\leq k\leq n:X_k\geq \log(n)\}[/itex].

I was told that [itex]N_n\xrightarrow{\mathcal{D}}Y[/itex] where [itex]Y\sim [/itex]Poisson(1).

Could anyone give me some directions for this problem? I am totally at a loss.

Thanks.

If you write it as [itex]N_n=\sum_{k=1}^nI(X_k \geq \log(n))[/itex] it should be easy to show that [itex]N_n[/itex] is binomial with parameters n and 1/n. To show convergence I'd use characteristic functions.
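If N_n is indeed Bin(n, 1/n), the Poisson(1) limit can already be seen numerically. A sketch comparing the exact pmfs with the standard library (no simulation; the helper names are my own):

```python
import math

def binom_pmf(n, p, k):
    """P(Bin(n, p) = k), computed exactly via the binomial coefficient."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson1_pmf(k):
    """P(Poisson(1) = k) = e^{-1} / k!."""
    return math.exp(-1) / math.factorial(k)

n = 10_000
for k in range(5):
    b = binom_pmf(n, 1.0 / n, k)
    q = poisson1_pmf(k)
    print(k, round(b, 6), round(q, 6))  # the two columns agree to ~4 decimals
```

The agreement improves as n grows, which is exactly the classical binomial-to-Poisson limit.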
 
  • #5
bpet said:
If you write it as [itex]N_n=\sum_{k=1}^nI(X_k \geq \log(n))[/itex] it should be easy to show that [itex]N_n[/itex] is binomial with parameters n and 1/n. To show convergence I'd use characteristic functions.

May I know what your [itex]I[/itex] means?
 
  • #6
micromass said:
Hi rukawakaede! :smile:

Like sfs01 suggested, you first need to calculate the distribution of Nn. That is, you'll need to know probabilities like [itex]P\{N_n=k\}[/itex]. Let's get you started on that calculation: (assume that [itex]0\leq k\leq n[/itex])

[tex]P\{N_n=k\}=P\left(\bigcup_{A\subseteq \{1,...,n\},~|A|=k}\{X_i\geq \log(n),i\in A; X_i<\log(n), i\notin A\}\right)[/tex]

Now try to calculate that using the fact that the [itex]X_i[/itex] are iid.

Could you please tell me more about the [itex]N_n[/itex]?
 
  • #7
Uh, well, I think bpet's method is a bit easier than what I had in mind. So I suggest following him.

Anyway, the I there means a characteristic function, thus

[tex]N_n=\sum_{k=1}^n{I_{\{X_k\geq \log(n)\}}}[/tex]

where

[tex]I_A(\omega)=\left\{\begin{array}{c}0~\text{if}~\omega\notin A\\ 1~\text{if}~\omega\in A\\ \end{array}\right.[/tex]
 
  • #8
micromass said:
Uh, well, I think bpet's method is a bit easier than what I had in mind. So I suggest following him.

Anyway, the I there means a characteristic function, thus

[tex]N_n=\sum_{k=1}^n{I_{\{X_k\geq \log(n)\}}}[/tex]

where

[tex]I_A(\omega)=\left\{\begin{array}{c}0~\text{if}~\omega\notin A\\ 1~\text{if}~\omega\in A\\ \end{array}\right.[/tex]

In that case, isn't the [itex]I[/itex] an indicator function?

I still can't see why [itex]N_n[/itex] is a binomial distribution.
 
  • #9
rukawakaede said:
In that case, isn't the [itex]I[/itex] an indicator function?

Yes.

rukawakaede said:
I still can't see why [itex]N_n[/itex] is a binomial distribution.

Because every indicator function [itex]I_A[/itex] has a Bernoulli distribution. Indeed, it equals 0 with probability [itex]P(A^c)[/itex] and 1 with probability [itex]P(A)[/itex]. So it is Bernoulli.

Now, the sum of n independent Bernoulli(p) random variables is binomial, i.e. Bern(p)+...+Bern(p)=Bin(n,p).
Also note that Bern(p)=Bin(1,p).
All you have to do now is figure out what p is here...
 
  • #10
micromass said:
Yes. Because every indicator function [itex]I_A[/itex] has a Bernoulli distribution. Indeed, it equals 0 with probability [itex]P(A^c)[/itex] and 1 with probability [itex]P(A)[/itex]. So it is Bernoulli.

Now, the sum of n independent Bernoulli(p) random variables is binomial, i.e. Bern(p)+...+Bern(p)=Bin(n,p).
Also note that Bern(p)=Bin(1,p).
All you have to do now is figure out what p is here...

Is [itex]p=1/2[/itex] since each indicator function can be 1 or 0?
 
  • #11
No, the p of [itex]I_A[/itex] is P(A), since [itex]I_A[/itex] has probability P(A) of equaling 1.
Take, for example, a weighted coin for which the chance of heads is 1/3. The indicator of heads is then Bernoulli(1/3) distributed. But by your reasoning it would be Bernoulli(1/2), since the coin has two sides. The probability is not always simply 1/(number of outcomes).
 
  • #12
micromass said:
No, the p of [itex]I_A[/itex] is P(A), since [itex]I_A[/itex] has probability P(A) of equaling 1.
Take, for example, a weighted coin for which the chance of heads is 1/3. The indicator of heads is then Bernoulli(1/3) distributed. But by your reasoning it would be Bernoulli(1/2), since the coin has two sides. The probability is not always simply 1/(number of outcomes).

Hi, I guess I misunderstood the meaning of p. Could you please explain again what p actually is?
 
Last edited:
  • #13
Well, a Bernoulli(p) variable is a random variable X that takes only the values 0 and 1, with probability p that X equals 1 and probability 1-p that X equals 0.
 

1. What is the relationship between exponentially distributed random variables and Poisson(1)?

The two distributions are connected through the Poisson process: when events occur with independent Exp(1) inter-arrival times, the number of events in a unit time interval has a Poisson(1) distribution. The Poisson(1) distribution is not a special case of the exponential distribution; the exponential is continuous and models waiting times, while the Poisson is discrete and models event counts. In the setting of this thread, the number of the first n iid Exp(1) variables exceeding log(n) converges in distribution to Poisson(1).

2. How are the parameters of an exponential distribution related to those of a Poisson(1) distribution?

The parameter of an exponential distribution, often denoted λ (lambda), is its rate. The connection runs through the Poisson process: if events arrive with independent Exp(λ) inter-arrival times, then the number of events in a unit time interval is Poisson(λ) distributed. In particular, Exp(1) inter-arrival times correspond to Poisson(1) counts.

3. Can you use an exponentially distributed random variable to approximate a Poisson(1) distribution?

Not directly, since the exponential distribution is continuous and the Poisson(1) distribution is discrete. However, counts built from exponential random variables can be approximately Poisson(1): as in this thread, N_n = #{1 ≤ k ≤ n : X_k ≥ log(n)} is binomial with parameters n and 1/n, and Bin(n, 1/n) converges in distribution to Poisson(1) as n goes to infinity.

4. What is the difference between an exponential distribution and a Poisson(1) distribution?

While the two distributions are related, there are some key differences between them. An exponential distribution is continuous, meaning that it can take on any value within a range. On the other hand, a Poisson(1) distribution is discrete, meaning that it can only take on integer values. Additionally, the exponential distribution describes the time between events occurring, while the Poisson(1) distribution describes the number of events occurring within a fixed interval of time.
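The complementary direction of that relationship can also be illustrated by simulation: accumulating Exp(1) inter-arrival times and counting how many arrivals land in one unit of time yields a Poisson(1) draw (a sketch; the parameter choices are illustrative):

```python
import random

random.seed(1)

def poisson1_draw():
    """Count Exp(1) inter-arrival times until the running total exceeds 1."""
    total, count = 0.0, 0
    while True:
        total += random.expovariate(1.0)
        if total > 1.0:
            return count
        count += 1

reps = 20_000
draws = [poisson1_draw() for _ in range(reps)]
print("mean:", sum(draws) / reps)       # Poisson(1) mean is 1
print("P(0):", draws.count(0) / reps)   # e^{-1} is about 0.368
```

This is exactly the waiting-time/count duality described above: continuous exponential gaps generate a discrete Poisson count.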

5. How are exponentially distributed random variables and Poisson(1) distributions used in real-world applications?

Exponentially distributed random variables and Poisson(1) distributions are commonly used in various fields, such as engineering, finance, and healthcare, to model and analyze the occurrence of events over time. For example, they can be used to model the arrival times of customers at a store, the time between machine failures in a factory, or the number of patients arriving at a hospital in a given hour. By understanding the relationship between these two distributions, scientists and researchers can make more accurate predictions and decisions based on their data.
