JohanL said:
Homework Statement
Insects land in the soup in the manner of a Poisson process with intensity lambda. Insects are green with probability p, independent of the color of the other insects. Show that the arrival of green insects is a Poisson process with intensity p*lambda.
Homework Equations
The Attempt at a Solution
I'm only interested in one step of the solution. Our professor started by calculating the characteristic function of an exponential random variable:
"We solve this by checking that the times, call one such typical time T, between arrivals of new green insects are exp(p*lambda) - distributed"
$$E[e^{j\omega X}] = \dots = \frac{\lambda}{\lambda - j\omega}, \qquad X \sim \operatorname{Exp}(\lambda)$$
It's the next step that confuses me:
$$E[e^{j\omega T}] = \sum_{n=1}^\infty \left(\frac{\lambda}{\lambda - j\omega}\right)^n p(1-p)^n$$
What is he doing here?
I understand that he is dividing the event into a disjoint union of events, and that the characteristic function of a sum of independent variables is the product of their characteristic functions, but I still don't understand what the nth power of a characteristic function is doing there. Is the characteristic function of an exponential a waiting time?
If ##X_1, X_2, \ldots## are independent, exponentially-distributed random variables with rate parameter ##\lambda##, the time (##T##) to the first green arrival is
[tex]T = \sum_{i=1}^N X_i,[/tex]
where ##N## is a geometrically distributed random variable with parameter ##p##. Here, ##N## and all the ##X_i## are independent. Note that ##P(N = n) = p (1-p)^{n-1}, \; n = 1,2,3, \ldots##. In more detail:
[tex]\begin{array}{lcl}T = X_1& \text{if}& N = 1\\
T = X_1 + X_2 &\text{if}& N = 2 \\
T = X_1 + X_2 + X_3 &\text{if}& N = 3\\
\hspace{3em}\vdots & \vdots &\hspace{1em} \vdots
\end{array}[/tex]
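As a quick sanity check (not part of the solution itself), this decomposition is easy to simulate; the values of ##\lambda## and ##p## below are arbitrary test choices, and the sample mean of ##T## should land near ##1/(p\lambda)##, the mean of an ##\operatorname{Exp}(p\lambda)## variable:

```python
import random

# Monte Carlo sketch of the decomposition above (lam and p are
# arbitrary test values, not from the problem): draw exponential
# inter-arrival times until the first "green" arrival and check
# that the sample mean of T is close to 1/(p*lam), the mean of
# an Exp(p*lam) random variable.
random.seed(0)
lam, p = 2.0, 0.3
trials = 200_000

total = 0.0
for _ in range(trials):
    t = random.expovariate(lam)       # X_1: time to the first arrival
    while random.random() > p:        # arrival was non-green (prob 1 - p)
        t += random.expovariate(lam)  # wait X_i for the next arrival
    total += t

print(total / trials)  # should be close to 1/(p*lam) ≈ 1.667
```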
We have that ##N = 1## if the first arrival is green, ##N = 2## if the first arrival is non-green and the second is green, ##N = 3## if the first two arrivals are non-green and the third is green, etc. The characteristic function of ##T## is ##C_T(\omega) = E e^{j \omega T}##. We have
[tex]\begin{array}{rcl}E e^{j \omega T}& =& \sum_{n=1}^{\infty} P(N = n) E \left[ e^{j \omega T} | N = n \right] \\
&=& \sum_{n=1}^{\infty} P(N = n) E \left[ e^{j \omega (X_1 + X_2 + \cdots + X_n)} | N = n\right] \\
&= & \sum_{n=1}^{\infty} P(N = n) E \left[ e^{j \omega (X_1 + X_2 + \cdots + X_n)} \right]
\end{array}[/tex]
This last step holds because, for each given ##n##, the random variables ##X_1, X_2, \ldots ## are independent of the event ##\{N = n \}##.
Since you say you know that the ch. fcn. of a sum of independent r.v.s is the product of their ch. fcns., you should know that
[tex]E \left[ e^{j \omega (X_1 + X_2 + \cdots + X_n)} \right] = \left( E \,e^{ j \omega X} \right)^n,[/tex]
where ##X## is any one of the identically distributed ##X_i##.
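If you want to see this identity numerically, here is a small check (again just a sketch; ##\lambda = 2##, ##\omega = 1.5##, and ##n = 3## are arbitrary test values) comparing the empirical characteristic function of ##X_1 + X_2 + X_3## with ##\left(\lambda/(\lambda - j\omega)\right)^3##:

```python
import cmath
import random

# Monte Carlo check of E[exp(jw(X1+...+Xn))] = (lam/(lam - jw))**n
# for i.i.d. X_i ~ Exp(lam); lam, w, n are arbitrary test values.
random.seed(1)
lam, w, n = 2.0, 1.5, 3
trials = 100_000

acc = 0j
for _ in range(trials):
    s = sum(random.expovariate(lam) for _ in range(n))
    acc += cmath.exp(1j * w * s)  # one sample of e^{jw(X1+...+Xn)}
emp = acc / trials                # empirical characteristic function at w

theory = (lam / (lam - 1j * w)) ** n
print(abs(emp - theory))  # small Monte Carlo error
```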
You end up with
[tex]E\,e^{j \omega T} = \sum_{n=1}^{\infty} p (1-p)^{n-1} \left( \frac{\lambda}{\lambda - j \omega} \right)^n[/tex]
This is the formula you want, except that it has the correct ##(1-p)^{n-1}## in place of the incorrect ##(1-p)^n## in your professor's sum.
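To finish the argument, sum the geometric series (it converges because ##|(1-p)\lambda/(\lambda - j\omega)| < 1## for ##0 < p \le 1##):
[tex]E\,e^{j \omega T} = \sum_{n=1}^{\infty} p (1-p)^{n-1} \left( \frac{\lambda}{\lambda - j \omega} \right)^n = \frac{p\,\dfrac{\lambda}{\lambda - j\omega}}{1 - (1-p)\dfrac{\lambda}{\lambda - j\omega}} = \frac{p\lambda}{p\lambda - j\omega},[/tex]
which is exactly the characteristic function of an ##\operatorname{Exp}(p\lambda)## random variable. Hence ##T \sim \operatorname{Exp}(p\lambda)##, as claimed.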